
Standardization Woes

For many people, Markdown is front and center in their textual workflows, and rightfully so.

Along comes Jeff Atwood and says that Markdown needs standardization to become even more useful because, come standardization, 15 Markdown processors wouldn’t create 22 different results from the same source (or maybe it was the other way round).
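To make the divergence concrete, here is a minimal sketch of my own (not something from the standardization discussion) that feeds the same snippet to two independent Python implementations and prints both HTML results. It assumes the third-party packages markdown and mistune are installed; note that mistune’s entry point has changed across major versions (older releases expose mistune.markdown rather than mistune.html).

    # Hedged illustration of the "same source, different output" problem.
    # Assumes `pip install markdown mistune`; exact APIs may differ by version.
    import markdown  # Python-Markdown, one processor
    import mistune   # mistune, an independent processor

    source = "Intra_word_underscores and emphasis*next to*punctuation are classic edge cases."

    print(markdown.markdown(source))  # Python-Markdown's HTML
    print(mistune.html(source))       # mistune's HTML -- often not byte-identical

Whether and how the two outputs differ depends on the versions installed, which is exactly the point.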

This has caused a lot of discussion, mostly around the naming of the standardization project, and less so around the question of whether it really makes sense to standardize in the first place.

Personally, I’m not so much interested in trademark discussions. My primary interest is the standardization goal as such. In other words, is there really a point to a standardization effort, given that the landscape of Markdown-related tools is arguably thriving?

So, what is the use case for the standardization? What comes out at the other end that substantially benefits the users of Markdown? Or could it be that the standardization effort is not so much aimed at the benefit of the users but at the convenience of the creators of Markdown tools?

Dunno, the participation of John MacFarlane (the original author and maintainer of pandoc) in the standardization initiative may be an indication that the latter question is key to understanding the motivations behind the project.

In this article, I’m trying to identify the potential use cases that might motivate a standardization effort.

So, here is a list of use cases that may benefit from the existence of a standardized Markdown along with some personal comments.

Markdown files are exchanged between several users

This case describes a scenario where Markdown sources are shared or exchanged among a group of authors. Even in the absence of a standard, it seems entirely feasible for such a group to agree on a common toolchain for processing the sources.

To some extent, this is similar to running a software project in C. Especially when it comes to deployment to an embedded platform, it is necessary to come to agreements about the applicable toolchain, whether the underlying language is standardized or not.

For practical purposes, a standard would be nice but far from indispensable.

Same source is processed in different toolchains for different purposes

Again, I’m not sure.

Granted, I (for example) use different tools that may or may not produce the same output for the same piece of Markdown source. But these tools cover different areas, so the situation in which they would all be used to process one specific piece of Markdown is not very realistic.

One example is the use of tables. These may be relevant for creating a blog post or a scientific paper. But for keeping a daily journal of your activities, a table is probably of secondary interest.

Don’t want to remember which markup flavor to use in which toolchain

This is basically the opposite of the first use case: there, many authors work in one toolchain; here, one author works with multiple toolchains.

In this situation, people may not like having to consciously consider which specific style of markup to use across the different authoring tools and post-processing toolchains they run for the different purposes of their writing.

Admittedly, this concern may hit home, or some kind of home. On the other hand, the same situation comes up if you (have to) work in two different flavors of the same programming language, like Python 2.x vs Python 3.x.

And whether you like it or not, the existing chasm between different flavors of Python (or Ruby, for that matter) is the result of moving a living standard forward. This may come with breaking changes.

Let’s assume that a standard definition of Markdown exists and that, after some time, this definition needs to go forward to meet future requirements. Then, the probability of Markdown creators ending up in pretty much the same situation as today is certainly non-negligible.

Summary

To summarize, after considering these cases I’m personally still fine with the current situation and the tools available for various purposes. I guess we’ll see what comes out at the other end of this “standardization” effort and whether there’s any benefit for the rest of us.

On a Totally Unrelated Note

Should you find your kid crying because the PS3 suddenly stopped outputting an HD signal to the TV hooked up to it, and Minecraft won’t support multiplayer games any longer, here’s what you do¹:

While the PS3 is switched off, press and hold the power button. As you start pressing, it will beep briefly (as usual). It will beep again after a couple of seconds and then once more after another couple of seconds. Release the power button and the PS3 will start rebooting.

Have your controller ready and confirm the reboot. It will ask whether to use the friggin’ HDMI connection to drive the TV. Confirm.

After it’s up again, chances are that Minecraft will support multiplayer mode again.

  1. At your own risk …

My First Weeks With Overcast

I’ve been using Downcast for listening to podcast episodes for such a long time that I hardly remember when I started. The funny thing is that over time I have taken a lot of similar apps for a spin and none of them stuck.

Now Overcast is out, and I gave it a try.

First things first: It’s a relief that the new podcast client Overcast supports manual reordering of playlists. This is a feature I just can’t live without.

Some of the more prominent podcast clients (Instacast, Pocket Casts) don’t support this, which is precisely the reason why I wouldn’t consider naming any of them my preferred podcast client.

Sure, these apps provide other mechanisms to create ordered playlists, but it turns out that these mechanisms just don’t work for me the way the ability to manually reorder does.

Overcast comes with two distinct audio effects: the ability to simply remove speaking pauses from the episode stream (dubbed Smart Speed) and Voice Boost (which is, as far as I understand, basically a dedicated equalizer preset that amplifies the typical voice frequencies).

Smart Speed is nice. There is no audible decrease in voice quality, and I really don’t mind being able to listen to more content in less time. That said, in my experience there are some shows (for me, the most prominent example is This American Life) where Smart Speed tends to have a negative effect on the experience.

But overall, I can’t help giving Smart Speed a thumbs up.

I’ve had mixed experiences with Voice Boost. There are shows where it just sounds terrible. Only in a few cases did I have the impression that the feature actually improves the sound quality. Pro tip: turn the volume down significantly before activating Voice Boost.

The play screen differentiates itself from the competition by featuring a real-time “spectrum analyzer”. Frankly, I’d call this a nice idea, were it not for the nagging feeling that my battery probably isn’t very excited about that gimmick.

Marco says it’s only active while the play screen is showing. He should know.

I like the way progress is displayed on a bar rather than just a thin line (where it is much harder to make out). The same bar doubles as a scrubber.

Personally, I don’t really care for Overcast’s discovery features. I’m already subscribed to way more podcasts than I will ever have the chance to actually listen to.

Let’s get back to Downcast for a moment: the only issue I have with the app is that it eats some portion of the battery even when it is not playing. Therefore, I usually kill Downcast when I stop listening and launch it again when I want to continue.

The downside is that the app is not aware of its state from before I killed it, so I have to select the episode I want to listen to and start playback again.

With Overcast, I have the impression that it behaves differently in this regard. As far as I can see, there is next to no sizable effect on battery life, and that indeed makes a difference for me.

The summary of my first weeks with Overcast is that, despite the occasional bug here and there, I don’t miss Downcast nearly enough to feel compelled to switch back to it. Something has happened that I would not have thought possible before I had the chance to get hands-on with Overcast.

Ugh, Kafasis

Admittedly, I have never been that much into football. Never played in a team, never even bothered to watch a match live in a stadium (although the nearest Bundesliga pitch is more or less around the corner).

But, frankly, I also don’t publish less … informed … opinions or repeat them on a podcast. No offense, but anybody working in the software business should really know better than to try to fix something they do not even remotely understand.

Even if it is not about software.

Timing

They say timing is everything, and mine isn’t particularly good. Or maybe it’s Silvio Rizzi’s timing, dunno.

Whatever, just as I sang the praises of using a regular web browser for reading my Feedbin subscriptions, Reeder for Mac comes along as a series of betas.

Admittedly, beta 1 was pretty rough and, at least in my impression, closer to a proof of concept than something I would like to use on a daily basis.

Since the release of beta 3, I’ve been using Reeder as my regular feed reader and I haven’t looked back to the browser since. Not to mention Readkit, which Reeder simply blows out of the water.

Betas 4 and 5 added further improvements and polish; for example, sharing was added and the preferences dialog has again been massively improved.

One thing I’m missing in the current release is the ability to hover the mouse cursor over a link in an article and have the underlying URL displayed in some form. But maybe this feature will make it into a later version. I can imagine that many others would like to see it too.

Anyway, I can’t wait to get to see the final result.