Sometimes The Best Does Win

So finally, finally, finally, HD DVD is dead. Dead dead dead. And good riddance, too, I say. It was (God, that feels good to use the past tense) an inferior format in almost every way. Even its karma was fucked: Microsoft and others created the HD DVD standard because it was easier on corporations, not better for consumers.

Some info, just in case you didn’t know, or were misinformed by HD DVD desperados:

  • HD DVD stores only 15GB per layer, which means that right out of the gate the format lacked the capacity for HD movies, resulting in almost all titles being released as double-layer disks, clocking in at 30GB.
  • Blu-ray, on the other hand, has 25GB per layer, meaning there is enough space for full titles plus superior audio, both in quality (uncompressed audio is amazing) and quantity (more languages).
  • Blu-ray has moved up to double-layer disks (BD-50), meaning bit-rates for video can be much higher (i.e., better picture), with room for even better audio and more language choices (see the quick bit-rate math just after this list).
  • HD DVD folks used to claim that HD DVD capacity was 51GB in its triple-layer disks, but those disks were prototypes and no existing HD DVD players would be able to play them anyway, so that’s just damned misleading. Or a damned lie.
  • As for future-proofing, there exist 300GB Blu-ray disks in the prototyping phase, but I’m not sure whether any existing players will be able to play them. But then again, you don’t see any Blu-ray advocates pimping a 300GB capacity, either.
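To put those capacities in perspective, here’s the back-of-the-envelope math on what average video bit-rate each disk leaves for a feature film. The 2-hour runtime and the ~6 Mbit/s set aside for audio and overhead are my assumptions, not numbers from either spec, and disk sizes use the decimal gigabytes the formats are marketed in:

```python
# Rough ceiling on average video bit-rate for a feature film.
# Assumptions (mine, not from either spec): 2-hour runtime, ~6 Mbit/s
# reserved for audio/overhead, decimal gigabytes (1 GB = 10^9 bytes).
def max_video_mbps(disc_gb, runtime_hours=2.0, audio_mbps=6.0):
    total_mbit = disc_gb * 8_000          # 1 GB = 8,000 megabits
    seconds = runtime_hours * 3600
    return total_mbit / seconds - audio_mbps

for name, gb in [("HD DVD dual-layer (30GB)", 30),
                 ("Blu-ray single-layer (25GB)", 25),
                 ("Blu-ray dual-layer BD-50 (50GB)", 50)]:
    print(f"{name}: ~{max_video_mbps(gb):.0f} Mbit/s left for video")
```

Under those assumptions that works out to roughly 27, 22, and 50 Mbit/s respectively, which is exactly the headroom the BD-50 bullet above is talking about.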

The media has been the same old selfish culprit here, not bothering to understand the stakes or the situation. Just as any computer article that includes Rob Enderle as a source isn’t worth the pixels it’s printed on, any article that trotted out the old “Betamax vs. VHS war” should simply have been ignored. This war was nothing like Beta vs. VHS, except that Sony was involved.

But Sony was never the sole developer of Blu-ray: the credit there belongs to the Blu-ray Disc Association (BDA), a consortium of the following companies:

[Image: member companies of the Blu-ray Disc Association (2008.02.20.bda.gif)]

Strange, isn’t it, that Universal is a member of the BDA and yet decided to go HD DVD-only when $150 million suddenly appeared in a plain brown paper bag at their doorstep with a free copy of Windows Vista inside? (Apocryphal, but Microsoft did kick in the money, supposedly.) Universal is Universally Evil. Of course all of these corporations are in it for themselves and couldn’t give a shit about people (yes, even Apple is included in that list), but Universal took the hypocrisy to new heights (and by heights, I mean depths) with statements like “we’re putting the viewing of content in control of the user”, when really they meant they were cutting content off from iPods and iPhones, making you watch it on a computer instead of a TV, and preventing you from fast-forwarding through commercials. With spin like that, they should be gunning for the Republican primaries.

So yes, Apple is part of the Blu-ray Disc Association, but no Apple hardware includes Blu-ray drives (even though you can buy them from third parties), and yet Final Cut Studio, an Apple pro app for non-linear video editing, includes the ability to generate Blu-ray and HD DVD masters. Maybe they were waiting for the dust to settle, or maybe some magic eight-ball settled on a “Not yet” message; I have no idea.

Anyway, maybe this all sounds like more schadenfreude against those who went Red (aka the HD DVD camp), but there are several reasons to celebrate the demise, because it rings in some potential improvements for everyone interested in high-definition titles:

  • All those consumers who’ve been waiting on the sidelines may now commit to high-definition, leading to sales of more players and more titles, bringing prices down for everyone.
  • No more “least common denominator” disks from Warner Brothers, who supported both HD DVD and Blu-ray: to save money, they would put the same main-title data on both, meaning they had to compress the video to fit on a 30GB disk to accommodate HD DVD; now that they’re targeting Blu-ray only, they can use higher bit-rates.
  • AVC (i.e., H.264) seems to be emerging as the standard compression for Blu-ray, and I’ve found it significantly better than VC-1 (i.e., Windows Media), which seemed to be the de facto standard compressor on HD DVD. Not surprising, given that Microsoft was in the HD DVD camp and has deeper pockets than God.

For our family’s Secret Santa at Christmas this year, I got the Harry Potter 5-movie special edition on Blu-ray. It looks pretty damned good on my TV; however, the first movie I watched was the fifth one (Order of the Phoenix), and in the first scene there’s plenty of sky, first blue and then gray, and I could spot in seconds that it was the VC-1 compressor. Light blues, light-to-medium grays, and all beiges are difficult for any compressor, but VC-1 is terribly noisy: in a light blue sky you’ll see MUCH darker and MUCH lighter pixels (sometimes groups of pixels) that change with every frame, making that area look “grainy”. Nerdy, yes, but it can be distracting in scenes where you’re supposed to be looking at, say, a serene landscape.

AVC also has some trouble with those same skies, but the noise is far, far less: the darker pixels and lighter pixels are much closer to the true color, and the areas which vary are smaller clusters of pixels and usually square-shaped regions.
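If you wanted to put a rough number on that “grainy sky” effect instead of just eyeballing it, one crude approach is to measure how much a supposedly static patch flickers from frame to frame. A minimal sketch, assuming you’ve already decoded a short clip of the same shot from each encode into a NumPy array of luma frames (the clips and the load_clip helper are hypothetical):

```python
import numpy as np

def sky_noise(frames, y0, y1, x0, x1):
    """Crude proxy for temporal compression noise: per-pixel standard
    deviation over time in a patch that *should* be nearly static,
    averaged over the patch. Higher = more frame-to-frame 'grain'.

    frames: array of shape (num_frames, height, width), luma only.
    """
    patch = frames[:, y0:y1, x0:x1].astype(np.float64)
    return patch.std(axis=0).mean()

# Hypothetical usage, comparing the same sky region from two encodes:
# vc1 = load_clip("phoenix_sky_vc1.yuv")   # not a real file or helper
# avc = load_clip("phoenix_sky_avc.yuv")
# print(sky_noise(vc1, 0, 200, 0, 1920), sky_noise(avc, 0, 200, 0, 1920))
```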

Good old MPEG-2 is also one of the compressors used for HD DVD and Blu-ray, just as it is for regular old DVDs, but the bit-rate is much higher (on Blu-ray, it’s usually between 18 and 35 megabits per second and on DVD it’s about 3 to 8 megabits per second).
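One way to see why the HD rates have to be so much higher is to normalize by pixel count. A quick sketch, assuming typical resolutions and frame rates (720×480 at 29.97 fps for DVD, 1920×1080 at 23.976 fps film content for Blu-ray; both are my assumed values):

```python
# Bits available per pixel per frame at typical MPEG-2 rates.
# Resolutions and frame rates below are assumed typical values.
def bits_per_pixel(mbps, width, height, fps):
    return (mbps * 1_000_000) / (width * height * fps)

print("DVD MPEG-2 @ 6 Mbit/s, 720x480 @ 29.97 fps:",
      round(bits_per_pixel(6, 720, 480, 29.97), 2))
print("Blu-ray MPEG-2 @ 25 Mbit/s, 1920x1080 @ 23.976 fps:",
      round(bits_per_pixel(25, 1920, 1080, 23.976), 2))
```

That comes out to roughly 0.5 to 0.6 bits per pixel in both cases: a Blu-ray title at 25 Mbit/s is spending about the same bits per pixel as a decent DVD encode, so it takes five or six times the bit-rate just to keep pace with six times the pixels.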

All of this may seem like minutiae only a geek could love, but it makes a huge difference in the experience. And you’d be surprised at how much audio matters: most people treat audio as an afterthought, but when I switch from standard (compressed) Dolby Digital 5.1 to uncompressed PCM, the entire room opens up, and the movie goes from something you watch to something you experience.
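The arithmetic backs that up: uncompressed multichannel PCM simply wouldn’t fit in a DVD-era bit budget. A quick sketch, assuming a common Blu-ray PCM configuration of six channels at 48 kHz / 24-bit (discs vary; this is my example configuration):

```python
# Raw bit-rate of uncompressed 5.1 PCM vs. typical lossy Dolby Digital.
# 6 channels x 48 kHz x 24-bit is an assumed, common Blu-ray PCM config.
def pcm_mbps(channels=6, sample_rate_hz=48_000, bit_depth=24):
    return channels * sample_rate_hz * bit_depth / 1_000_000

print(f"Uncompressed 5.1 PCM: ~{pcm_mbps():.1f} Mbit/s")
print("Dolby Digital 5.1 on DVD: typically 0.384-0.448 Mbit/s")
```

That’s roughly 6.9 Mbit/s for the PCM track, more than ten times the bandwidth of the compressed one: room a DVD just doesn’t have, but a Blu-ray disk does.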

Geeks & Designers, Getting What They Want

On the list of things generalists consider important, this doesn’t even come close to appearing. A shampoo discussion among bald men ranks higher.

But Apple released Leopard 10.5.2 today and addressed something that many people bitched about when Leopard came out: the alpha (transparency) component of the Mac OS X menu bar. (Remember, Winders folks: the Mac menu bar sits at the top of the screen, and is, I believe, the single biggest contributor to users’ increased productivity versus Windows.)

And you know, I was one of the bitchers, too. The translucency flew in the face of that very productivity argument, making the menu bar less prominent by blending it with the Desktop Picture (that’d be Wallpaper to you Windows folks). I didn’t like that the Desktop Picture created visual noise. Gruber didn’t like that it bollocksed up proper anti-aliasing of the menu titles’ text. Many others just didn’t like the change because it was, well, change. You know, the ironic ones.

Before 10.5.2, and in the absence of third-party hacks to return the menu bar to its “beloved” 100% opacity, the menu bar looked like this:

[Image: the translucent Leopard menu bar (AlphaMenuBar.png)]

And now that you have the option to turn translucency on or off, setting it back to pre-Leopard looks like this:

[Image: the menu bar with translucency turned off (OpaqueMenuBar.png)]

Now, I never went looking for hacks, nor did I modify my Desktop Picture to have a 20px white band across the top of it: when the background in the area behind the menu bar is white, the bar “reverts” to appearing solid white, but that still doesn’t address Gruber’s anti-aliasing complaint. Apple restoring proper opacity does.

So one of the first things I did was go to the relevant System Preferences pane and turn off translucency:

[Image: the System Preferences setting for menu bar translucency (SysPrefMenuBar.png)]

I thought I’d feel that little rush of proper design mixed with bittersweet nostalgia. But I didn’t. I went back and thought about it, and I think the reason is that, despite the apparent graphical insult and user-experience injury, the menu bar has been diminishing in importance as the Mac slowly moves towards a region-of-interest style of interface: nearby palettes, contextual menus (which I hate, but they’re there), larger displays, and so on. This is completely a personal choice, and it seems like everyone would disagree with me, but I went ahead and set the menu bar back to translucent.

If all this sounds anal-retentive, well, it’s this kind of attention to detail that helps make a Mac a Mac: sustain the illusion of context and activity, and try your best to keep the UI out of the way of the user’s goal. The best UI is the one that never enters the user’s conscious thought; the one that does shatters the illusion.

So after wondering what the hell they were thinking, I find myself wondering why I hadn’t thought of that.