In 1994, the "Digital HDTV Grand Alliance" decided upon a new television standard for the whole world. They specified a list of resolutions, most of which have an aspect ratio of 16x9. They chose to allow various aspect ratios for individual pixels, and they did not specify any resolutions beyond 1920x1080. These decisions were hugely counterproductive, not just for the TV industry but also for the then rapidly growing personal computer industry; they were confusing for most users; and they were completely and utterly unnecessary.
Between the invention of moving pictures and about 1960, nearly all video had an aspect ratio of 4x3. There was good reason for this: many things in our environment, from buildings to sporting events to conversation groups, fit well in the 4x3 frame. When television came along, the movie business felt threatened, and in their effort to come up with a more immersive experience they invented wider aspect ratios. Because of the Cathode Ray Tube display technology in use at the time, TV could not duplicate this. Many movies, but by no means all, embraced the new mode and placed subjects wide apart in the frame. When those movies came to be displayed on 4x3 screens, the incompatibility was addressed with two bad options: "Pan and Scan", which throws away the width of the widescreen cinematography, and "Letterbox", which shows the whole picture but only in a much reduced window on part of the screen. The movie people were understandably upset by the damage done to their cinematography... a problem which was entirely of their own creation.
When digital TV became possible, the same people saw a way out of the box into which they'd painted themselves: they'd make the new TVs support widescreen. These same people managed to dominate the conversation. Never mind that movies are only a small part of TV content and that 4x3 worked very well for almost everything else. Never mind that the new digital technology made it trivial and transparent to support any resolution you chose to transmit in. They knew analog TV hardware and movies, and they made a standard with that mindset and only the next 5 years or so in mind. They didn't understand Moore's law, or really anything about computers at all, even though computers were central to what they were doing.
What they should have done was standardize two things: the digital data format and square pixels. Period. They specifically should NOT have specified resolution. If I want to watch a lot of movies, I'll buy a display with a wide aspect ratio. If I'm a sports fan or TV news addict, I'll buy a display with a low aspect ratio. If I occasionally want to watch the other kind of content, I'll suffer in just exactly the way I have been suffering for movies. But it'll be MY choice, and I'll choose the style that serves me best. The technology to adapt easily is a necessary part of every digital TV display (and personal computer). There's no need to standardize.
The free market is a powerful thing. When higher-resolution displays become available, the market will decide which resolutions to sell, not some committee stuck in the mistakes it made in the 1950s. The software and hardware to adapt content to match the display (it's called stretching and cropping) are a necessary part of every digital TV and computer display controller. Plugging in a different set of numbers is simple. Adapting content resolution to style, cost, and availability of the necessary bandwidth is equally simple. If half the country is watching the Super Bowl or some major movie, it makes sense to use a lot of bandwidth. If I'm watching a talking head discuss current events, I don't mind if my picture is being transmitted at 320x240. Save the money the bandwidth costs.
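To show just how simple "plugging in a different set of numbers" is, here's a minimal sketch of the fitting arithmetic every digital display controller already performs. It assumes square pixels on both the source and the display; the function and variable names are mine, purely for illustration.

```python
# Rough sketch of the "stretching and cropping" math a display controller does.
# Assumes square (1:1) pixels on both ends; names are illustrative only.

def fit(src_w, src_h, disp_w, disp_h, crop=False):
    """Return (scaled_w, scaled_h, x_offset, y_offset) mapping a source frame
    onto a display while preserving the source's aspect ratio.

    crop=False: letterbox/pillarbox -- show the whole picture, with bars.
    crop=True:  fill the screen and cut off whatever doesn't fit.
    """
    scale_x = disp_w / src_w
    scale_y = disp_h / src_h
    scale = max(scale_x, scale_y) if crop else min(scale_x, scale_y)
    scaled_w = round(src_w * scale)
    scaled_h = round(src_h * scale)
    # Center the scaled picture; negative offsets mean part of it is cropped.
    return scaled_w, scaled_h, (disp_w - scaled_w) // 2, (disp_h - scaled_h) // 2

# 4x3 news feed on a 16x9 panel: pillarboxed, no distortion.
print(fit(640, 480, 1920, 1080))   # (1440, 1080, 240, 0)
# Low-bandwidth 320x240 talking head: same result, just a bigger scale factor.
print(fit(320, 240, 1920, 1080))   # (1440, 1080, 240, 0)
```

Nothing in that arithmetic cares what the particular numbers are, which is exactly why a committee didn't need to pick them.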
It turns out that most people have a hard time understanding resolution and aspect ratio. I'm not sure why this simple thing should be so hard, but it is. And because the HD Alliance chose to provide for both 1:1 and 3:4 pixels, they dramatically increased the confusion. When 4x3 content with 1:1 pixels appears, a great many TVs stretch it to the wide aspect ratio so as to fill the screen--distorting the picture. Moreover, the manufacturers tend to bury the necessary mode controls--I suppose because usability testing found them confusing--and consequently an amazing number of people have gotten so used to watching distorted pictures that they've stopped noticing. Had the standard simply mandated 1:1 pixels, the TV could adapt automatically; no user controls would be necessary. Instead, we have various confusing "wide", "panorama", "4x3", "Zoom", etc., modes.
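Here's a small sketch of why non-square pixels invite exactly this distortion. With 1:1 pixels the picture's shape falls directly out of the frame size; with anything else the receiver must also know, and trust, a separately signaled pixel aspect ratio. The function name is mine, and the 704x480 pixel-aspect values are just illustrative examples of how one frame size can stand for two different picture shapes.

```python
# Illustrative only: with square pixels, display aspect ratio = width/height.
# With non-square pixels, the same frame needs an extra number to be shown
# correctly, and a TV that guesses that number wrong shows a distorted picture.

from fractions import Fraction

def display_aspect(frame_w, frame_h, pixel_aspect=Fraction(1, 1)):
    """Aspect ratio of the picture as it should appear on screen."""
    return Fraction(frame_w, frame_h) * pixel_aspect

# Square pixels: nothing to get wrong.
print(display_aspect(1920, 1080))                   # 16/9
print(display_aspect(640, 480))                     # 4/3

# Non-square pixels: the very same 704x480 frame is 4:3 or 16:9 depending on
# a pixel-aspect value the set may never receive, or may simply ignore.
print(display_aspect(704, 480, Fraction(10, 11)))   # 4/3
print(display_aspect(704, 480, Fraction(40, 33)))   # 16/9
```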
HD aspect ratio sickness has infected computers, too. In 2003 I bought a laptop with a 1400x1050 (so-called SXGA+) display surface. Today, it's fairly challenging to find a laptop with a vertical resolution higher than 768--73% of that 8-year-old computer's. This is because the market for display surfaces is completely dominated by the TV business--just as in the days of CRTs--and most small TVs are 1280x720. You can buy a monitor with much higher resolution, up to 2560x1600, but even finding out what the resolution of most display surfaces is from the marketing literature or the salescritter is surprisingly difficult. I care MUCH less how big a display is in inches than how much VERTICAL resolution it has. Nearly all applications and websites have wide menu bands across the top and bottom, dramatically reducing the space available. (The window I'm using to type this subtracts 350 pixels from the top and 160 from the bottom--leaving only 258 pixels of usable space on a 768-pixel-high monitor. Windows 7 Live Photo Gallery, the app that provoked this flame, steals 200 pixels from the top and 95 from the bottom when in editing mode--38% of the screen. What could be more important in a photo app than being able to get the picture as big as possible?)