Engineers are prone to something called NIH--Not Invented Here--because that is what engineers like to do: figure out how to do something and implement it. They'd like to "reinvent the world". That's hyperbole, of course, but it's a tendency their managers try to get them to resist: if somebody has already done it, the current team doesn't need to do any new work, and chances are good that the previous, ostensibly successful, implementers did a better job than the current team would. By and large, this is good advice. But not always.
A lot of the time, the old implementation doesn't do quite what the new one needs. To repurpose the old work, we have to invent a new interface to fit it into the new work. In software, we call that "glue code", or just "glue". It's rare that glue is simple. Even if the old work was well designed and modularized, it's at least a few lines of code. It's usually quite a lot more than that, and surprisingly often it turns out to be more than the code that's being reused. The code that's being reused doesn't have to be tested, says management. Perhaps, but the glue has to be tested just as hard, and since the fit isn't perfect, the old code has to be tested too. Suppose the old component was 2000 lines long. If the glue is anything up to, say, 300 lines or so and the function really is similar, it's probably worth trying to make the glue work. But if the glue is more than a thousand lines, you're really getting into diminishing returns. It's very likely that reinventing the thing from scratch will work better for the new application, and even if the result is 2000 lines long, you've eliminated the need for glue, saving a thousand lines and a bunch of testing.
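To make "glue" concrete, here's a minimal sketch in Python; every name in it (LegacyReport, ReportAdapter) is invented for illustration. The old component speaks one interface, the new codebase speaks another, and the adapter in between is the glue--the part that needs its own tests:

```python
# A minimal sketch of glue code. LegacyReport stands in for the old,
# reusable component; ReportAdapter is the glue that maps the new
# codebase's data onto the old interface.

class LegacyReport:
    """Pretend this is the 2000-line component we'd like to reuse."""
    def render(self, rows):          # expects a list of (name, value) tuples
        return "\n".join(f"{k}: {v}" for k, v in rows)

class ReportAdapter:
    """The glue: the new codebase passes a list of dicts instead."""
    def __init__(self, legacy=None):
        self.legacy = legacy or LegacyReport()

    def render(self, records):
        # Reshape the new format into the one the old component expects.
        rows = [(r["name"], r["value"]) for r in records]
        return self.legacy.render(rows)

print(ReportAdapter().render([{"name": "glue lines", "value": 300}]))
```

Here the glue is a dozen lines against a toy component; real glue also has to reconcile error handling, data lifetimes, and edge cases, which is where the line counts above come from.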
There are other cases too: sometimes the old implementation wasn't all that well done. Fred Brooks, in The Mythical Man-Month, talks about the second-system effect. When doing the first implementation, it was all the builders could do to get it to work at all, and they were likely changing algorithms and interfaces all along; it's quite likely the first implementation is kludgey and not very good. The second time, they know better, but now they have a whole bunch of new ideas, and the result gets a whole lot of bloat. On the third try, they're getting closer to the Goldilocks point: not too big, not too small, refined, shaken down, with appropriate algorithms and well-thought-out interfaces. If the code you're trying to reuse is a first or second implementation, it's very likely you're just propagating a bad thing.
Finally, there are a lot of things that are more appropriately reimplemented. For example, simple searches or insertions that occur at user time. These are such simple algorithms that nearly everyone who has written code has done them several times and won't get them wrong. A new implementation will fit the new codebase perfectly and avoid any mingled source-tree complications. Even if it uses totally naive algorithms, it's often better to be simple than optimal, and it's likely that there's no actual algorithmic advantage to be gained (e.g., a bubble sort is so much simpler that it's actually faster than N log N sorts if there are fewer than a dozen or so things to be sorted).
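As a sketch of the small-input point, here's the naive sort in Python (illustrative only--in practice you'd normally just call the library's sorted()):

```python
def bubble_sort(items):
    """Naive O(n^2) sort: trivial to write correctly, and with a dozen
    or so items its low overhead makes it perfectly adequate."""
    items = list(items)                      # work on a copy
    for end in range(len(items) - 1, 0, -1):
        swapped = False
        for i in range(end):
            if items[i] > items[i + 1]:
                items[i], items[i + 1] = items[i + 1], items[i]
                swapped = True
        if not swapped:                      # already sorted: stop early
            break
    return items

print(bubble_sort([5, 2, 9, 1]))             # [1, 2, 5, 9]
```

Production library sorts make the same tradeoff internally: many switch to a simple quadratic sort for small runs, because at that scale the constant factors beat the asymptotics.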
28 July 2011
23 July 2011
Debt Limit
Suppose you're someone who needs their car for their job--a traveling salesman, for example. Your predecessor has run up ridiculous debts and left the scene. It wasn't your fault, but you're now left holding the bag and you must deal with the situation. One of the changes your predecessor made was to cut your hours worked, hence your income. The credit company has been on your case for months, and now they've announced that they'll send out the collection agency to take your car on August 2nd.
You've got a number of people giving you advice:
Nancy and Harry say you need to buckle down and work more hours, and do what you can to pay off at least part of the debt.
Michelle says the collection agency thing is an empty threat that you can safely ignore.
Eric and John, who incidentally were a big part of the spending spree and have been whispering that it was you who really caused the problem even though you weren't even with the company when most of it happened, say you need to cut way back on spending, including car maintenance, your advertising budget, and your health care. You must not work more hours--in fact, you should work fewer. This is so important to them that they'll willingly bankrupt the company to keep their mid-day tee times.
Barry says you need to both cut back on spending (but not anything that affects income or long-term sustainability) and work more hours.
Which advice should you take?
22 July 2011
Display Resolution and Aspect Ratio
In 1994, the "Digital HDTV Grand Alliance" decided upon a new television standard for the whole world. They specified a list of resolutions, most with an aspect ratio of 16x9; they chose to allow various aspect ratios for individual pixels; and they did not specify any resolutions past 1920x1080. These definitions were hugely counterproductive, not just for the TV industry but also for the then rapidly growing personal computer industry, confusing for most users, and completely, utterly unnecessary.
Between the invention of moving pictures and about 1960, nearly all video had an aspect ratio of 4x3. There was good reason for this: many things in our environment, from buildings to sporting events to conversation groups, fit well in the 4x3 frame. When television came along, the movie business felt threatened, and in its effort to come up with a more immersive experience it invented wider aspect ratios. Because of the cathode ray tube display technology in use at the time, TV could not duplicate this. Many movies, but by no means all, embraced the new mode and placed subjects wide apart in the frame. When those movies came to be displayed on 4x3 screens, the incompatibility was addressed with two bad options: "pan and scan", which loses the width of the widescreen cinematography, and "letterbox", which shows the whole picture but in a much reduced window on only part of the screen. The movie people were understandably upset by the damage done to their cinematography...a problem entirely of their own creation.
When digital TV became possible, the same people saw a way out of the box into which they'd painted themselves: they'd make the new TVs support widescreen. These same people managed to dominate the conversation. Never mind that movies are only a small part of TV content and that 4x3 worked very well for almost everything else. Never mind that the new digital technology made it trivial and transparent to support any resolution you chose to transmit in. They knew analog TV hardware and movies, and they made a standard with that mindset, looking only five years or so ahead. They didn't understand Moore's law, or really anything about computers at all, even though computers were central to what they were doing.
What they should have done was standardize two things: the digital data format and square pixels. Period. They should specifically NOT have specified resolution. If I want to watch a lot of movies, I'll buy a display with a wide aspect ratio. If I'm a sports fan or TV news addict, I'll buy a display with a low aspect ratio. If I occasionally want the other kind of content, I'll suffer in exactly the way I always have with movies. But it'll be MY choice, and I'll choose the style that serves me best. The technology to adapt easily is a necessary part of every digital TV display (and personal computer). There's no need to standardize.
The free market is a powerful thing. When higher resolution displays become available, the market will decide which resolutions to sell, not some committee stuck in the mistakes it made in the 1950s. The software and hardware to adapt content to match the display (it's called stretching and cropping) is a necessary part of every digital TV and computer display controller. Plugging in a different set of numbers is simple. Adapting content resolution to style, cost, and the availability of bandwidth is equally simple. If half the country is watching the Super Bowl or some major movie, it makes sense to use a lot of bandwidth. If I'm watching a talking head discuss current events, I don't mind if my picture is being transmitted at 320x240. Save the money the bandwidth costs.
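And "plugging in a different set of numbers" really is just arithmetic. A minimal sketch in Python (the function name and sample numbers are mine, for illustration):

```python
def fit(src_w, src_h, dst_w, dst_h, crop=False):
    """Scale a source picture to a display. crop=False letterboxes
    (whole picture visible, possibly with black bars); crop=True fills
    the screen and trims the overflow. The picture's own aspect ratio
    is preserved either way -- no distortion."""
    scale = (max if crop else min)(dst_w / src_w, dst_h / src_h)
    return round(src_w * scale), round(src_h * scale)

print(fit(1920, 1080, 1024, 768))             # letterbox: (1024, 576)
print(fit(1920, 1080, 1024, 768, crop=True))  # crop:      (1365, 768)
```

Letterboxing picks the smaller scale factor so the whole picture fits; cropping picks the larger one so the screen fills. That's the entire adaptation problem.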
It turns out that most people have a hard time understanding resolution and aspect ratio. I'm not sure why this simple thing should be so hard, but it is. And because the HD Alliance chose to provide for both 1:1 and 3:4 pixels, they dramatically increased the confusion. When 4x3 content with 1:1 pixels appears, a great many TVs stretch it to a wide aspect ratio to fill the screen--distorting the picture. Moreover, the manufacturers tend to bury the necessary mode controls--I suppose because usability testing found them confusing--and consequently an amazing number of people have gotten so used to watching distorted pictures that they've stopped noticing. Had the standard simply required 1:1 pixels, the TV could have adapted automatically. No user controls would be necessary. Instead, we have various and confusing "wide", "panorama", "4x3", "zoom", etc., modes.
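The arithmetic behind pixel shape is equally tiny. A sketch in Python (the 1440x1080-with-4:3-pixels case is one of the anamorphic HD formats, used here only as an example):

```python
def display_aspect(width, height, pixel_aspect=1.0):
    """Shape of the displayed picture, given the stored resolution and
    the pixel aspect ratio (PAR). With square pixels (PAR = 1.0), the
    stored resolution alone says what shape to draw."""
    return width * pixel_aspect / height

print(display_aspect(640, 480))          # 1.333... -> a 4x3 picture
print(display_aspect(1440, 1080, 4/3))   # 1.777... -> a 16x9 picture
```

Fix PAR at 1:1 and the third argument disappears: the set never has to guess, and the mode buttons become unnecessary.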
HD aspect ratio sickness has infected computers, too. In 2003 I bought a laptop with a 1400x1050 (so-called SXGA+) display. Today, it's fairly challenging to find a laptop with a vertical resolution higher than 768--73% of that 8-year-old computer's. This is because the market for display surfaces is completely dominated by the TV business--just as in the days of CRTs--and most small TVs are 1280x720. You can buy a monitor with much higher resolution, up to 2560x1600, but even finding out the resolution of a given display from the marketing literature or the salescritter is surprisingly difficult. I care MUCH less how big a display is in inches than how much VERTICAL resolution it has. Nearly all applications and websites have wide menu bands across the top and bottom, dramatically reducing the space available: the window I'm using to type this subtracts 350 pixels from the top and 160 from the bottom, leaving only 258 pixels of usable space on a 768-high monitor. Windows Live Photo Gallery (the app that provoked this flame) steals 200 pixels from the top and 95 from the bottom in editing mode--38% of the screen. What could be more important in a photo app than being able to get the picture as big as possible?