14 April 2012

The Laffer Curve

Arthur Laffer didn't invent the concept that bears his name, but he popularized it during the early years of the Reagan administration.  The idea is that as tax rates go up, people have less incentive to earn income that will largely be taxed away.  There's clearly a grain of truth to this: who would work harder for less money?  But is that what any of the proposed or historical tax rates have done?

Let's take an extreme example:  Suppose income under $100K is taxed at 25% and income over $100K is taxed at 90%.   Someone who earned $99,999 after deductions and exemptions gets to take home $74,999.25.   Another person who earned $100,001, two dollars more, gets to take home $75,000.10--only 85 cents more than the first person.  Is that enough of a difference to discourage earning the extra money?  Hardly: the higher earner only pays just over a dollar extra on their tax bill--probably not a big deal.  But what if the higher earner had triple the income?  A $300K earner only gets to take home $95K under such a tax structure.  Clearly a better income, but the consequence of the high tax rate is significant: you're only keeping $10K of each extra $100K you make.  That surely reduces the incentive to work harder.
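
To make that arithmetic concrete, here's a small Python sketch (mine, not part of the original post) that computes take-home pay under this hypothetical two-bracket schedule and reproduces the numbers above:

def take_home(income, bracket=100_000, low_rate=0.25, high_rate=0.90):
    """After-tax income under a simple two-bracket marginal schedule."""
    if income <= bracket:
        return income * (1 - low_rate)
    # Only the amount over $100K is taxed at 90%; the first $100K stays at 25%.
    return bracket * (1 - low_rate) + (income - bracket) * (1 - high_rate)

for gross in (99_999, 100_001, 300_000):
    print(f"${gross:>7,} gross -> ${take_home(gross):,.2f} take-home")

# $ 99,999 gross -> $74,999.25 take-home
# $100,001 gross -> $75,000.10 take-home
# $300,000 gross -> $95,000.00 take-home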

But that's clearly not what's happening, or what's being proposed.  The highest credible proposal by anyone in the current discussion is a top marginal rate of 39.6%, and that would apply only to income over $372,000 (roughly the dividing line between the 99% and the 1%, by the way).  So you get to keep 60.4% of any income you make over that threshold.  Would you work nearly as hard for that extra money?  Of course you would.

We actually have quite a bit of empirical data on this.  The top marginal rate went up to 63% for 1932 and 79% in 1936 (for those making over $80M), and from 1942 through 1963 it varied between 88% and 94%.  Kennedy brought it down to 70%, and Reagan to 50% in 1982 and 38% in 1987.   Did people work less hard because of these extreme rates?  There's not one shred of evidence supporting that.  What they did do is find other ways to take their income:  since capital gains are taxed at a lower rate, they found ways to take income in stock or other assets.  And realistically, these extreme rates were mostly symbolic: Henry Kaiser, J. Paul Getty, and Howard Hughes paid them, but they also got special business relationships with the government during the war.  It was good propaganda: they could be seen to be doing their part for the war effort.

The evidence is pretty solid:  Tax rates have essentially no effect on how hard people work or how much they invest until the rate gets to about 70%.   Above that level, some people start backing off, and over 90%, nearly everybody backs off.   The Laffer Curve is a real phenomenon, and it might have been relevant to a discussion during the Eisenhower and Kennedy administrations, and even a little as late as Reagan.  But the top marginal rate today is only about half the Laffer threshold, and nobody is suggesting going much higher than that.

The math is fairly straightforward.  If you increase taxes, you increase government revenue, right up to the point that the Laffer threshold is exceeded.   The optimal thing, financially, is to have the highest rate that does not exceed that.  This probably isn't too good politically, and of course there's always the question of what government might do with the extra money.  But during a time when we're firing teachers and firemen because of severe revenue shortages, that's not really an issue.
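
Here's an illustrative toy model (an assumption of mine, not data from the post) of that revenue logic: revenue is the rate times the reported tax base, and the base is assumed to start shrinking once the rate passes the roughly 70% threshold described above.

def reported_base(rate, full_base=100.0):
    """Hypothetical taxable base; shrinks once the rate passes ~70% (assumed)."""
    if rate <= 0.70:
        return full_base  # below the threshold: no behavioral response
    # Assume the base falls off linearly above 70%, vanishing at a 100% rate.
    return full_base * max(0.0, 1.0 - (rate - 0.70) / 0.30)

def revenue(rate):
    return rate * reported_base(rate)

best_revenue, best_rate = max((revenue(r / 100), r) for r in range(0, 101))
print(f"In this toy model, revenue peaks at a rate of about {best_rate}%.")

Under those assumptions the revenue-maximizing rate sits right at the threshold, which is the point the paragraph above is making; the shape of the falloff is invented for illustration, not estimated from any data.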
