Although dividends have regained a bit of their luster in the eyes of investors recently, many stock market observers continue to chant a mantra that became fashionable in the last half of the 1990s: that low dividend payouts bode well for future profit growth.
As the gurus see it, the traditional emphasis on dividends was shortsighted. By retaining more earnings, companies can enhance productive investment. And investors benefit doubly because their payouts come in the form of larger long-term capital gains, which are taxed at lower rates than dividends.
If that view is correct, then you should expect faster earnings growth and stock-price appreciation to follow periods in which dividend-payout rates are low. Indeed, because payout rates hit historically low levels in recent years and have risen of late only because of the recession's cyclical hit to earnings, many New Economy advocates still see superstrong earnings growth ahead.
Investment strategists Robert D. Arnott of First Quadrant LP and Clifford S. Asness of AQR Capital Management LLC are dubious. In a recent study, they found that over the past century or so, low dividend-payout rates on the Standard & Poor's 500-stock index consistently foreshadowed low average real earnings growth over subsequent 10-year periods. And high dividend-earnings ratios signaled high earnings growth.
Focusing on the period since 1946, they report that on average, real S&P 500 earnings actually declined by 0.7% a year in the decades following years with very low dividend-earnings ratios. By contrast, the 10-year periods following years with very high payout rates boasted average annual gains in real earnings of 3.2%. The authors found that the dividend-payout rate was an even better forecaster of future earnings gains than the price-earnings ratio or the interest-rate yield curve.
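The relationship the study reports can be sketched with a simple regression. The data below are synthetic, built only to mimic the positive payout-growth link described above; this is not Arnott and Asness' actual data or methodology.

```python
import numpy as np

# Synthetic illustration only: dividend-payout rates and subsequent
# 10-year annualized real earnings growth, constructed so that higher
# payouts go with higher growth (as the study reports for the S&P 500).
rng = np.random.default_rng(0)
payout = rng.uniform(0.3, 0.9, size=50)                    # payout rates
growth = -0.04 + 0.08 * payout + rng.normal(0, 0.005, 50)  # growth rates

# Ordinary least squares fit: growth ~ a + b * payout
b, a = np.polyfit(payout, growth, 1)
print(f"slope = {b:.3f}")  # positive slope: high payout, high growth
```

A positive fitted slope is the signature of the study's finding; a negative slope would instead support the low-payouts-mean-high-growth view.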
Why should high current dividend payouts forecast high earnings growth and low payouts low growth? One obvious reason may be that many managers who see strong earnings growth ahead feel they have room to raise dividends, while low payout rates signal fears that earnings may not be sustainable.
Another reason, say the researchers, may be that many managers who shortchange dividends tend to use retained earnings for unproductive "empire-building activities" that ultimately hurt future earnings. "Companies that retain a smaller share of earnings are likely to choose their investments a lot more carefully than those who decide to forgo dividends," theorizes Arnott.
Whether the low dividend payouts of the late 1990s reflected wasteful or wise investments won't be clear for some years. But the high-tech market bust and the results of Arnott and Asness' historical study are hardly reassuring.

Since 1994, nearly 40 U.S. cities have passed living-wage ordinances. Such laws seek to reduce urban poverty by requiring employers doing business with the city to pay their employees a basic wage significantly higher than the minimum wage--which has failed to keep pace with inflation since the 1960s.
Many economists, however, think such policies are self-defeating because raising pay for some workers above market levels will reduce overall employment of low-wage workers. Now, a new study finds such fears to be overblown.
The study is intriguing because it was done by David Neumark of Michigan State University, long a critic of minimum-wage increases. Neumark analyzed the actual experience of 36 cities with living-wage laws, from Boston, Baltimore, and Chicago to Minneapolis, Denver, and San Francisco.
As he expected, Neumark found that the passage of such laws did tend to reduce employment somewhat among low-wage workers in urban areas. But he also found that living-wage laws had a wider impact on wages than anticipated--especially in cities in which the laws applied both to businesses contracting with the city and to employers receiving government aid in the form of reduced taxes or other goodies.
On balance, says Neumark, the higher wages brought about by living-wage laws appear to outweigh the effects of job losses--resulting in a moderate decline in urban poverty.

According to the Tax Foundation, the tax burden of the top 25% of federal income taxpayers grew appreciably in the 1990s, from 77.2% of total taxes collected in 1989 to 83.5% in 1999, while their average tax rate climbed by 2.4 percentage points, to 18.7%. Indeed, the Washington-based research group's analysis of Internal Revenue Service data suggests that the upper 50% of taxpayers paid a bigger share of taxes and a higher average tax rate.
As is often the case, however, the devil is in the details. A closer look at the data reveals that average tax rates paid by income groups other than the highest 10% of taxpayers actually declined a bit. And the bottom 95% of taxpayers shouldered less of the total tax burden than they did in 1989.
The explanation: The rise in overall tax rates for the upper half was driven by a huge jump in income received by the wealthy, especially the top 1% of taxpayers. That lucky group's share of adjusted gross income reported by all taxpayers rose by 37%, to nearly one-fifth of the total. Because of the progressive tax system (and the 1993 Clinton tax hike), such outsize income gains kicked up their average tax rate from 23.3% to 27.5%.
Over the decade, the average adjusted gross income of the top 1% of taxpayers rose from $420,000 to $915,000. That of the bottom half of taxpayers rose by about $3,600, to $12,450.
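The tax figures above hang together arithmetically, which a quick check confirms. The 19.5% figure for the top 1%'s 1999 income share is an assumption here, read off the article's "nearly one-fifth"; the other numbers come straight from the text.

```python
# Cross-check of the reported Tax Foundation figures.
# Assumption: the top 1%'s 1999 income share ("nearly one-fifth") is
# taken as roughly 19.5%; the 37% rise is as reported in the article.
share_1999 = 0.195
rise = 0.37
share_1989 = share_1999 / (1 + rise)
print(f"implied 1989 top-1% income share: {share_1989:.1%}")  # 14.2%

# Top 25%: average tax rate climbed 2.4 points to 18.7%, so it implies
# a starting rate of 18.7 - 2.4 = 16.3% in 1989.
rate_1989 = 18.7 - 2.4
print(f"implied 1989 average rate for top 25%: {rate_1989:.1f}%")
```

The implied 1989 share of roughly 14% is consistent with the reported jump to nearly one-fifth of all adjusted gross income by 1999.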