John Hussman: Stock valuations are ‘wicked’ if you normalize for profit margins

In his Weekly Market Comment out this morning, John Hussman encourages investors to look critically at the data-generating process behind any new data release – "Data doesn't just drop from the sky or out of a computer. It is generated by some process, and for any sort of data, it is critical to understand how that process works." He throws some cold water on last week's excitement over new housing starts:

In the end, the data-generating process [for new housing starts] features millions of underwater homes, huge REO inventories, and yet constrained supply. The result is more stable home prices, but a misallocation of capital into new homes despite a glut of existing homes that cannot or will not be brought to market. So starts are up even though existing home sales are down. Inefficient or not, there’s no indication that inventory will be abruptly spilled into the market, so if the slow dribble continues, we’ll probably continue to see gradual growth in housing starts that competes with a gradual release of inventory. This isn’t an indication of economic resilience or housing “liftoff,” but is instead an indication of market distortion and misallocation of scarce capital.

Hussman once again brushes aside claims that stocks are currently undervalued:

According to Ned Davis Research, stock market capitalization as a share of GDP is presently about 105%, versus a historical average of about 60% (and only about 50% if you exclude the bubble period since the mid-1990s). Market cap as a fraction of GDP was about 80% before the 1973–74 market plunge, and about 86% before the 1929 crash. 105% is not depressed. Presently, market cap is elevated because stocks seem reasonable as a multiple of recent earnings, but earnings themselves are at the highest share of GDP in history. Valuations are wicked once you normalize for profit margins. Given that stocks are very, very long-lived assets, it is the long-term stream of cash flows that matters most – not just next year's earnings. Stock valuations are not depressed as a share of the economy. Rather, they are elevated because they assume that the highest profit margins in history will be sustained indefinitely.
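
To make the margin-normalization arithmetic concrete, here is a minimal sketch. All the numbers in it are hypothetical round figures chosen for illustration – they are not Hussman's or Ned Davis Research's data:

```python
# Illustrative sketch of margin normalization. Every figure below is a
# hypothetical round number chosen for clarity, not actual market data.

pe_on_recent_earnings = 14.0   # hypothetical P/E using recent earnings
current_profit_margin = 0.10   # hypothetical current margin (earnings / sales)
historical_avg_margin = 0.06   # hypothetical long-run average margin

# If margins revert to their long-run norm, earnings shrink proportionally,
# so the P/E on normalized earnings scales up by the ratio of the margins.
pe_on_normalized_earnings = pe_on_recent_earnings * (
    current_profit_margin / historical_avg_margin
)

print(f"P/E on recent earnings:     {pe_on_recent_earnings:.1f}")
print(f"P/E on normalized earnings: {pe_on_normalized_earnings:.1f}")  # ~23.3
# A market that looks reasonably priced against record-high earnings can be
# expensive once those record margins are normalized away.
```

The point of the exercise: the same price level can look cheap or rich depending entirely on whether you treat today's margins as permanent.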

And he finds an interesting pattern in how macroeconomic data comes in relative to expectations:

On the economic front, careful consideration of the data-generating process provides insight into how "surprises" can emerge in a very predictable way. For example, although short-term economic data isn't particularly cyclical, the expectations of investors and economists typically swing too far in the direction of recent news, which in turn creates cycles in economic "surprises" because not many periods contain an utter preponderance of only-good or only-bad data. When this process is modeled, the same behavior can be produced even in purely random data. The length of the cycle appears to be proportional to the length of the "lookback" period used to determine whether the recent trend of the data is favorable or unfavorable.
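
To see how cycles in "surprises" can emerge from data that contains no cycle at all, here is a minimal simulation sketch. It is my own construction under the assumptions Hussman describes, not his actual model: consensus expectations are modeled as a trailing average over a lookback window, a surprise is the realized value minus that consensus, and the index is a rolling sum of surprises:

```python
import numpy as np

rng = np.random.default_rng(0)

def surprise_index(data, lookback):
    """Consensus = trailing mean over the lookback window (expectations
    extrapolate recent news); surprise = realized value minus consensus;
    index = rolling sum of surprises over the same window."""
    surprises = np.zeros(len(data))
    for t in range(lookback, len(data)):
        consensus = data[t - lookback:t].mean()
        surprises[t] = data[t] - consensus
    return np.convolve(surprises, np.ones(lookback), mode="same")

data = rng.standard_normal(3000)  # pure noise: no true cycle in the data
for lookback in (11, 22, 44):
    idx = surprise_index(data, lookback)[2 * lookback:]
    # Rough cycle-length estimate from sign changes of the index
    flips = np.count_nonzero(np.diff(np.sign(idx)))
    print(f"lookback {lookback:2d} -> avg swing ~{2 * len(idx) / flips:.0f} periods")
```

Run it and the average swing length of the "surprise" index grows roughly in step with the lookback window, even though the underlying data is white noise – exactly the proportionality Hussman describes.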

Case in point: there's a perception that the recent economic data has somehow changed the prospects for a U.S. recession. The idea is that while the data has remained generally weak, the latest reports have been better than expectations. However, it turns out that there is a fairly well-defined ebb-and-flow in "economic surprises" that typically runs over a cycle of roughly 44 weeks (which is by no means a magic number). The Citigroup Economic Surprises Index tracks the number of individual economic data points that come in above or below the consensus expectations of economists. I've updated a chart that I last presented in 2011, which brings that 44-week cycle up to date. Conspiracy theorists take note – the recent round of "surprises" follows the fairly regular pattern that we've observed in recent years. There's no manipulation of the recent data that we can find – it just happens that the sine wave will reach its peak right about the week of the general election.
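
The index construction itself is simple in spirit. Here is a stylized version – not Citigroup's proprietary methodology, which weights surprises and decays them over time – that just nets data points beating consensus against those missing it over a trailing window of releases:

```python
import numpy as np

def net_surprises(actual, consensus, window=13):
    """Stylized surprise index: rolling net count of releases that beat
    consensus (+1) minus those that missed (-1) over a trailing window.
    Citigroup's actual index weights and decays surprises over time;
    this is only a simplified approximation of the idea."""
    hits = np.sign(np.asarray(actual) - np.asarray(consensus))
    return np.convolve(hits, np.ones(window), mode="valid")

# Toy usage with made-up releases (actual vs. consensus forecast):
actual    = [2.1, 1.8, 3.0, 2.2, 1.5, 2.9, 2.4]
consensus = [2.0, 2.0, 2.5, 2.5, 2.0, 2.5, 2.5]
print(net_surprises(actual, consensus, window=4))  # [ 0. -2.  0. -2.]
```

Feed an index like this the trailing-mean expectations from the earlier simulation and the ebb-and-flow of "surprises" appears on schedule, with no manipulation required.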