Why Models Can Never Get Things Quite Right
Their downfall is the assumption that the future will be like the past.
Bloomberg, May 4, 2020
The economy is frozen because of the coronavirus. In all likelihood, it can’t fully reopen until we have proper testing and tracing, an effective vaccine or treatment, or population-wide immunity, none of which may ever happen. The total number of infections and deaths we expect this year and next is still a guess. Epidemiologists have modeled various probabilities, with a wide range of possible outcomes.
Yet, we have placed a great deal of faith in those models and many others because so much of our lives is guided by them. We don’t live in objective reality; in truth, we function in a model of our own construction. Our brains generate mental outlines, continually filling in missing information to form a picture that we can discern and identify. It is a useful evolutionary trait that has allowed us to survive in a world hostile to us soft, chewy creatures without claws, fangs, or armor.
So why is that a problem? Because we fail to recognize that models are “not a report sent back from the future,” according to journalist Jonathan V. Last. Models are constructed of “stuff we know, stuff we think we know, and stuff we have no idea about.” There’s a lot of stuff we think we know about Covid-19 and probably even more that we don’t, which is why the coronavirus models have given such a wide range of possible outcomes in terms of infection mortality.
The standard caveat from statistician George E. P. Box was that “all models are wrong, but some are useful.” Mathematical models can help us make sense of the world, provided our assumptions are valid and we don’t feed them bad data. Box reminds us that mathematics creates a shadow of reality, and that the universe is much more complicated than any of our models suggest.
More to the point, every model is flawed because the underlying premise of all of them is that the future will look like the past. Nothing throws a model off more than when “stuff we have no idea about” destroys that fundamental premise.
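To make that premise concrete, here is a deliberately trivial sketch in Python. The numbers are made up purely for illustration; the point is only that a model which summarizes history performs fine until the history stops repeating itself.

```python
# A toy "model" fit only to history: average monthly demand over the past
# three years, used to forecast next month. All numbers are invented for
# illustration; nothing here reflects any real data set.

history = [100 + (m % 12) for m in range(36)]  # 36 months of stable, mildly seasonal demand

def naive_forecast(past):
    """Predict next month as the average of the past,
    i.e. assume the future will look like the past."""
    return sum(past) / len(past)

prediction = naive_forecast(history)

# "Stuff we have no idea about": an abrupt regime change (a lockdown,
# a default, a price war) that the historical sample never contained.
actual = 20  # demand collapses to a level never seen in the training window

print(f"forecast: {prediction:.1f}, actual: {actual}, error: {prediction - actual:.1f}")
# The model faithfully summarizes the past; it fails because the premise
# that the future resembles the past no longer holds.
```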
In many cases, the errors that models produce are unimportant. Netflix has built a very nice business by modeling what you are most likely to enjoy streaming based on your viewing habits relative to what the rest of its 167 million subscribers enjoy. If the algorithms get it wrong, the downside is the service suggests a movie you end up not liking.
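Netflix’s actual recommendation system is far more elaborate than anything shown here, but the general idea (recommend what similar viewers enjoyed) can be sketched in a few lines. The viewers, titles and ratings below are hypothetical.

```python
# Schematic user-based collaborative filtering: suggest titles that viewers
# with similar tastes rated highly. Hypothetical data; not Netflix's algorithm.

ratings = {
    "you":    {"Heat": 5, "Ronin": 4},
    "user_a": {"Heat": 5, "Ronin": 5, "Collateral": 4},
    "user_b": {"The Notebook": 5, "Up": 4},
}

def similarity(mine, theirs):
    """Crude overlap score: count of titles both viewers rated 4 or higher."""
    return sum(1 for title, r in mine.items() if r >= 4 and theirs.get(title, 0) >= 4)

def recommend(target="you"):
    mine = ratings[target]
    scores = {}
    for user, theirs in ratings.items():
        if user == target:
            continue
        sim = similarity(mine, theirs)
        for title, r in theirs.items():
            if title not in mine and r >= 4:
                scores[title] = scores.get(title, 0) + sim * r
    return max(scores, key=scores.get) if scores else None

print(recommend())  # "Collateral" -- if the model is wrong, the cost is a dud movie night
```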
Where this becomes much more costly and dangerous is when we forget these models are imperfect depictions of reality. In 2008, economists, with all of their elaborate mathematical constructs, failed to see the financial crisis coming. None of their models anticipated the rise of nonbank lending and mortgages that eschewed such niceties as credit checks, employment verification and basic loan-to-value ratios. Nor did the models ever fold in the role of subprime securitizations in inflating the housing market beyond what incomes were capable of supporting.
Our faith in financial models should already have been tempered by the collapse a decade earlier of the hedge fund Long-Term Capital Management, whose complex models, designed by a pair of Nobel laureates, never anticipated that Russia would default on its debt.
What are some events that have never happened before that are happening right now?
• The move toward negative interest rates in the U.S. is blowing up the Federal Reserve’s model of inflation and yield. As the old adage goes, if the Fed can’t model something, it assumes it doesn’t exist.
• The Bureau of Labor Statistics has never had to cope with 20% of the workforce filing for unemployment within a month. That surely wasn’t in any of the calculations.
• Value investing, the model championed by Benjamin Graham and Warren Buffett, is in trouble because it didn’t anticipate low inflation, low interest rates and enormous Fed market interventions.
• The model for speculating and hedging in oil has come undone. With two of the world’s biggest producers in a price war, a collapse in demand amid the pandemic and a lack of crude oil storage space, futures prices briefly traded at a negative $40 a barrel last month — meaning those holding oil had to pay to have it taken off their hands.
Keep in mind that in terms of the pandemic, we’re not really talking about a black swan event, a term popularized by Nassim Taleb — who by the way doesn’t believe Covid-19 qualifies as an “unpredictable, rare, catastrophic event.” Bill Gates saw it coming back in 2015. So did every pandemic team, including some that the Trump administration disbanded or downgraded. The decision was made to save a few dollars in preparedness, betting that the risk would never materialize.
So in that sense, the models weren’t even needed to help us see the coronavirus catastrophe coming; all that was necessary was some simple common sense.
Previously:
Forecasting? Wall Street is Wising Up (December 15, 2017)
~~~
I originally published this at Bloomberg, May 4, 2020. All of my Bloomberg columns can be found here and here.