Forecasting’s Problem is Philosophy, Not Track Record

The Philosophical Failings of Forecasting
No one can foretell the future. But investors who think that they can tend to make predictable errors.
Bloomberg, January 5, 2017




This time of year is peak forecasting season — holiday retail sales, lists of stocks you should buy this year and, of course, market forecasts all keep economists, strategists and analysts busy. I always make time to mock some of the sillier approaches to prediction-making. Indeed, I have been doing this for so long that some pushback has developed against the idea of critiquing the annual forecasting follies.

For today, let’s skip the usual bashing of forecasting; it is too easy. Instead, I want to look at the underlying cognitive and philosophical failings that are associated with the forecasting industry. This context should provide a framework for understanding the problems and investing risks of forecasting.

My interest in the prediction business traces back to William Sherden’s 1998 book, “The Fortune Sellers: The Big Business of Buying and Selling Predictions” (see excerpt). That book makes the point that even 5,000 years ago, “forecasting was widely practiced in the ancient world in the form of divination, the art of telling the future by seeing patterns and clues in everything from animal entrails to celestial patterns.” [1]

Sherden explains how a wide range of prognosticators — consultants, economists, investment advisers and others — have turned the dark arts of foretelling the future into a lucrative profession. They have successfully developed the tools to separate those who badly want to know the unknowable from their money. Given how frequently this involves finance-related professions such as stock analysis, banking, investing, trading and economics, our ongoing interest in the topic should be quite understandable.

As with so many issues that involve money, cognitive problems arise.

People construct their sense of self by making a number of reasonable, albeit incorrect, assumptions about the world. They often begin by pretending to know what is going on around them. People dislike uncertainty — it forces them to acknowledge how little they actually understand about the complexities of the world they live in. For many, recognizing these limitations is an unpleasant concession.

Philosophically, most people don’t like to admit the inherently random nature of life. This manifests itself in a variety of ways, but most typically as taking credit for successes while placing blame elsewhere for failures. Success, even when the credit belongs to coincidence or mere luck, is inevitably followed by overconfidence. Too often, self-inflicted errors come next.

Believing in predictions allows people to overlook their own ignorance, discount the role of randomness and generally overestimate their own skills. If you think you (or someone you pay) can divine the future, you create the illusion of control and stability, where often there is none. Order is created out of chaos; it is a comforting illusion.

For investors, one of the biggest risks of forecasting is the unfortunate tendency to stay wedded to predictions. Consider as an example the person who makes a bearish or bullish forecast. The market then goes against them. Rather than admit the error, they double down on the claim. The fear isn’t only of being wrong, but of looking even more foolish by capitulating just as the unexpected move comes to an end. This fear has caused legions of investors to miss big gains or to sell at the lows after a crash.

You can avoid that sort of behavior by building in rules that reduce these errors. The line in the sand — also known as a stop loss — is where one must admit a trade isn’t working out, and then unwind it. Famed technician Ned Davis once described this issue by posing a question: “Do you want to be right or do you want to make money?”
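A stop-loss rule of the kind described above can be sketched in a few lines of Python. This is only an illustration; the function name, the 8% threshold, and the price inputs are my assumptions, not anything the column specifies.

```python
def should_exit(entry_price: float, current_price: float,
                stop_pct: float = 0.08) -> bool:
    """Return True once price has fallen stop_pct below entry,
    i.e. the pre-committed 'line in the sand' has been crossed.
    The 8% default is an arbitrary illustrative choice."""
    return current_price <= entry_price * (1.0 - stop_pct)

# Because the exit level is set before the trade, unwinding is
# mechanical -- not a debate with one's own forecast.
print(should_exit(100.0, 91.0))   # a 9% drawdown crosses an 8% stop
print(should_exit(100.0, 95.0))   # a 5% drawdown does not
```

The point of pre-committing to the rule is that the decision to exit is made while you are still objective, before ego and sunk costs have a vote.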

Refusal to admit error is an obvious ego-driven foible, but I keep coming back to the bigger issue of pretending we can accurately discern the future. We can acknowledge our inability to foresee what will happen. We can also take specific steps toward a better process for thinking about what we do and don’t know about the future, in order to make better decisions, as Wharton School professor Philip Tetlock has shown.

The sooner we understand what is and isn’t knowable, the better off we — and our portfolios — will be.



  1. That book also makes the point that fear of the future has led to the rise of every religion. That is a discussion best saved for another time.

Originally: The Philosophical Failings of Forecasting

