There is a very interesting article in Wired this month, ostensibly about the tribulations of the modern scientific method, big pharma’s drug development approach, etc. But within the article is an excellent digression about the complexities of causation:
“Causes are a strange kind of knowledge. This was first pointed out by David Hume, the 18th-century Scottish philosopher. Hume realized that, although people talk about causes as if they are real facts—tangible things that can be discovered—they’re actually not at all factual. Instead, Hume said, every cause is just a slippery story, a catchy conjecture, a “lively conception produced by habit.” When an apple falls from a tree, the cause is obvious: gravity. Hume’s skeptical insight was that we don’t see gravity—we see only an object tugged toward the earth. We look at X and then at Y, and invent a story about what happened in between. We can measure facts, but a cause is not a fact—it’s a fiction that helps us make sense of facts.”
I am not sure I completely endorse Hume’s perspective, but I recognize the very insightful Truth he discovered about the concept of causation: We humans love a grossly over-simplified narrative. It is the way we evolved. There are situations where long contemplation before taking action can be fatal; using shortcuts is effective and functional, helping us make snap decisions in the wild:
“The truth is, our stories about causation are shadowed by all sorts of mental shortcuts. Most of the time, these shortcuts work well enough. They allow us to hit fastballs, discover the law of gravity, and design wondrous technologies. However, when it comes to reasoning about complex systems—say, the human body—these shortcuts go from being slickly efficient to outright misleading.
Two centuries after Hume, in the 1940s, the Belgian psychologist Albert Michotte discovered “the launching effect,” a universal property of visual perception. The human visual process is filled with cognitive extrapolation: we construct our visual understanding of the world from much less data than we realize. Michotte showed how people rationalize what they see, creating false narratives that make navigating the world easier:
“There are two lessons to be learned [from Michotte’s experiments]. The first is that our theories about a particular cause and effect are inherently perceptual, infected by all the sensory cheats of vision. (Michotte compared causal beliefs to color perception: We apprehend what we perceive as a cause as automatically as we identify that a ball is red.) While Hume was right that causes are never seen, only inferred, the blunt truth is that we can’t tell the difference. And so we look at moving balls and automatically see causes, a melodrama of taps and collisions, chasing and fleeing.
The second lesson is that causal explanations are oversimplifications. This is what makes them useful—they help us grasp the world at a glance. For instance, after watching the short films, people immediately settled on the most straightforward explanation for the ricocheting objects. Although this account felt true, the brain wasn’t seeking the literal truth—it just wanted a plausible story that didn’t contradict observation.”
Of course, those survival aids don’t work well when it comes to complex risk analysis in financial markets:
“This mental approach to causality is often effective, which is why it’s so deeply embedded in the brain. However, those same shortcuts get us into serious trouble in the modern world when we use our perceptual habits to explain events that we can’t perceive or easily understand. Rather than accept the complexity of a situation—say, that snarl of causal interactions in the cholesterol pathway—we persist in pretending that we’re staring at a blue ball and a red ball bouncing off each other. There’s a fundamental mismatch between how the world works and how we think about the world.”
This has come up repeatedly over the years, especially with regard to the financial crisis and the Big Lie. It is not too generous to say that these folks are not evil; they merely carry the vestiges of evolution, a flawed analytical engine of which they seem wholly unaware.
For the rest of us, there is self-enlightenment. If we develop some awareness of these analytical errors and cognitive foibles, we at least stand a fighting chance of moving beyond erroneous perceptions toward truer understanding:
“The good news is that, in the centuries since Hume, scientists have mostly managed to work around this mismatch as they’ve continued to discover new cause-and-effect relationships at a blistering pace. This success is largely a tribute to the power of statistical correlation, which has allowed researchers to pirouette around the problem of causation. Though scientists constantly remind themselves that mere correlation is not causation, if a correlation is clear and consistent, then they typically assume a cause has been found—that there really is some invisible association between the measurements.”
In other words, yes, we can figure out actual causation.
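To make the correlation trap concrete, here is a minimal Python sketch (my own illustration, not from the article): two series that never influence each other can still correlate strongly when both are driven by a hidden third factor. The confounder here is hypothetical; think of broad market sentiment moving two otherwise unrelated assets.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000

# Hidden third factor, e.g. broad market sentiment (hypothetical).
confounder = rng.normal(size=n)

# Neither series causes the other; both respond to the confounder plus noise.
x = confounder + 0.5 * rng.normal(size=n)
y = confounder + 0.5 * rng.normal(size=n)

# Strong correlation (~0.8) despite zero causal link between x and y.
print(np.corrcoef(x, y)[0, 1])

# Regress out the confounder and the "relationship" vanishes (~0.0).
print(np.corrcoef(x - confounder, y - confounder)[0, 1])
```

A clear, consistent correlation like the first number is exactly what the article says scientists “typically assume” signals a cause; the second number shows why hunting for hidden drivers, and eliminating false narratives, is the harder and more important step.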
Where I part ways with Hume is in treating causation analysis as merely competing stories tying two facts together. It is larger than that: there is Causation-in-Fact. At the very least, we can eliminate the narratives that are demonstrably false. But to do so, we need to avoid the oversimplifications, the correlation errors, and the misapplied data, and recognize the complexity of causation in the world of finance.
For investors, the alternative is to live in a world of expensive cognitive errors . . .
Source:
Trials and Errors: Why Science Is Failing Us
Jonah Lehrer, Wired, January 2012
http://www.wired.com/magazine/2011/12/ff_causation/all/1