We Should Fail Better

There’s a Right Way and a Wrong Way to Fail
The options are: openly examine past mistakes and learn to avoid a reckoning – or repeat them.
Bloomberg, May 16, 2018



I have been thinking about errors, mistakes and failures ever since I traded my first stock decades ago. Good traders expect to be wrong, but that attitude is surprisingly rare in business. That is a shame, because having a healthy outlook on failure would benefit corporations, governments – just about everyone.

We shall dispense with the usual tired tales — yes, we all know that New England Patriots quarterback Tom Brady was picked in the sixth round of the National Football League draft, and that all-time basketball great Michael Jordan didn’t make his high school varsity team. Instead, let’s consider how we can better incorporate data into our processes, weigh open versus closed approaches, and learn to fail better.

If data is involved, then survivorship bias is not far behind. My favorite example involves Abraham Wald, a mathematician at Columbia University. Wald was a member of the War Department’s Statistical Research Group during World War II.1

In “How Not to Be Wrong: The Power of Mathematical Thinking,” Jordan Ellenberg describes how Wald addressed the challenge of armoring bombers so they could survive the fearsome attacks of fighter planes and anti-aircraft fire. The Center for Naval Analyses had performed a study showing the damage patterns of returning aircraft. Its recommendation was to add armor to those areas that showed the most damage: the plane’s wings, fuselage and tail. Wald rejected that, noting that if a plane could return with its wings shot up, that was not where armor was needed. Instead, he advised considering the larger data set of all planes, especially the ones that did not return. “The armor doesn’t go where the bullet holes are. It goes where the bullet holes aren’t,” he explained. “On the engines.”
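Wald’s logic can be made concrete with a toy simulation (entirely illustrative: the sections and loss probabilities below are invented, not historical data). If hits land evenly across a plane but engine hits are far more likely to bring it down, the planes that make it home will show few engine holes, precisely the bias Wald saw through.

```python
import random

random.seed(0)

# Illustrative assumptions, not historical data: hits are spread evenly
# across sections, but only engine hits are likely to down the plane.
SECTIONS = ["wings", "fuselage", "tail", "engines"]
LOSS_PROB = {"wings": 0.05, "fuselage": 0.05, "tail": 0.05, "engines": 0.8}

survivor_hits = {s: 0 for s in SECTIONS}
for _ in range(10_000):
    hit = random.choice(SECTIONS)          # damage is uniform across sections...
    if random.random() > LOSS_PROB[hit]:   # ...but engine hits rarely make it home
        survivor_hits[hit] += 1

# Returning planes show far fewer engine holes -- exactly where armor is needed.
print(survivor_hits)
```

Counting only the survivors makes the engines look like the safest part of the plane; counting all planes reveals the opposite.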

High stakes make aviation an excellent subject for the study of failure. In other fields, errors may be subtle, and their consequences may go unrecognized for years. When aviation fails, planes fall out of the sky, and footage of the wreckage is on the evening news.

Matthew Syed points this out in “Black Box Thinking: Why Most People Never Learn From Their Mistakes (But Some Do).” Aviation is an open, data-rich system, with statistics going back a century: In 1912, the U.S. Army had 14 pilots, and even before the war, more than half (eight) would die in crashes.2 The Army (this was before the Air Force was established) set up an aviation school to teach pilots how to fly more safely. Unfortunately, the school had a 25 percent mortality rate.

Fast-forward a century. Syed observed that in 2013, there were 36.4 million commercial flights worldwide carrying 3 billion passengers. That year, there were only 210 commercial aviation fatalities. For some context, that works out to 0.41 accidents per 1 million flights, or an average of one accident for every 2.4 million flights. Last year (2017), zero commercial airline passengers died. That is an astounding improvement over the course of a century.
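Those two rates are the same statistic expressed two ways; a quick check of the conversion (the 0.41 figure is from the text, the rest is arithmetic):

```python
# 0.41 accidents per 1 million flights (Syed's 2013 figure)
accidents_per_million = 0.41

# Invert the rate to get flights per accident
flights_per_accident = 1_000_000 / accidents_per_million

# Express in millions of flights, rounded to one decimal place
print(round(flights_per_accident / 1_000_000, 1))  # prints 2.4
```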

How did the industry achieve this? By being self-critical and learning from accidents. Every crash and near miss gets studied extensively. The Federal Aviation Administration requires all large commercial aircraft to have a cockpit voice recorder and a flight data recorder, creating a comprehensive and objective data set that allows for the full study of failure. Even the famed black boxes themselves are subject to exhaustive review and improvement. Today, these boxes are orange — making them much easier to spot in difficult terrain or underwater — and have submersible locator beacons to aid in their detection and retrieval from the ocean. It’s the perfect metaphor for how self-critical the industry is about safety.

Compare this with a closed system, like health care and hospitals. That industry has a very different approach, with vastly inferior results.

How different? Syed notes the remarkable contrast between air travel and preventable medical errors,3 which might result in as many as a half-million deaths in the U.S. at a cost estimated at $17 billion a year. After heart disease and cancer, medical errors are the No. 3 cause of death in America. Peter Pronovost, a clinician at Johns Hopkins Medical School, wondered how we would respond if each day two 747 jumbo jets fell out of the sky, killing roughly 900 people. That’s how many people die daily from medical errors.

Why is health care so different from aviation? First, there is little publicly available data and no standardized review process when errors occur. Whatever self-examination takes place is private, sealed and not available for public scrutiny. There is an attitude among some that doctors are infallible saviors, which creates a reluctance to admit error. Insurance costs, litigation and protecting reputations reduce the desire for a public accounting. In short, health care is everything that aviation is not.

Finance straddles the two approaches. There obviously is a great deal of data, but it isn’t the most open of systems. Securities and Exchange Commission rules mandate disclosures by mutual funds, but require much less from hedge funds, venture capital, private equity, brokers and registered investment advisers.

When Bear Stearns Cos.4 collapsed in March 2008, it wasn’t merely a harbinger of the coming financial crisis; it was a reminder that no company is immune from existential failure. Public companies are reluctant, and often strongly resist, attempts to document and openly assess their failures. Perhaps they are not the ideal model to look to when thinking about failure.5

Silicon Valley, technology and the venture-capital business model do a better job. Entrepreneurs and venture funders alike wear their failures like a badge of honor. Many venture capitalists even post their biggest misses on their websites. They recognize their model is to make a lot of losing bets in pursuit of finding the next big winner. Equity investors don’t have quite the same model, but they would benefit from a similar approach to recognizing their own limitations.

The stigma that surrounds failure needs to go. The surest way to avoid future failure is to embrace and learn from past failures.


1. It was described as “the Manhattan Project for equations.” The work was deemed so important to the war effort that much of what the group developed remained classified for decades after.

2. Even before there was a commercial aviation industry, the military kept records on accidents.

3. There are several studies of preventable, fatal medical errors, and the results vary widely. The studies include:
- The Institute of Medicine, which estimated the range at 44,000 to 99,000 a year;
- The Harvard study, which places the figure at more than 120,000; and
- The Journal of Patient Safety, which placed the figure at 400,000.
- Other deaths from improper care at nursing homes, pharmacies, outpatient clinics and private offices could bring the figure to more than 500,000 a year.

4. My research on the impact of the collapse of Bear Stearns eventually led me to write the book “Bailout Nation, with New Post-Crisis Update: How Greed and Easy Money Corrupted Wall Street and Shook the World Economy.”

5. Jeff Bezos, founder and chief executive officer of Amazon.com Inc., is one of the few executives who discusses failure openly. “To invent you have to experiment, and if you know in advance that it’s going to work, it’s not an experiment. … Invention requires a long-term willingness to be misunderstood. Companies that don’t embrace failure and continue to experiment eventually get in the desperate position where the only thing they can do is make a Hail Mary bet at the end of their corporate existence.”


