The Trouble with Facebook

Facebook Didn’t Tilt the Election
The social network’s news feed spreads fake news. But that changed few, if any, minds.
Bloomberg, November 16, 2016

Mark Zuckerberg, co-founder and chief executive officer of Facebook Inc., finds himself in some hot water. His company is being blamed (or credited, depending on your point of view) for Donald Trump’s election because its algorithms facilitated the circulation of misleading or false news stories.

This controversy is misdirected: it should be less about Facebook’s algorithms and more about human cognition. Here is yet another lesson for investors. This will be our third and likely final extrapolation from the presidential election; earlier discussions looked at hindsight bias and fabricated narratives.

Today’s topic is confirmation bias. This is the tendency for people to seek out news, information and opinion that reinforces their existing beliefs. We pay more attention to, interpret more favorably and more readily remember the things with which we agree. The opposite is also true: We tend not to notice, interpret unfavorably and more easily forget whatever is at odds with what we already think. Selective attention, perception and retention are part of the broader confirmation bias that afflicts almost everyone. It has been called the “compulsive yes-man” in your head who “echoes whatever you want to believe.”

For investors, this is an insidious problem. Traders hang on far too long to stocks that have run into trouble. Instead of cutting their losses, they seek out analyst research, corporate interviews and news stories that help them rationalize holding the position. The setback is only temporary, they tell themselves, and finding things that support that view provides emotional comfort.

The flip side is true as well. Bearish investors have ignored improvements in the economy and corporate profits this cycle, despite the 250 percent rally in the Standard & Poor’s 500 Index.

Avoiding confirmation bias (including selective perception and retention) is not easy to do, but here are a few tricks that might help:

  • Force yourself to seek out and read opposing points of view. You may find this to be an incredibly frustrating exercise — everyone who disagrees with you is, obviously, an idiot — but ultimately, it will make you a sharper thinker.
  • Try a version of the moot-court exercise used in law school, which forces you to make a cogent and coherent argument for the opposing side’s case. If you are long a position, you should be able to make the case for the short side, and vice versa.
  • Make an objective list of each week’s positive and negative events, and write them down. This is something I do; it’s a practice I shamelessly stole from Peter Boockvar of Bookmark Advisors, who came up with the idea many years ago.

I am less concerned than some folks are about Facebook’s ability to shape the American political landscape. For example, Slate’s Will Oremus wrote that “Facebook’s news feed algorithm is bound to spread lies, especially those that serve to bolster people’s preconceived biases. And these falsehoods are bound to influence people’s thinking.” I think he’s right about the first half, but I wonder how much this truly affected anyone’s thinking. Saying that Facebook’s news feed reinforces existing views is surely correct; saying that it changed the outcome of the election is quite another claim.

I am skeptical that social-media posts changed many votes; people aggressively avoid ideas that challenge their basic philosophy and opinions. Social media has become an echo chamber, an exercise in preaching to the choir and other members of your own tribe. I’ll end with one request: If anyone switched their vote because of a Facebook post or a 140-character squib on Twitter, please let me know. I’ll be waiting.

 

Originally: No, Facebook Didn’t Tilt the Election  
