How to Fix Facebook


Facebook Inc. has a fake news problem. Its founder and Chief Executive Officer Mark Zuckerberg doesn’t seem to understand that.

In a revealing and embarrassing interview with Recode, Zuckerberg likened being a Holocaust denier to merely being wrong. His comment is worth citing:

“I’m Jewish, and there’s a set of people who deny that the Holocaust happened. I find that deeply offensive. But at the end of the day, I don’t believe that our platform should take that down because I think there are things that different people get wrong. I don’t think that they’re intentionally getting it wrong, but I think it’s hard to impugn intent and to understand the intent. I just think, as abhorrent as some of those examples are, I think the reality is also that I get things wrong when I speak publicly.”

This, in a nutshell, reflects the degradation of Facebook, which has been polluted with so much nonsense for so long that its utility for users is now potentially in permanent decline.

Let’s come back to all that after some context: Facebook and Twitter Inc. suffered large setbacks this week after reporting disappointing quarterly earnings. Twitter declined, but the scale of Facebook’s share plunge was one for the record books: on Thursday, it lost $119 billion in market value, the biggest one-day evaporation of corporate equity wealth ever. (Morgan Housel noted that Mark Zuckerberg is down to his last $70 billion.) If he were smart, he would be diversifying away from his over-concentration in that one stock.

My colleague Shira Ovide looked at four theories as to why Facebook has suffered this setback, and they are worth reading. Another explanation is that “years of privacy missteps” are finally catching up with the company. I maintain, however, that the company’s core demographic doesn’t really care much about privacy concerns.1

What ails the company (and Twitter to a lesser degree) is something bigger. The site is, and always has been, unable or unwilling to manage the scammers, spammers and fakers. Worse yet, it is now overrun with a new class of users — or rather, misusers: gaslighters, trolls and haters, who foul the site with fake news, false memes, conspiracy theories and misleading junk.

This isn’t a privacy concern; it is a trust issue. People have figured out that what is on Facebook is too often untrustworthy, whether it’s a false story about pedophile kidnappers operating in a Washington pizza parlor, or claims that Robert Mueller, the special counsel probing Russian meddling in the U.S. presidential election, is a child rapist.2

But the bottom line is this: once you welcome Holocaust deniers onto your property because, hey, it’s just their opinion, you have crossed a certain line. Other people — moral, rational, intelligent human beings who prefer reality to the fake ideological insanity these folks live in — begin to say good-bye, in increasingly large numbers. People like Alex Jones, who says Sandy Hook never happened, the birthers, the 9/11 truthers — that’s a dinner party I won’t be attending.

Facebook has said it is adopting corrective measures, but it may turn out to be too little too late.

As Scott Galloway,3 a digital marketing professor at New York University’s Stern School of Business, has noted repeatedly, Facebook has tried to hide behind the claim of being nothing more than a platform. That excuse no longer seems to resonate with its users.

The problem is a fundamental misunderstanding on Zuckerberg’s part of the obligations of publishers and the First Amendment.

I have thought long and hard about this issue. During the height of the financial crisis, my own site was inundated with people who purposely tried to confuse the public about the truth. Article comments sections were once a place where ideas were robustly debated; that ended when trolls figured out they could free-ride on other people’s labors to gain a broad platform for their false and misleading ideas. This is why I have closed comments on all of my columns, as have any number of others. The debate has since moved to social media.

This isn’t a free speech issue; there have never been more ways to write and say whatever you want. Instead, it is a private-property issue. Private non-governmental companies are free to create rules to best manage their own properties. It’s as if you get drunk and insult all of the guests at my cocktail party; I have no obligation to invite you back.

As the public has figured out that Facebook’s news feed has way too much garbage in it, they have moved on. Some have shifted to other, more narrowly focused Facebook properties, such as Instagram, Messenger or WhatsApp, or to Twitter. But the core property is at risk of being abandoned by large swaths of users. The sooner the company figures out how to stem this tide, the better.

I propose a simple three-step test for determining what should not be circulated on the site as news:

  1. Is the item demonstrably false?
  2. Does it target a specific group for discrimination, harassment or abuse?
  3. Does the dissemination of the false claim hurt this group of people?

If the answer to all three of those questions is yes, then the post is deleted. If such nonsense amounts to a substantial enough chunk of a user’s content, the user gets banned. They are free to find another place to post their false, abusive bullshit.
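The three-question test above is simple enough to express as code. Purely as an illustration — the class, function names, and the 50 percent ban threshold are my own hypothetical choices, not anything Facebook has implemented — it might look like this:

```python
from dataclasses import dataclass


@dataclass
class Post:
    """Hypothetical representation of a flagged news item."""
    demonstrably_false: bool  # 1. Is the item demonstrably false?
    targets_group: bool       # 2. Does it target a specific group for
                              #    discrimination, harassment or abuse?
    causes_harm: bool         # 3. Does dissemination hurt that group?


def should_delete(post: Post) -> bool:
    # Delete only if the answer to ALL three questions is yes.
    return post.demonstrably_false and post.targets_group and post.causes_harm


def should_ban(deleted_posts: int, total_posts: int,
               threshold: float = 0.5) -> bool:
    # Ban a user whose deleted junk makes up a "substantial enough chunk"
    # of their content. The 0.5 threshold is an arbitrary placeholder.
    return total_posts > 0 and deleted_posts / total_posts >= threshold
```

The key design point is the conjunction: a post that is merely false, or merely offensive, survives; only the combination of falsity, targeting, and harm triggers removal.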

Facebook can start with that as the basic operating premise and refine it from there. This won’t solve all of its woes, but it may stanch the bleeding. But Zuckerberg should realize there is a tipping point: once a certain percentage of the crowd goes, you no longer have a Facebook problem — you have a MySpace problem.

The people who post false, hateful things have a fundamental flaw in their model of the world. There is no obligation on anyone’s part to indulge that.

Twitter and Google both seem to be figuring this out somewhat faster than Facebook, purging fake accounts and suspending abusive users. It is a solution that Facebook seems loath to fully embrace. But not making a decision is itself a decision.

As Galloway told me in an email exchange: “Facebook is beginning to feel like a tobacco company — lying, a disregard for people’s well-being, and investors are tiring of the stench.” Facebook was built on the loyalty of committed users. Unless the company changes its ways — and sooner rather than later — those users will dismantle it.




1. I am clearly not their target demographic. My lack of enthusiasm stemmed in large part from Facebook’s lackadaisical approach to its users’ privacy. Regardless, I was never that interested in sharing that much of my private life. Twitter, over which I can exercise more control, seems to work better for me.

2. For the record, I am unconvinced that Facebook swung the election. I find the site to be a giant exercise in confirmation bias. However, I am willing to be convinced that perhaps, on the margin, just enough votes switched. Just show me the data.

3. Hear our prior Masters in Business conversations here, here, here and here.
