Facebook is enmeshed in another controversy, this time over accusations that the firm Cambridge Analytica abused Facebook data to help Donald Trump win the 2016 US presidential election. But this is a big deal because of a larger, more fundamental problem: Facebook is bad.
Lots of companies, to be clear, are built around products that are bad. Indeed, being bad is by no means an impediment to success in a capitalist economy. Cigarette companies, for years, made enormous profits from selling a highly addictive, highly carcinogenic substance to millions of Americans. Even in their current somewhat fallen state, tobacco companies continue to be viable ongoing enterprises.
Alcoholic beverages are enjoyed in moderation by many, but the real profit in the industry lies with the minority of serious alcohol abusers who account for the lion’s share of consumption — often with deadly consequences. Casino gambling features a similar, albeit less directly deadly, addiction-based business model.
None of which necessarily implies any specific public policy approach — legal prohibition of alcohol rather famously caused a lot of problems. But I do think it’s true that executives of companies that make money by hurting their customers should feel kind of bad about themselves. Or at least not good.
And therein lies the problem for Facebook. Not only is the product bad, but the company is in a deep state of denial about it. Mark Zuckerberg and other top leaders believe they are making the world a better place. The labor market for the kind of talented engineers that Facebook needs to hire is robust enough that you can’t compete on the basis of money alone — they need to believe that Facebook is a decent, honorable place to work. But in fact, Facebook is bad. And it probably can’t be fixed.
The good news is that the executives have already made a lot of money and the workers have valuable, in-demand job skills. You could shut the whole thing down tomorrow and everyone would be fine.
Move fast and break human society
The association between Facebook and fake news is by now well-known, but the stark facts are worth repeating — according to Craig Silverman’s path-breaking analysis for BuzzFeed, the 20 highest-performing fake news stories of the closing days of the 2016 campaign did better on Facebook than the 20 highest-performing real ones.
Rumors, misinformation, and bad reporting can and do exist in any medium. But Facebook created a medium that is optimized for fakeness, not as an algorithmic quirk but due to the core conception of the platform. By turning news consumption and news discovery into a performative social process, Facebook turns itself into a confirmation bias machine — a machine that can best be fed through deliberate engineering.
In reputable newsrooms, that’s engineering that focuses on graphic selection, headlines, and story angles while maintaining a commitment to accuracy and basic integrity. But relaxing the constraint that the story has to be accurate is a big leg up — it lets you generate stories that are well-designed to be psychologically pleasing, like telling Trump-friendly white Catholics that the pope endorsed their man, while also guaranteeing that your outlet gets a scoop.
The sophisticates’ defense of Facebook is to question whether having half the country marinate in a cesspool of misinformation for an hour or two a day really swung any votes. And I suppose the answer may well be no.
But it certainly doesn’t help. And if you look at a society where Facebook plays a larger role in the information ecology, like Myanmar, you see a clear disaster emerging: United Nations human rights investigators say Facebook has been a key dissemination channel for hate speech and propaganda driving an ethnic cleansing campaign that has displaced more than 600,000 Rohingya people to Bangladesh and killed thousands.
“Connecting the world isn’t always going to be a good thing,” Facebook’s newsfeed chief Adam Mosseri told Slate’s April Glaser and Will Oremus on their podcast, acknowledging the disastrous reality. “We lose some sleep over this.”
I also lose sleep over a work screw-up sometimes, but I’m confident that I’ve never accidentally contributed to unleashing a genocide. But more to the point, while Facebook is now, thankfully, taking some steps to address the worst outlier behavior taking place on its platform in Myanmar, the core problem is that even non-extreme cases of heavy Facebook use seem harmful.
Destroying journalism’s business model is bad
Meanwhile, Facebook is destroying the business model for outlets that make real news.
Facebook critics in the press are often accused of special pleading, of hatred of a company whose growing share of the digital advertising pie is a threat to our business model. This is, on some level, correct.
The answer to the objection, however, is that special pleaders on behalf of journalism are correct on the merits. Not all businesses are created equal. Cigarette companies poison their customers; journalism companies inform them.
And traditionally, American society has recognized that reality and tried to create a viable media ecosystem. The US Postal Service has long maintained a special discount rate for periodicals to facilitate the dissemination of journalism and the viability of journalism business models. Until last fall, the Federal Communications Commission maintained rules requiring licensed local broadcast stations to maintain local news studios.
That Facebook’s relentless growth threatens the existence of news organizations is something that should make the architects of that relentless growth feel bad about themselves. They are helping to erode public officials’ accountability, foster public ignorance, and degrade the quality of American democracy.
Google, of course, poses similar threats to the journalism ecosystem through its own digital advertising industry. But Googlers can also make a strong case that Google makes valuable contributions to the information climate. I learn useful, real information via Google every day. And while web search is far from a perfect technology, Google really does usually surface accurate, reliable information on the topics you search for. Facebook’s imperative to maximize engagement, by contrast, lands it in an endless cycle of sensationalism and nonsense.
Facebook makes people depressed and lonely
A large and growing body of research confirms what probably ought to be obvious: Spending a lot of time alone, disengaged from other human beings, staring at your phone, and clicking on little buttons on a platform obsessively engineered by some of the smartest people on the planet to keep you staring and clicking is not good for you.
Holly Shakya and Nicholas Christakis conducted one of the best studies on this, partnering with Gallup to use a sample of thousands of people across three waves and looking at self-reported physical health, self-reported mental health, self-reported life satisfaction, and body mass index.
They find that “overall, the use of Facebook was negatively associated with well-being,” whereas networking socially in the real world was positively associated with well-being and “the negative associations of Facebook use were comparable to or greater in magnitude than the positive impact of offline interactions.”
A smaller study showed that when people spend time comparing their real lives to the idealized versions of themselves that others present on Facebook, it leads to depression.
A separate study showed that Facebook use — but not general internet browsing — leads to negative mood driven by “a feeling of having wasted time.” The study also finds that users make a systematic “forecasting error” and predict that logging on will improve their mood when, more often than not, it does the reverse.
By December of 2017, even Facebook’s in-house research team was admitting that using Facebook the way Facebook is generally used in reality is harmful to users’ mental health and well-being.
The Facebook internal team’s fig leaf rationalization was to point out that using Facebook to have meaningful interactions with close friends and family makes people happier. It’s of course true that such meaningful interactions are valuable, and also true that Facebook contains some functionality that facilitates them.
But lots of technology companies offer messaging services — Facebook’s unique value proposition is its ability to “connect the world” and push you into endless cycles of interacting with strangers, quasi-strangers, and brands.
They should turn off Facebook
The latest Facebook scandal is creating a new wave of people performatively deleting their Facebook accounts, and that’s fine. But fundamentally, thanks to network effects, it is hard to quit Facebook.
I need to use Facebook to promote my work. In an ideal world, I would have no activity on the platform other than self-promotion via my Facebook brand page, but in order to do that, I have to have a Facebook account.
Since the account is there and since many other people use Facebook, that means I sometimes get messages on Facebook. And since I don’t want to systematically ignore people who are trying to get in touch with me, that makes me get sucked into use. And because almost everyone is on Facebook (even me!), people often send invitations to social engagements via Facebook, and to try to opt out is to make yourself a difficult person.
Besides which, when you do dip into Facebook, it’s a genuinely engaging, compelling product — some of the brightest, hardest-working people in the world have toiled for years to keep you ensnared.
For a better path forward, it’s worth looking at the actual life of Facebook founder Mark Zuckerberg.
He likes to do annual personal challenges, and they are normally sensible. One year, he set about to learn Mandarin. Another year, he challenged himself to run 365 miles. He visited all 50 states and met and spoke face to face with people in each state he visited. He committed to reading a book cover to cover every two weeks.
This year, his challenge is to try to fix Facebook. But he ought, instead, to think harder about those other challenges and what they say about what he finds valuable in life — sustained engagement with difficult topics and ideas, physical exercise, face-to-face interaction with human beings, travel. This suggests a healthy, commonsense value system that happens to be fundamentally at odds with the Facebook business model.
To simply walk away from it, shut it down, salt the earth, and move on to doing something entirely new would be an impossibly difficult decision for almost anyone. Nobody walks away from the kind of wealth and power that Facebook has let Zuckerberg accumulate. But he’s spoken frequently about his desire to wield that wealth and power for good. And while there are a lot of philanthropists out there who could donate to charities, there’s only one person who can truly “fix” Facebook by doing away with it.
Vox · by Matthew Yglesias · March 21, 2018