In the wake of the US election, critics have blamed Facebook for bringing about—at least in part—Trump’s surprise win. A BuzzFeed report showed that Facebook users interacted far more with “fake news” stories about both candidates than they did with mainstream news outlets before the election. This wouldn’t seem like such a big deal if it weren’t for a Pew Research Center survey showing that 44% of Americans rely on Facebook to get their news.
But proving whether fake news influenced the election more than the usual political propaganda is impossible. What’s certain is that fake news on Facebook is a symptom of a larger problem: the company is trying to play contradictory roles as both trustworthy news publisher and fun social media platform for personal sharing. The problem is that it cannot be both—at least not without making some changes.
Facebook shapes people’s perceptions
When you log into your Facebook account, your default page is dominated by a cascading “news feed,” automatically selected for your pleasure, which consists of whatever your friends have shared. The company uses a mix of secret-sauce algorithms to choose which pieces of news you see. Some items are displayed based on what you’ve responded to before. For example, if you always like or reply to news from Trevor but ignore news from Mike, you’re going to see more Trevor and less Mike.
Other news that Facebook thinks you probably want to see is fed to you based on your profile. The Wall Street Journal has an incredible infographic on this, showing how Democrats on Facebook see liberal-leaning stories and Republicans see conservative-leaning ones.
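The engagement-based selection described above can be sketched as a toy ranking function. To be clear, this is a hypothetical illustration, not Facebook's actual system, which is proprietary and far more complex; all names and data here are invented:

```python
# Toy sketch of engagement-based feed ranking (hypothetical, for illustration):
# posts from friends the viewer has engaged with more often are ranked higher.
from collections import Counter

def rank_feed(posts, engagement_history):
    """posts: list of (author, story) pairs.
    engagement_history: authors whose posts the viewer liked or replied to."""
    counts = Counter(engagement_history)
    # Sort so authors the viewer engages with most appear first.
    return sorted(posts, key=lambda p: counts[p[0]], reverse=True)

feed = [("Mike", "local story"), ("Trevor", "election story"), ("Mike", "meme")]
history = ["Trevor", "Trevor", "Mike"]  # viewer engaged with Trevor twice, Mike once

print(rank_feed(feed, history))
# Trevor's post ranks above both of Mike's
```

In this sketch, "more Trevor and less Mike" falls straight out of the sort: the more you interact with someone, the higher their posts land in your feed.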
Mark Zuckerberg has protested since the election that it’s preposterous to believe that Facebook could change anyone’s mind based on their news feed—yet the company behaves as if it can. Given that Facebook’s main goal is to serve you ads and get you to buy things, its number-one priority is keeping you glued to your feed. If you see a bunch of things you hate in your feed, you’re going to stop looking at it and take your clicks elsewhere. Common sense dictates that Facebook should avoid showing you news that will upset you or make you angry.
But Facebook’s decision goes beyond common sense. It’s based on real data the company gathered in a 2012 experiment in which algorithms fed users positive and negative stories to see whether they would affect people’s moods. Sure enough, the people fed positive news responded with more indicators of happiness. To the extent that people at Facebook believe their own data analysis, they know that the news feed affects people’s emotions and shapes their perceptions of the world. Their business depends on it.
Though there’s something ineffably creepy about Facebook manipulating our emotions, the site is no different from any other ad-driven business. Television series toy with your feelings to keep you coming back for more, even after you’ve seen the same stupid ad on Hulu nine times. Hollywood pumps out sequels to get you to pay $16 for a ticket. Facebook’s big innovation was the discovery that it could sell ads against people’s friendship networks. We consume each other’s posts on Facebook the same way we consume new episodes of Mr. Robot, and with the same result: our eyeballs translate into ratings, which translate into ad dollars.
The trouble with “news feed”
The problem is that Facebook decided to go beyond friendship networks. For several years now, the company has courted professional news outlets, ranging from the New York Times and BuzzFeed to Breitbart and Fox News, promising them prominent placement in users’ feeds. This courting intensified with the launch of Facebook Instant Articles in 2015, a service that allows media companies to publish stories directly on Facebook and share ad revenues.
But then, earlier this year, Facebook announced that it would no longer give special weight to news coming from pages that belong to media outlets. From that point on, it would weight posts from “friends and family” more heavily than those from news outlets. Your news feed would no longer distinguish between stories your friends passed along and stories published directly by news organizations.
This rule had one exception. To the right of the news feed on your default Facebook page is the “trending” module. This module surfaces news that Facebook readers are discussing, so it’s never sourced to friends and family. For a while, Facebook had a team of human editors who curated trending news from a long list of reputable sources. But after several disgruntled ex-employees accused the news team of censoring right-wing stories, Facebook fired them and set up an algorithm to take their place.
With no human checks on it, that algorithm immediately started posting fake news.
Facebook’s algorithms are great at keeping people glued to their screens, but they are terrible at distinguishing real news from fake. Yet the most prominent feature of Facebook is called a “news feed.” Given the company’s almost hilarious inability to identify news, this feature clearly has a misleading name.
“But what is news?” you might ask. “Are you talking about liberal media?” No. I am talking about stories published by professional media organizations—organizations that take legal and ethical responsibility for what they publish. They pledge to publish the truth. They may define truth differently, depending on whether they’re Mother Jones or the National Review. Nevertheless, they come up with a definition and attempt to stick to it.
Plus, if those publications post articles that are libelous, infringing, obscene, or otherwise unlawful, they can be held liable in a court of law for what they’ve published. In short, news comes from organizations that stand by what they post as the truth. That’s why the public can trust news from a professional media organization more than that rant Uncle Tommy posted about chemtrails on Facebook. Unfortunately, however, Uncle Tommy’s rant is classified as part of the same “news feed” that contains headlines from your most trusted professional media sources.