Facebook’s fake-news problem is wildly out of control.
BuzzFeed is reporting that in the final three months of the US presidential election campaign, the top malicious fake news stories actually outperformed legitimate news stories shared by some of the most popular media companies.
According to data from a Facebook-monitoring tool cited by BuzzFeed, the top 20 fake news stories collectively got more engagements – shares, likes, and comments – than the top 20 factually accurate stories shared by mainstream news outlets.
In first place was the fake story “Pope Francis Shocks World, Endorses Donald Trump for President, Releases Statement,” which racked up a staggering 960,000 engagements after it was shared by Ending The Fed. The second-most-popular post cited by BuzzFeed was a genuine story shared by The Washington Post: “Trump’s History of Corruption is Mind-Boggling. So why is Clinton Supposedly the Corrupt One?”
Unscrupulous hoaxers are using Facebook to spread bald-faced lies – and there’s a growing body of evidence to suggest that they may have had an impact on people’s voting decisions in a historic presidential election.
At the same time, Facebook is burying its head in the sand.
Mark Zuckerberg's claims about fake news are contradictory
CEO Mark Zuckerberg has been extremely dismissive of the idea that fake news played a part in people's voting decisions and the election of Republican Donald Trump as president. "Personally, I think the idea that fake news on Facebook - it's a very small amount of the content - influenced the election in any way is a pretty crazy idea," he said at a conference after the November 8 election.
Is it really so crazy? Bear in mind a few things:
- A secret study by the social network found it was able to influence the moods of users by showing them more or less positive posts in their News Feed, in what it referred to as "emotional contagion." In short: Facebook's own research suggests the type of content it shows its users can influence them.
- Facebook has a sophisticated program trying to persuade political organisations to buy advertisements on the site. Is it really so implausible that user opinions could be swayed by ads but not by extremely popular and widely shared false news stories? "There's an entire political team and a massive office in DC that tries to convince political advertisers that Facebook can convince users to vote one way or the other," former Facebook employee Antonio Garcia-Martinez told NPR. "Then Zuck gets up and says, 'Oh, by the way, Facebook content couldn't possibly influence the election.' It's contradictory on the face of it."
- Zuckerberg himself has pointed out – in a defence of Facebook against the false-news criticisms, no less – that the site "helped more than 2 million people register to vote, and based on our estimates we got a similar number of people to vote who might have stayed home otherwise." So yes, Facebook can influence voting behaviour.
- Facebook has previously been misleading on issues inconvenient to it. In 2015, the social network told Recode the news stories in its Trending section were all chosen algorithmically – but leaked documents and guidelines subsequently showed that human curators could "inject" stories into the section.
'It's as if tobacco companies controlled access to all medical and hospital records'
In a subsequent Facebook post, Zuckerberg elaborated on his argument. He wrote (emphasis ours):
"Of all the content on Facebook, more than 99% of what people see is authentic. Only a very small amount is fake news and hoaxes. The hoaxes that do exist are not limited to one partisan view, or even to politics. Overall, this makes it extremely unlikely hoaxes changed the outcome of this election in one direction or the other.
"That said, we don't want any hoaxes on Facebook. Our goal is to show people the content they will find most meaningful, and people want accurate news. We have already launched work enabling our community to flag hoaxes and fake news, and there is more we can do here. We have made progress, and we will continue to work on this to improve further."
In short: Sure, fake news exists, but there's so little of it that it's doubtful it had any meaningful impact.
For starters, there's a major problem with this argument – no outside researcher can verify Facebook's data. As Zeynep Tufekci wrote in The New York Times: "Unfortunately, Facebook exercises complete control over access to this data by independent researchers. It's as if tobacco companies controlled access to all medical and hospital records."
BuzzFeed's report helps quantify the sheer scale of the most viral fake news in circulation. The top 20 posts it identified from the Facebook pages of fake-news sites racked up a cumulative 8,711,000 engagements, beating the top 20 posts from a group of mainstream news outlets, which got 7,367,000 engagements.
Combined with what we know of Facebook's ability to influence its users, the idea that fake news affected the US presidential election suddenly seems a lot more plausible.
The uptick in fake news engagement matches when Facebook put "friends first" over traditional publishers June 30th https://t.co/O5eHNblHdG pic.twitter.com/fZ43a4ZO8r
— Josh Constine (@JoshConstine) November 17, 2016
Identifying fake news isn't always easy - but that doesn't mean you should do nothing
Hoaxes aren't a modern phenomenon. "A lie can travel halfway around the world while the truth is still putting on its shoes," goes a quote attributed (ironically, falsely) to Mark Twain. But Facebook is letting people monetise and weaponise falsehoods on an unprecedented scale.
Millennials now rely on Facebook far more than any other source for their political news, and the social network has a civic responsibility to ensure that people aren't being duped by blatant falsehoods spread by hucksters.
Of course, identifying fake news isn't always easy. What about factually incorrect stories published by legitimate news sites? Or vague satire? What about caustic opinion pieces? Or cynical lies peddled in the real world by politicians and talking heads?
Many of these would be extremely difficult for Facebook to police - and arguably, we shouldn't want such a company as the ultimate arbiter of truth in our public discourse.
But even erring on the side of caution, there's plenty that Facebook can do. Another BuzzFeed investigation found that teenagers in Macedonia are responsible for more than 100 pro-Trump websites peddling patently fictional news stories to cash in on advertising dollars. Aggressively penalising such demonstrably fake sites in News Feed rankings would erode the economic incentive to peddle lies and help prevent users from being misled.
Even if only, say, 20% of the dubious content being shared on Facebook could be identified with certainty as fake and acted on, that would still make a huge difference across an audience of nearly 2 billion people.
Because the alternative - standing by while people turn a profit by selling falsehoods - risks poisoning public discourse and democracy.