The leaders of Facebook, the world's largest social network, which reaches 1.18 billion people every day, want to absolve the company of any responsibility for the proliferation of misinformation on its platform. As the dust settles after a polarizing presidential election, Facebook is defending itself against claims of impropriety and unchecked influence on American politics.
The social giant is taking hits from all sides of the political spectrum, and CEO Mark Zuckerberg is walking a fine line as he tries to placate Facebook's critics. In May, he met with a group of conservative leaders to dismiss reports that Facebook's team of curators in charge of its Trending Topics had deliberately suppressed stories from right-leaning news outlets. Then, in a blog post published four days after the election, Zuckerberg defended the social network as a neutral party that doesn't bear the same responsibilities as a news source and said Facebook should be "extremely cautious about becoming arbiters of truth ourselves."
Facebook advocates influence, shirks responsibility
Politics aside, the contradictions in Zuckerberg's statements about the social network's influence and its impact on users could become a glaring problem. If the content Facebook distributes to more than 1.79 billion people every month, misinformation included, can't sway the outcome of an election, just how effective are the $6.8 billion worth of ads it sold during the third quarter of 2016?
Facebook proudly claims its highly targeted advertising can influence purchase decisions, and it accepts some credit for giving a voice to disenfranchised people who have organized uprisings in the Middle East and elsewhere around the world. Back home in the United States, however, Facebook tells a more convenient story to protect the most important aspect of its success. There's no denying Facebook is the world's largest media platform, yet the company is unwilling to deem itself a media company and accept all the responsibilities that come with that distinction.
Zuckerberg publicly addressed Facebook's latest storm of criticism at a conference last week, but he dug in deeper in the blog post published Saturday. "Many people are asking whether fake news contributed to the [election] result, and what our responsibility is to prevent fake news from spreading," he wrote. "Of all the content on Facebook, more than 99 percent of what people see is authentic. Only a very small amount is fake news and hoaxes. The hoaxes that do exist are not limited to one partisan view, or even to politics. Overall, this makes it extremely unlikely hoaxes changed the outcome of this election in one direction or the other."
Facebook introduced new controls to flag fake news and admits there is more it can do to this end, but "identifying the truth is complicated," Zuckerberg wrote. "While some hoaxes can be completely debunked, a greater amount of content, including from mainstream sources, often gets the basic idea right but some details wrong or omitted."
Facebook can't have it both ways
Facebook's reluctance to identify itself as a media company has hardened over the past few years, even as its influence has grown. Yet Zuckerberg sang a different tune as recently as 2013, when the company introduced a News Feed redesign meant to reposition Facebook as a legitimate news source. "What we're trying to do is give everyone in the world the best newspaper we can," he said at Facebook headquarters, according to a ClickZ.com report. "We think there's a really important place for a personalized newspaper like this."
One of Facebook's greatest strengths is its ability to connect people and surface relevant content through its proprietary algorithm. Zuckerberg and his team must now defend the veracity of the content the company distributes while downplaying its influence on culture and politics.
Ultimately, Facebook is not responsible for the outcome of a presidential election — that duty goes to the electorate — but it's a stretch to suggest the company has no significant influence on its users' opinions.