The comments from the world’s biggest social network were its latest response to intense criticism for failing to stop the spread of misinformation among its two billion users – most strikingly leading up to the 2016 US election.
In a blog post, Facebook civic engagement chief Samidh Chakrabarti said that he was “not blind to the damage that the internet can do to even a well-functioning democracy.”
“In 2016, we at Facebook were far too slow to recognise how bad actors were abusing our platform,” he said. “We are working diligently to neutralise these risks now.”
The post – one in a series dubbed “hard questions” – was part of a high-profile push by Facebook to reboot its image, including with the announcement last week that it would let users “rank” the trustworthiness of news sources to help stem the flow of false news.
“We are as determined as ever to fight the negative influences and ensure that our platform is unquestionably a source for democratic good,” Katie Harbath, Facebook’s head of global politics and government outreach, said in an accompanying statement.
Facebook, along with Google and Twitter, faces global scrutiny for facilitating the spread of bogus news – some of it directed by Russia – ahead of the US election, the Brexit vote and other electoral battles.
The social network has concluded that Russian actors created 80,000 posts that reached around 126 million people in the United States over a two-year period. “It is abhorrent to us that a nation-state used our platform to wage a cyber war intended to divide society,” Chakrabarti said. “This was a new kind of threat that we could not easily predict, but we should have done better. Now we are making up for lost time,” he said.
Chakrabarti pointed at Facebook’s pledge last year to identify the backers of political advertisements – while also stressing the need to tread carefully, citing the example of rights activists who could be endangered if they are publicly identified on social media.
He also elaborated on the decision to let Facebook’s users rank the “trustworthiness” of news sources, saying: “We do not want to be the arbiters of truth, nor do we imagine this is a role the world would want for us.”
While acknowledging concerns over the rise of “echo chambers,” he argued that “the best deterrent will ultimately be a discerning public.”
Facebook’s plan to rank news organisations based on user “trust” surveys has drawn a mixed response. Renee DiResta of the non-profit group Data for Democracy was optimistic. “This is great news and a long time coming. Google has been ranking for quality for a long time, it is a bit baffling how long it took for social networks to get there,” she wrote on Twitter.
But technology columnist Shelly Palmer warned that Facebook appeared to be equating trust and truth with what the public believes – what some call “wikiality”. “Wikiality is Facebook’s answer to fake news, alternative facts, and truthiness,” Palmer wrote, adding, “Facebook, the social media giant, is going to let you rank the news you think is most valuable. What could possibly go wrong?”
For media writer Matthew Ingram, the changes “not only will not fix the problem of fake news but could actually make it worse instead of better”. “Why? Because misinformation is almost always more interesting than the truth,” he wrote in the Columbia Journalism Review.
News Corp founder and executive chairman Rupert Murdoch also expressed skepticism, suggesting Facebook should instead pay “carriage fees” to trusted news organisations, following the example of cable TV operators.
“I have no doubt that Mark Zuckerberg is a sincere person, but there is still a serious lack of transparency that should concern publishers and those wary of political bias at these powerful platforms,” Murdoch said in a statement issued by his group, which publishes the Wall Street Journal and newspapers in Britain and Australia.