Facebook Inc chief executive Mark Zuckerberg said in a Facebook post on Friday that the world's largest social media platform is taking a series of steps to weed out hoaxes and other false information, even as it faces withering criticism for failing to stem a flood of phony news articles in the run-up to the U.S. presidential election.
Facebook has long rejected the idea that it should be held responsible for the content its users circulate on the platform, insisting that it is a technology company, not a publisher. Just after the election, Zuckerberg called the notion that fake or misleading news on Facebook had helped swing the election to Donald Trump a “crazy idea”.
Last Saturday, Zuckerberg said that more than 99 percent of what people see on Facebook is authentic, calling fake news and hoaxes “only a very small amount” of the content.
But Zuckerberg sounded a decidedly different tone in his Friday post. Calling the problem complex both technically and philosophically, he said Facebook has been working on the issue of misinformation for a long time.
“While the percentage of misinformation is relatively small, we have much more work ahead on our roadmap,” Zuckerberg said.
He outlined a series of steps already underway, including greater use of automation to “detect what people will flag as false before they do it themselves.”
Facebook would also explore posting warning labels on content that has been flagged as false, work with third-party verification organizations and journalists on fact-checking efforts, and make it easier to report false content, he said.
The company will also try to prevent fake-news providers from making money through its advertising system, as it had previously announced.
Zuckerberg said Facebook would be very careful to avoid mistakenly restricting accurate content or discouraging the sharing of opinions. “We do not want to be arbiters of truth ourselves, but instead rely on our community and trusted third parties,” he said.
Zuckerberg said Facebook has historically relied on users to report links as false and to share links to myth-busting sites, including Snopes, in order to determine whether it can confidently classify stories as misinformation. The service also has extensive “community standards” on what kinds of content are acceptable.
Facebook faced international outcry earlier this year after it removed an iconic Vietnam War photo because of nudity, a decision the company later reversed. Media reports citing sources within the company have repeatedly said there have been extensive internal conversations over content controversies in recent months, and that the thorniest content issues are decided by a group of top executives at Facebook.
Fake news reports circulating ahead of the U.S. election included erroneous claims that a federal agent who had been investigating Democratic candidate Hillary Clinton was found dead, and that Pope Francis had endorsed Trump.
(Adapted from Reuters)