Misinformation and conspiracy theories around Covid-19 vaccinations will be tackled by tech companies including Facebook, Twitter and Google, together with governments including those of the United Kingdom and Canada.
A new working group, formed by the British fact-checking charity Full Fact, will work out standards for tackling misinformation across digital platforms and how organisations can be held accountable for failing to do so.
“Bad information ruins lives, and we all have a responsibility to fight it where we see it,” said Full Fact’s chief executive, Will Moy. “The coronavirus pandemic and the wave of false claims that followed demonstrated the need for a collective approach to this problem.
“A coronavirus vaccine is now potentially just months away. But bad information could undermine trust in medicine when it matters most, and ultimately prolong this pandemic.”
In addition to the three technology companies, the partnership includes the UK's Department for Digital, Culture, Media and Sport and Canada's Privy Council Office. Fact-checkers from South Africa, India, Argentina and Spain, the Reuters Institute for the Study of Journalism, and the journalism non-profit First Draft are also involved.
Facebook will provide the initial funding, which Full Fact will use to draft the initial framework for January 2021. The two organisations have worked together before: Full Fact was the fact-checking partner for Facebook's anti-misinformation programme in the UK.
“Working together to tackle misinformation is really important, especially bad content around the Covid-19 pandemic right now,” said Keren Goldshlager, the head of integrity partnerships at Facebook.
“We’ve seen huge value in partnering with over 80 independent fact-checkers globally to combat misinformation in 60 languages. We welcome this effort to convene more tech companies, fact-checkers, researchers and governments to discuss and develop new strategies, so that we can work together even more effectively in the future.”
Vaccine misinformation has long been a challenge for social networks, even before the imminent introduction of a Covid-19 vaccine increased the importance and urgency of the issue. Although Facebook's founder, Mark Zuckerberg, led a $3bn charitable effort to "cure all diseases", Facebook freely allowed anti-vaccination content for years. The social media company relented slightly in March 2019, banning anti-vax ads that included misinformation about vaccines. Then, in October this year, the platform banned all anti-vax advertising except ads carrying a political message.
But the platform still allows "organic content": posts and groups advocating against vaccines. There is no explicit ban on misinformation in that category, although the company can flag such content for review by third-party fact-checkers.
YouTube has only recently taken serious action against vaccine misinformation. Google's video-sharing site announced a ban aimed specifically at misinformation about Covid vaccinations this October, a week after Facebook announced its policy changes on the issue.
(Adapted from TheGuardian.com)