Over 30,000 Vaccine Misinformation Videos Deleted By YouTube In Last 5 Months

The social media platform YouTube has said it removed more than 30,000 videos containing misleading Covid-19 vaccine information in the last five months.

The videos were removed because they contradicted vaccine information from the World Health Organization (WHO) or health authorities such as the UK's NHS, a YouTube spokeswoman said.

The video-sharing platform banned videos spreading vaccine misinformation in October last year in an effort to curb attempts to discredit the vaccines.

The platform said it had removed more than 800,000 videos containing coronavirus misinformation over the past year. That figure covers not only vaccine misinformation but also wider “medically unsubstantiated” claims about the virus, the company said, including videos falsely claiming that the vaccines kill people, cause infertility, or contain a secret microchip to be implanted into recipients.

Many conspiracy theories about the Covid-19 pandemic, and even false claims of non-existent “cures”, were spread on YouTube in the early stages of the pandemic.

For YouTube and other social platforms, finding and deleting videos and other content containing misinformation about Covid-19 and its vaccines remains difficult despite bans on such content.

There has, however, been widespread criticism of how slowly social media companies have acted on harmful disinformation throughout the pandemic. In recent months, attention has turned to how much they have allowed falsehoods about the vaccines to proliferate on their platforms.

YouTube has generally acted early in implementing policies to deal with misinformation and harmful content.

Analysts say anti-vaccine content found online has had a real effect, making some people afraid to take the vaccine – which is touted as the only way to end the Covid-19 pandemic.

Making the situation worse is the use of sophisticated tactics by a minority of committed activists spreading harmful anti-vaccine content online, which makes the problem harder for social media platforms like YouTube to tackle.

Videos of people brandishing medical credentials in support of false vaccine claims can still be found on YouTube, and they can influence people who have already developed concerns about the vaccines. These kinds of videos have thrived on platforms such as YouTube – and a number of the main culprits still use the site to cultivate audiences of hundreds of thousands of subscribers.

(Adapted from BBC.com)
