Targeting anti-vaccine content: Why YouTube has taken this decision and what it entails

October 04, 2021 3:18 PM



Anti-vaccine content on YouTube: Last week, Google-owned video streaming service YouTube announced an expansion of its medical misinformation policies, adding new guidelines on vaccines. The new guidelines would cover COVID-19 vaccines as well as vaccines for other diseases. The development comes amid broader criticism that social media platforms are not doing enough to curb misinformation about COVID-19 and its vaccines.

As per the new policy, which came into effect last week, content claiming that approved COVID-19 vaccines cause illnesses such as cancer, infertility or autism, or that the vaccines contain substances that can track recipients, would be removed from the platform. Content falsely alleging that the approved vaccines are dangerous and lead to long-term health effects would also be removed, as would content claiming that the vaccines do not reduce transmission or contraction of the disease, and content spreading misinformation about the substances the vaccines contain.

YouTube stated that its Community Guidelines already contained provisions banning medical misinformation, including the promotion of harmful remedies such as claims that diseases can be cured by drinking turpentine. The platform also highlighted that it had been targeting misinformation about COVID-19 and other medical topics since the beginning of the pandemic, and that it had already removed more than 1.3 lakh videos that violated its COVID-19 vaccine policies.

As per YouTube, content that encourages using home remedies, prayers or rituals instead of consulting doctors in cases related to COVID-19 counts as misinformation, as does content claiming there is a guaranteed cure for COVID-19. Content promoting the use, effectiveness or safety of Ivermectin or Hydroxychloroquine as treatments for COVID-19, as well as content that generally discourages people from seeking medical help, also forms part of the misinformation the platform is targeting.

Targeting this kind of content is important because vaccine hesitancy remains widespread, and hesitant people often fall prey to conspiracy theories that are presented as facts across social media platforms.
