Anti-vaccine content on YouTube: Last week, Google-owned video streaming service YouTube announced an expansion of its medical misinformation policies, adding new guidelines on vaccines. The new guidelines cover both COVID-19 vaccines and vaccines for other diseases. The development comes at a time of widespread criticism that social media platforms are not doing enough to curb misinformation about COVID-19 and its vaccines.
As per the new policy, which came into effect last week, any content claiming that approved COVID-19 vaccines cause illnesses like cancer, infertility or autism, or that the vaccines contain substances that can track recipients, will be removed from the platform. Content making false allegations that the approved vaccines are dangerous or lead to long-term health effects will also be removed, as will content claiming that the vaccines do not reduce transmission or contraction of the disease. Content that spreads misinformation about the substances contained in the vaccines will likewise be taken down.
YouTube stated that its Community Guidelines already contained provisions banning medical misinformation, including the promotion of harmful remedies such as claims that diseases can be cured by drinking turpentine. The platform also highlighted that it had been targeting misinformation around COVID-19 and other medical topics since the beginning of the pandemic, stating that it had already removed more than 1.3 lakh videos that violated its COVID-19 vaccine policies.
As per YouTube, content that encourages the use of home remedies, prayers or rituals instead of consulting doctors in cases related to COVID-19 counts as misinformation, along with content claiming there is a guaranteed cure for COVID-19. Apart from that, content promoting the use, effectiveness or safety of Ivermectin or Hydroxychloroquine as a treatment for COVID-19, as well as content that generally discourages people from seeking medical help, also forms part of the misinformation the platform is targeting.
Targeting this kind of content is important because vaccine hesitancy is widespread, and people often fall prey to conspiracy theories that are presented as facts across social media platforms.