Long Overdue, YouTube Bans Anti-Vax Activists and Misinformation

In a Sept. 29 blog post, YouTube announced the start of an effort to crack down on anti-vaccine activists and vaccine misinformation. The video-sharing platform will remove content claiming that vaccines cause autism or infertility, or that they contain tracking devices. Personal stories and claims about vaccines that are still being tested remain permitted.

Under its initial COVID-19 misinformation policy, the platform has already removed more than 130,000 videos. In the blog post, YouTube addressed how that policy informed its most recent decision on vaccines.

“We’ve steadily seen false claims about the coronavirus vaccines spill over into misinformation about vaccines in general,” they stated.

In addition to being one of the largest video-sharing platforms — which already carries enough weight on its own — YouTube has been identified as a source of a significant amount of vaccine misinformation. When anti-vaccine YouTube videos go viral, they make their way to anti-vaccine communities on other large social media platforms like Facebook and Twitter. In that sense, YouTube sits at the center of the social media network through which vaccine misinformation is distributed.

Given how large an impact YouTube and other social media corporations have in shaping culture and speech, these platforms have a responsibility to regulate misinformation and encourage truth. For years, researchers have maintained that increased vaccine misinformation and the accompanying discourse on social media lead to increased vaccine hesitancy. A 2020 BMJ Global Health study concluded that there’s a “significant relationship between organizations on social media and public doubts of vaccine safety.” Once fear from misinformation spreads, it is nearly impossible to undo the harm — very rarely do large corporations and media outlets follow up with stories to disprove claims.

While a policy banning vaccine misinformation is long overdue, YouTube’s announcement is still worth celebrating. The policy will limit the amount of misinformation on the platform and discourage anti-vaccine sentiment, which will hopefully encourage confidence in vaccine safety. Further, the announcement suggests that corporate social media is slowly making choices that will foster a more truth-filled culture and society.

It is not surprising that it took so long for YouTube to finally address vaccine misinformation. Social media corporations tend to make decisions that are socially or economically profitable. Remaining on the sidelines while misinformation goes viral can be highly profitable for these companies, as such stories often boost viewership. If these corporations were to remove the viral content, they would also remove any activity or discourse directly related to that content. Ultimately, this takes away opportunities for the companies to profit.

We saw this in Facebook and YouTube’s initial handling of COVID-19-specific misinformation late last year. It wasn’t until these platforms received widespread criticism and pressure for failing to regulate the misinformation that they finally acted to remove such content. By that point, COVID-19 anti-vaccine conspiracies had already made their way through the entire social network. Action was only truly taken once the companies’ reputations were hurt.

A separate but relevant example is when Twitter and Facebook allowed former President Trump to spread violent, false and discriminatory content — in the name of “free speech” — for the majority of his presidency. Fearing the loss of support and revenue, Facebook and Twitter made fruitless attempts to flag these posts or attach warning labels to them so they could avoid removing them outright and losing part of their audience. It wasn’t until the Capitol attack on Jan. 6 that both platforms finally banned him.

Despite how frustrating it can be to wait so long for social media platforms like YouTube to regulate misinformation, there is no dispute that YouTube’s newest policy is absolutely necessary and a step in the right direction. Celebrating such a decision will only encourage social media corporations to continue to promote a culture of truth.

Erika Cao is an Opinion Intern for the fall 2021 quarter. She can be reached at caoea@uci.edu.