
Researchers from UN Global Pulse, the United Nations' innovation lab, found no strong evidence that the YouTube video-sharing platform promoted anti-vaccine content during the COVID-19 pandemic.
"However, the watch histories of users significantly affect video recommendations, suggesting that data from the application programming interface or from a clean browser do not offer an accurate picture of the recommendations that real users are seeing," the researchers wrote of the findings, published late last week in the Journal of Medical Internet Research.
The team asked World Health Organization (WHO)-trained participants and workers from the Amazon Mechanical Turk crowdsourcing platform to find an anti-vaccine video in the fewest clicks, starting from a WHO COVID-19 video. They compared the recommendations these users saw with the related videos and recommended "up-next" videos shown in clean browsers without tracking software.
Using machine-learning methods, the researchers then identified anti-vaccine content among 27,074 video recommendations.
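To make the classification step concrete, the sketch below shows how a share of flagged recommendations could be computed once each video has a label. This is purely illustrative: the keyword check is a toy stand-in for the study's actual machine-learning classifier, and the video titles are invented, not drawn from the paper's data.

```python
# Toy stand-in for the study's trained classifier; the cue phrases and
# example titles below are invented for illustration only.
ANTI_VACCINE_CUES = {"hoax", "big pharma cover-up", "dangerous jab"}

def is_anti_vaccine(title: str) -> bool:
    """Flag a video title containing an anti-vaccine cue phrase."""
    lowered = title.lower()
    return any(cue in lowered for cue in ANTI_VACCINE_CUES)

def anti_vaccine_proportion(titles: list[str]) -> float:
    """Share of recommendations flagged as anti-vaccine content."""
    flagged = sum(is_anti_vaccine(t) for t in titles)
    return flagged / len(titles)

recommendations = [
    "WHO COVID-19 vaccine Q&A",
    "The vaccine hoax they won't tell you about",  # would be flagged
    "10-minute home workout",
    "Understanding mRNA vaccines",
]
print(f"{anti_vaccine_proportion(recommendations):.0%}")  # prints 25%
```

In the study itself, this kind of proportion was computed over 27,074 recommendations and, on average, stayed below 6%.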
Promoted anti-vaccine content stayed under 6%
There was no evidence that YouTube funneled users toward anti-vaccine content, with the average proportion of such videos staying below 6%. Instead, YouTube's algorithms pointed users to non–vaccine-related health content.
"The videos that users were directed to were longer and contained more popular content, and attempted to push a blockbuster strategy to engage users by promoting other reliably successful content across the platform," lead author Margaret Yee Man Ng, PhD, said in a University of Illinois at Urbana-Champaign (UIUC) news release. Ng is also a journalism professor at UIUC.
The authors noted that there have been significant concerns about the contribution of social media to vaccine hesitancy throughout the pandemic. For example, Facebook removed some COVID-19 misinformation starting in 2020, but a recent study revealed that users just found other ways to access and spread it.
"YouTube has stated its commitment to removing content that contains misinformation on vaccination," the researchers wrote. "Nevertheless, such claims are difficult to audit. There is a need for more empirical research to evaluate the actual prevalence of antivaccine sentiment on the internet."