Facebook's removal of some COVID-19 vaccine misinformation didn't drive down user engagement with such content, likely because the platform's architecture still let users find and interact with misinformation that remained and allowed groups to boost one another's content or repost deleted material, suggests an analysis led by George Washington University researchers.
For the study, published today in Science Advances, researchers used the CrowdTangle content-monitoring tool to search for COVID-19 vaccine-related Facebook posts. They compared data from more than 200,000 posts made to pages and groups from November 15, 2019, to November 15, 2020, with posts to the same pages and groups from November 16, 2020, to February 28, 2022.
On November 18, 2020, Facebook announced its removal of "Stop Mandatory Vaccination," one of its largest anti-vaccine fan pages. On December 3, it announced that false claims about COVID-19 vaccines and accounts that repeatedly posted them would be deleted. Five days later, it broadened its policy to include vaccine misinformation in general. The researchers said the study is the first to scientifically evaluate the effects of Facebook's policy.
Policy shifted—not decreased—engagement
By February 28, 2022, Facebook had removed 49 pages (76% of them containing anti-vaccine content) and 31 groups (90% anti-vaccine). Anti-vaccine pages and groups were 2.1 times more likely than their pro-vaccine counterparts to have been removed. Five anti-vaccine groups (5%) changed their settings from public to private.
Post volumes on both anti-vaccine and pro-vaccine pages declined, with anti-vaccine page post volumes falling 1.5 times more than pro-vaccine volumes (relative risk [RR], 0.68). Posts in groups also fell, with anti-vaccine group post volumes declining 3.6 times more than their pro-vaccine counterparts.
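One way to square the two page-level figures, assuming the reported RR expresses the post-policy change in anti-vaccine post volume relative to the pro-vaccine change (an interpretation not spelled out here), is that they are roughly reciprocals:

$$\frac{1}{\mathrm{RR}} = \frac{1}{0.68} \approx 1.5$$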