A systematic review by Brown University researchers of studies on countering COVID-19 misinformation finds that only 18% included public health measures such as vaccination, and some seemed to give credence to conspiracy theories and other unproven claims.
The review also revealed challenges in studying health misinformation, including the need for more research on measures aimed at stemming video-based misinformation. The team also said that inconsistent definitions of misinformation make intervention studies hard to evaluate.
In the study, published this week in Health Affairs, the authors analyzed 50 articles published from January 2020 to February 2023, primarily from the United States, covering 119 interventions. They also mapped the indicators used to evaluate the effects of an intervention and the types of misleading information shown to participants in the studies.
The articles included intervention-based peer-reviewed studies, gray literature (eg, Organization for Economic Cooperation and Development papers), conference papers, and preprint studies.
The researchers noted that governments, health officials, and social media platforms have attempted to contain the spread of COVID-19 misinformation but that the success of those efforts is unclear.
Vaccine microchips, saltwater gargling
The most often-tested interventions were passive inoculation (educating people about misinformation before they are exposed to it; 22%), debunking (exposing falsehoods; 18%), user correction (peer-to-peer efforts that expose untruths; 18%), accuracy prompts (15%), and warning labels (8%).
The authors identified 47 unique outcome measures, the most common of which were perceived accuracy of misinformation, willingness to share misinformation, willingness to share factually correct information, sharing discernment, and intent to vaccinate.
Only 18% of studies included public health mitigation measures such as COVID-19 vaccination, and the misinformation against which those measures were tested included falsehoods (eg, vaccines contain microchips) and spurious claims (eg, gargling with saltwater prevents infection).
The three most common methods of spreading misinformation were a combination of text and images (59%), text only (32%), and video only (4%). The types of misinformation were related to COVID-19 prevention, treatment, and diagnostics (52% of interventions); vaccines (42%); and politics and economics (35%).
The three most-deployed topics of vaccine misinformation were safety, efficacy, and necessity (36%), conspiracy theories (9%), and morality and religion (8%).
Of 19 debunking interventions, 84% improved participants’ beliefs and accuracy judgments. Although their effectiveness was supported by less evidence, warning-label and combined interventions improved beliefs and accuracy judgments in 67% and 60% of the measured instances, respectively. Passive inoculation and user correction had little effect on beliefs or accuracy judgments.
Fifteen of 18 accuracy prompts (83%) improved participants' information-sharing habits. Of 11 user-correction interventions, only 4 (36%) boosted the quality of information shared.
Variables that attenuated the positive impacts of interventions included low levels of threat perception, education, and positive vaccine attitudes; higher levels of reactance, religiosity, preexisting belief in misinformation, and political conservatism also lowered the interventions' effectiveness.
Bad information cost lives
"We found evidence supporting accuracy prompts, debunks, media literacy tips, warning labels, and overlays in mitigating either the spread of or belief in COVID-19 misinformation," the researchers wrote. "However, by mapping the different characteristics of each study, we found levels of variation that weaken the current evidence base."
The researchers noted that misinformation can erode public trust in scientists, which complicates disease-control efforts and will likely negatively affect the ability to respond to public health problems such as climate change and future outbreaks.
"Research has shown that misinformation shared online during the COVID-19 pandemic contributed to people behaving in ways that increased transmission and mortality, such as not wearing masks, forgoing vaccination, or relying on ineffective alternative medicines to treat infection," they wrote. "These dynamics affected public health efforts to protect communities from COVID-19 and ultimately cost lives."
To better understand interventions' effects and make evidence actionable for public health, "the field urgently needs to include more public health experts in intervention design and to develop a health misinformation typology; agreed-upon outcome measures; and more global, more longitudinal, more video-based, and more platform-diverse studies," the authors concluded.