Report spotlights 52 US doctors who posted potentially harmful COVID misinformation online


Two new studies describe two sources of the COVID-19 "infodemic" on social media: US physicians, and proponents and practitioners of "doing your own research."

Vaccine untruths, conspiracy theories

A mixed-methods study published yesterday in JAMA Network Open finds that 52 physicians practicing in 28 different specialties across the United States propagated COVID-19 misinformation on vaccines, masks, and conspiracy theories on social media and other online platforms from January 2021 to December 2022.

Researchers at the University of Massachusetts Amherst used Centers for Disease Control and Prevention (CDC) guidelines on COVID-19 prevention and treatment to define misinformation. They also performed structured searches of high-use social media platforms (Twitter, Facebook, Instagram, Parler, and YouTube) and news outlets (the New York Times and National Public Radio) to identify physician-communicated misinformation.


Of the 52 physicians who spread misleading COVID-19 information, all but two (who were researchers) were or had been licensed to practice medicine in the United States, and nearly a third were affiliated with groups with a history of spreading medical misinformation, such as America's Frontline Doctors. The most common specialty was primary care, at 36%.

Of the 52 physicians, 80.8% posted false vaccine information, 76.9% passed on more than one type of misinformation in more than one category, 38.5% posted falsities on at least five platforms, and 76.9% appeared on five or more third-party online platforms such as news outlets. Twitter was the most common platform, where 71.2% of the doctors spread misinformation and had a median of 67,400 followers.

Major themes were disputing COVID-19 vaccine safety and effectiveness, promoting non–evidence-based medical treatments or those lacking Food and Drug Administration (FDA) approval for this indication (eg, ivermectin, hydroxychloroquine), disputing mask effectiveness, and other unproven claims on topics such as the origin of SARS-CoV-2, government coverups, drug company profit motivations, and other conspiracy theories. Many posts were based on patient anecdotes and data from low-quality medical journals.

Promoting fear and distrust of the vaccine and reliance on "natural" immunity were frequent subthemes. Examples of the unfounded claims were that the vaccine causes infertility, permanently damages the immune system, and increases the risk of chronic disease in children; other posts overstated the risk of myocarditis (inflammation of the heart muscle).

"A common approach included circulating counts of positive case rates by vaccination status, claiming that most positive cases were among vaccinated individuals," the researchers wrote. "This claim is technically true but misleading, as many more people are vaccinated, and the proportion of unvaccinated people who are infected is much higher."
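The base-rate effect the researchers describe is easy to see with hypothetical numbers (all figures below are illustrative assumptions, not data from the study): when most of the population is vaccinated, vaccinated people can account for a large share of positive cases even though their individual risk is far lower.

```python
# Hypothetical population in which 80% of people are vaccinated.
vaccinated = 800_000
unvaccinated = 200_000

# Assumed case counts: a 1% attack rate among the vaccinated
# and a 4% attack rate among the unvaccinated.
cases_vaccinated = 8_000
cases_unvaccinated = 8_000

# Half of all positive cases are among vaccinated people...
share_vaccinated = cases_vaccinated / (cases_vaccinated + cases_unvaccinated)

# ...yet the per-person infection risk is about 4x higher if unvaccinated.
risk_vaccinated = cases_vaccinated / vaccinated
risk_unvaccinated = cases_unvaccinated / unvaccinated

print(f"share of cases among vaccinated: {share_vaccinated:.0%}")
print(f"risk ratio (unvaccinated vs. vaccinated): {risk_unvaccinated / risk_vaccinated:.1f}")
```

With these made-up numbers, "half of positive cases are vaccinated" is literally true, yet an unvaccinated person is four times as likely to be infected, which is exactly the misleading framing the quoted passage describes.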

Falsehoods may have contributed to a third of US deaths

The authors predicted that the elimination of safeguards against misinformation on Twitter (now X) and the absence of federal laws regulating medical misinformation on social media would allow the spread of non–evidence-based information to persist or even increase.

The authors said that while medical misinformation was spread long before the COVID-19 pandemic, the internet boosts the reach and speed of dissemination, potentially exacerbating the consequences.

They noted that about a third of the more than 1.1 million reported COVID-19 deaths in the United States as of January 18, 2023, were considered preventable if public health recommendations such as vaccination and physical distancing had been followed.

"COVID-19 misinformation has been spread by many people on social media platforms, but misinformation spread by physicians may be particularly pernicious," the authors wrote. "This study's findings suggest a need for rigorous evaluation of harm that may be caused by physicians, who hold a uniquely trusted position in society, propagating misinformation; ethical and legal guidelines for propagation of misinformation are needed."

'Do your own research' rooted in conspiracy theories

A study conducted by two University of Wisconsin (UW) and University of Michigan (UM) researchers suggests that promotion of "doing your own research" (DYOR) rather than relying on evidence-based COVID-19 information may reflect anti-expert attitudes instead of beliefs about the importance of critically evaluating data and sources.

The research was published in the Harvard Kennedy School's Misinformation Review.


In December 2020 and March 2021, the investigators analyzed data from a YouGov panel of about 1,000 DYOR proponents. After controlling for the type of media consumed, the researchers found that these respondents grew more distrustful and misinformed even as news of successful COVID-19 vaccine trials emerged, although their COVID-19 concerns didn't dissipate. About a third of respondents had at least a bachelor's degree.

The investigators found that people who supported DYOR were likely to distrust scientists and instead believe COVID-19 misinformation.

The analysis found that DYOR explained only about 1% of the variance in both trust in science and COVID-19 misperceptions after controlling for prior levels of the dependent variables, suggesting that its effects are small but may accumulate over time.

In a UW press release, UW coauthor Sedona Chinn, PhD, said she and UM coauthor Ariel Hasell, PhD, often heard the phrase "do your own research," even before the pandemic, "coming from a lot of online, anti-vaccine rhetoric."

DYOR initially gained popularity in the 1990s as a slogan of Milton William Cooper, who promoted conspiracy theories on topics such as UFOs, the assassination of President John F. Kennedy, and the AIDS epidemic. The movement picked up steam in the 2010s with anti-vaccine activity.

Political rather than scientific goals

Use of the phrase grew quickly starting in 2020, Chinn said, "popularized by Q-Anon and other conspiratorial groups, in more extreme and more dangerous ways. Now, we're following what seem more like connections to certain political views than calls for more and better scientific research."

The researchers also both knew people "who occasionally do weird, unproven stuff, typically around health," Chinn said. "It's not like they reject doctors and medical expertise, but they think their opinion can be equally valid if they do their own research."

Chinn said that, outside those contexts, encouraging people to do their own research is generally good advice. "There's a lot of research showing that people who do more information seeking about politics are more civically engaged, and people who do more information seeking about their health conditions have better treatment outcomes," she said.
