
Influencers played outsized role in pushing anti-vax conspiracies

A new report from a broad range of disinformation experts finds influencers across many topics — wellness, politics and religion — were largely responsible for spreading viral anti-vaccination content in the U.S. over the past two years.

Why it matters: Influencers with large followings often introduced new and personal angles on familiar anti-vax tropes, making it difficult for social media companies to moderate their content without raising free speech concerns.

Details: The report, from the Stanford Internet Observatory, Graphika and several universities and institutes, found that a small set of recurring actors, including political leaders and celebrities, was often the most effective at spreading misinformation widely.

  • Many relied on tropes tied to other conspiracies, like QAnon, or religious narratives to seed new falsehoods about the COVID-19 vaccines.
  • One example is a tweet from Rep. Marjorie Taylor Greene (R-Ga.) and an interview with Kanye West, both baselessly arguing that the vaccine would be the “Mark of the Beast,” a biblical reference to a societally restrictive mark imposed on people by the Antichrist.

How it works: Analysts logged examples of vaccine-related misinformation as “tickets,” grouped the tickets by the narratives they used, and then measured the average weekly engagement (comments, retweets, likes, etc.) with those narratives.
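The grouping-and-averaging step described above can be sketched in a few lines of Python. This is an illustrative reconstruction, not the researchers' actual pipeline: the ticket fields, narrative labels and engagement numbers below are hypothetical.

```python
from collections import defaultdict

# Hypothetical "tickets": each logged example of misinformation carries a
# narrative label, the week it was observed, and an engagement count
# (comments + retweets + likes, etc.). Data is illustrative only.
tickets = [
    {"narrative": "safety", "week": 1, "engagement": 1200},
    {"narrative": "safety", "week": 2, "engagement": 800},
    {"narrative": "conspiracy", "week": 1, "engagement": 3000},
]

def average_weekly_engagement(tickets):
    """Group tickets by narrative, sum engagement per week,
    then average across the weeks each narrative appeared in."""
    weekly = defaultdict(lambda: defaultdict(int))
    for t in tickets:
        weekly[t["narrative"]][t["week"]] += t["engagement"]
    return {
        narrative: sum(weeks.values()) / len(weeks)
        for narrative, weeks in weekly.items()
    }

print(average_weekly_engagement(tickets))
# {'safety': 1000.0, 'conspiracy': 3000.0}
```

The same aggregation is what lets the report compare narratives (safety, effectiveness, conspiracy) by engagement share rather than by raw ticket counts.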

By the numbers: Overall, they found that roughly one-third of anti-vax messages analyzed in the study revolved around false claims that the COVID-19 vaccines are unsafe.

  • About 20% of the messaging centered on vaccine development and distribution (for example, vaccination infringing on an individual’s “health freedom” and distrust of companies that make vaccines).
  • Another 20% claimed vaccines were ineffective or unnecessary, and 20% focused on conspiracy theories about the vaccines.

The intrigue: Conspiracy theory influencers generated the largest share of engagement — 37% — across all anti-vax messaging throughout the study.

  • One reason for this could be that those influencers tend to live on apps where users are more committed and passionate about anti-vaccination as a topic.
  • Alternative social media platforms like Rumble, Gab, BitChute and Righteon, as well as encrypted messaging apps like Telegram, are popular with conspiracy theory influencers, many of whom have been de-platformed by mainstream apps.

Another reason conspiracies tend to engage people is that they are novel, said Renee DiResta, a principal researcher on the study and research manager at the Stanford Internet Observatory.

  • “There is a common core that feels familiar — a trope like bad guys have done a thing — but there is something novel about it that makes people sit up and pay attention,” DiResta said.

The big picture: The report suggests the U.S.’ biggest adversaries, including China, Russia and Iran, played a role in spreading anti-vaccination messages in the U.S.

  • Their involvement, which has been previously reported, suggests they see public health crises as an opportunity to sow discord and confusion in an attempt to undermine American democracy.

Between the lines: Domestic anti-vaccine influencers have greater reach, and likely greater impact, than foreign state actors in spreading falsehoods about COVID vaccines in the U.S.

  • “They are real and enjoy a large following and the trust of their audience. [That] makes them a far better messenger than a state media overtly making a claim,” DiResta said.
  • Because influencers drew from anti-vaccination narratives that pre-dated COVID, the report suggests public health experts, policy officials and social media platforms could better combat disinformation in the future by preemptively debunking — or “pre-bunking” — anti-vaccine rumors ahead of any future public health crises.

What to watch: The report authors recommend public health experts counter and address themes and tropes of misinformation rather than fact-checking individuals, and use personal stories — not data alone — for counter-messaging.
