COVID-19 vaccine misinformation is widespread and still spreading, with potentially dangerous health consequences. Fil Menczer, Luddy Distinguished Professor of Informatics at the Luddy School of Informatics, Computing and Engineering, and his fellow researchers at the Observatory on Social Media published a year-long Twitter study on the issue with clear goals: measure how much misinformation is circulating, identify who is spreading it, and find ways to stop them, saving lives and benefitting society.
Menczer was one of six authors of the paper, “One Year of COVID-19 Vaccine Misinformation on Twitter: Longitudinal Study,” which was published recently in the internationally renowned Journal of Medical Internet Research.
The journal is a leading venue focusing on digital health, data science, health informatics and emerging technologies for health, medicine and biomedical research.
According to the authors, vaccinations are critical in mitigating the impact of COVID-19 and other diseases. Past research by the Observatory indicates that exposure to vaccine misinformation on social media is associated with lower vaccination rates.
“If you’re misinformed about health,” Menczer says, “you might do things that are harmful to your health. If you believe a vaccine is dangerous when it is safe, you might choose not to vaccinate yourself, putting yourself and the people you come in contact with at risk of becoming sick and dying.”
The researchers analyzed almost 300 million English-language tweets related to COVID-19 vaccines. They identified the source of each tweet, including those from verified and automated Twitter accounts, and classified the sources as either low-credibility or mainstream news. They also considered suspicious YouTube videos posted on Twitter.
Their findings showed a relatively small amount of low-credibility information compared with mainstream news overall, but the most popular low-credibility sources were shared as widely as mainstream sources, and more widely than credible sources such as the US Centers for Disease Control and Prevention and the World Health Organization.
The amount of low-credibility vaccine news increased during the year, especially YouTube videos. About 800 “superspreaders” generated about 35 percent of the misinformation shares, and the top superspreader, Robert F. Kennedy Jr., alone generated 13 percent of those retweets. Automated accounts were more likely to share low-credibility news and suspicious YouTube videos.
Kennedy is an American environmental lawyer and author known for his anti-vaccination and conspiracy-theory views. He is a son of the late U.S. senator Robert F. Kennedy and a nephew of the late President John F. Kennedy, and is considering running for U.S. president in the 2024 election.
“These accounts spread misinformation that many people are exposed to and reshare,” Menczer says. “When that happens, more people are hesitant, fewer people get vaccinated and more people die.”
The study confirmed that misinformation superspreaders are driven by financial gain: the more misinformation they spread, the more money they make. The study recommends reducing the online visibility of such spreaders, especially during public health crises.
Menczer says identifying superspreaders and their potential harm could create enough pressure for platforms such as Twitter and Facebook to regulate them.
“Unfortunately, platforms are moving in the opposite direction,” he says. “They are reinstating accounts they had suspended for violating their policies on hate speech and dangerous health misinformation. They are not removing them. Our research shows that’s harmful.”
Another author, Matthew DeVerna, one of Menczer’s informatics Ph.D. students, says moderating superspreader content would be easier than moderating all users. But prominent public figures are less likely to be regulated, he says, because platforms fear a public backlash from those figures’ large audiences on and off the platform.
“This type of backlash is bad for any social media company’s bottom line,” DeVerna says. “This dynamic creates a systematic misalignment between what’s best for the public and what’s best for social media companies’ incentives.”
Reports also show that prominent Facebook accounts receive similar preferential treatment.
Menczer and his students are working on models to provide specific evidence that misinformation causes people to forgo vaccination, resulting in higher infection rates.
“We can find an association between misinformation and falling vaccination rates,” Menczer says, “but a causal link is harder to prove.”
DeVerna says they are developing a website that will publicly identify and display the worst superspreaders on Twitter and Facebook each month. The hope, he says, is to increase pressure on platforms to hold users accountable while providing a useful source of research data.
Researchers can only do so much to fix the problem, Menczer says.
“Policies change when the public and policymakers become aware. Our job as researchers is to highlight and quantify problems. It’s the job of citizens to be informed, act accordingly and talk to their representatives. It’s the policymakers’ job to force or nudge platforms to at least enforce their own policies.”