In a new paper published in the Harvard Kennedy School Misinformation Review, researchers from the George Washington University, the University of Maryland and Johns Hopkins University assessed content from the most active vaccine-related accounts on Twitter. They found that even accounts with pro-vaccination views and higher public health credibility can become vectors of misinformation in the highly uncertain and rapidly changing environment caused by the COVID-19 pandemic.
The researchers sought to better understand how existing online communities contributed to an “infodemic” during the early stages of the pandemic. In February 2020, the World Health Organization warned that the growing infodemic – a deluge of both accurate and inaccurate health information – would pose a major challenge to effective health communication during the COVID-19 pandemic.
Of the 2,000 Twitter accounts assessed, the researchers found:
- Even well-meaning vaccine proponents shared unreliable information about COVID-19 and vaccines, though at a lower rate than vaccine opponents and other low-credibility sources. According to the researchers, the novel nature of the pandemic meant that emerging data often corrected initial content, making it possible for well-meaning, credible sources to post information that later proved false.
- Vaccine opponents shared the greatest proportion (35%) of unreliable information, including a mix of conspiracy theories, rumors and scams.
- Among both vaccine proponents and vaccine opponents, the largest single topic of conversation was “disease and vaccine narratives,” in which users compared COVID-19 with other diseases, most notably influenza. The researchers noted that these messages likely added to public confusion about COVID-19 and the seriousness of both the virus and the disease.
- Much of the misinformation came from actual people as opposed to bots.
- By focusing only on the most conspicuous forms of misinformation perpetuated by anti-vaccination and other low-credibility sources, such as blatant conspiracy theories, bot-driven narratives and known communities linked by conspiracist ideologies, scholars may fail to address the subtler types of falsehoods that could be shared more broadly.