In a year that has seen the largest measles outbreak in the US in more than two decades, the role of social media in giving a platform to unscientific anti-vaccine messages and organizations has become a flashpoint.
In the first study of public health-related Facebook advertising, newly published in the journal Vaccine, researchers at the University of Maryland, the George Washington University and Johns Hopkins University show that a small group of anti-vaccine ad buyers has successfully leveraged Facebook to reach targeted audiences. They also found that the platform’s efforts to improve ad transparency have actually led to the removal of ads that promote vaccination and communicate scientific findings.
The research calls attention to the threat posed by social media misinformation, which may contribute to growing “vaccine hesitancy,” ranked by the World Health Organization among the top threats to global health this year. This increasing reluctance or refusal to vaccinate threatens to reverse the progress made in halting vaccine-preventable diseases, such as measles, which has seen a 30% increase in cases globally.
The research team, co-led by UMD’s Dr. Sandra C. Quinn, GW’s Dr. David Broniatowski and JHU’s Dr. Mark Dredze, examined more than 500 vaccine-related ads served to Facebook users and archived in Facebook’s Ad Library. This archive, which became available in late 2018, catalogued ad content related to “issues of national importance.” Their findings reveal that the majority (54%) of the advertisements that opposed vaccination were posted by just two groups funded by private individuals, the World Mercury Project and Stop Mandatory Vaccination, and emphasized the purported harms of vaccination.
“The average person might think that this anti-vaccine movement is a grassroots effort led by parents, but what we see on Facebook is that there are a handful of well-connected, powerful people who are responsible for the majority of advertisements. These buyers are more organized than people think,” said Amelia Jamison, a faculty research assistant in the Maryland Center for Health Equity, and the study’s first author.
In contrast, the ads promoting vaccination came from a wide range of funders and shared no common or organized theme; most focused on encouraging a targeted population to get vaccinated against a specific disease. Examples included ads for a local Walmart’s flu vaccine clinic or the Gates Foundation’s campaign against polio.
Yet because Facebook categorizes ads about vaccines as “political,” the platform has rejected some pro-vaccine messages. “By accepting the framing of vaccine opponents – that vaccination is a political topic, rather than one on which there is widespread public agreement and scientific consensus – Facebook perpetuates the false idea that there is even a debate to be had,” said David Broniatowski, associate professor of engineering management and systems engineering at GW, and principal investigator of the study. “This leads to increased vaccine hesitancy, and ultimately, more epidemics.”
“Worse, these policies actually penalize pro-vaccine content since Facebook requires disclosure of funding sources for ‘political’ ads, but vaccine proponents rarely think of themselves as political. Additionally, vaccine opponents are more organized and more able to make sure that their ads meet these requirements.”
Facebook is a pervasive presence in the lives of many people, meaning its decisions about how to handle vaccine messaging have far-reaching and serious consequences, said Sandra Crouse Quinn, professor and chair of the Department of Family Science at UMD’s School of Public Health, and a principal investigator on the study.
“In today’s social media world, Facebook looms large as a source of information for many, yet their policies have made it more difficult for users to discern what is legitimate, credible vaccine information. This puts public health officials, with limited staff resources for social media campaigns, at a true disadvantage, just when we need to communicate the urgency of vaccines as a means to protect our children and our families,” said Quinn.
The researchers note that the data for this study was collected from Facebook’s Ad Archive in December 2018 and February 2019, before Facebook’s March 2019 announcement of updated advertising policies designed to limit the spread of vaccine-related misinformation. The study therefore provides a baseline for measuring how those policy changes affect the reach of ads from anti-vaccine organizations. The new standards, issued in response to the proliferation of anti-vaccination misinformation that coincided with measles outbreaks across the U.S. in early 2019, commit Facebook to blocking advertisements that contain false content about vaccines and to preventing advertisers from targeting ads to people “interested in vaccine controversies,” as they were previously able to do.
Yet the messengers may simply mutate their messages, virus-like, to avoid the tightening standards. “There is a whole set of ads that focus on themes of ‘freedom’ or ‘choice’ and that elude the Facebook rules around vaccine ads,” Broniatowski said.
Jamison says that the research team will continue to study how anti-vaccine arguments are spreading on Facebook and how the company is responding to demands from public health organizations to clean up its act.
“While everyone knows that Facebook can be used to spread misinformation, few people realize the control that advertisers have to target their message,” said Mark Dredze, a John C. Malone associate professor of computer science at Johns Hopkins. “For a few thousand dollars, a small number of anti-vaccine groups can micro-target their message, exploiting vulnerabilities in the health of the public.”