The biggest challenge for public-health officials fighting anti-vaccine hysteria is just how much bad information is out there. Anyone can sit down, type “vaccine risks” into Google, and wade through the resulting terabytes of paranoia and bad science. And the average person, busy with a million other responsibilities, isn’t always in a position to separate rigorous facts about vaccination (namely, the overwhelming scientific consensus that vaccines don’t cause autism and are quite safe in general) from flimsy “facts.”
A new paper in The Journal of Advertising shows how depressingly easy it is for scientifically sound information about vaccines to get swept away in a maelstrom of misinformation. The authors, led by Ioannis Kareklas of Washington State University, showed study participants two fake online PSAs about vaccination — one pro-, one anti-.
Here’s how the first went down, as explained by the study’s press release.
Participants were led to believe that the pro-vaccination PSA was sponsored by the U.S. Centers for Disease Control and Prevention (CDC), while the anti-vaccination PSA was sponsored by the National Vaccine Information Center (NVIC). Both PSAs were designed to look like they appeared on each organization’s respective website to enhance validity.
The PSAs were followed by comments from fictitious online commenters who expressed either pro- or anti-vaccination viewpoints. Participants weren’t told anything about who the commenters were, and unisex names were used to avoid potential gender biases.
The researchers then gave the participants a questionnaire designed to gauge their opinions about vaccines and the likelihood they’d vaccinate their kids. The results “kind of blew us away,” said Kareklas in the press release. “People were trusting the random online commenters just as much as the PSA itself.” In a follow-up experiment, participants were told the online commenters were either a doctor or a member of one of two other professions. Not surprisingly, they were swayed more by doctor-commenters than by other sorts of commenters.
As with any study about changing people’s opinions in a lab setting, a bit of caution is warranted: Because the researchers didn’t conduct a longer-term follow-up, there’s no way to know how long these effects lasted; and even if they had, it would have been hard to separate the influence of what participants saw in the lab from that of other messages they may have come across later on.
But still: This study is a reminder of the massive canyon between people who work in the scientific community and many of those who don’t. All the notions of authority, credibility, and skepticism that public-health researchers hold dear simply don’t matter in the wildlands of the internet, where millions of versions of “the truth” flit about.