
Even People Who Know Better Fall for Lies If They Hear Them Enough

Photo: H. Armstrong Roberts/ClassicStock/Getty Images

The claim that “thousands” of Muslims in New Jersey celebrated after the terrorist attacks of September 11, 2001, has been disputed many times, and yet this hasn’t stopped Donald Trump from repeating it. Just this weekend on Meet the Press, Trump said: “I saw it on television. So did many other people. … And many people saw it in person. I’ve had hundreds of phone calls to the Trump Organization saying, ‘We saw it. It was dancing in the streets.’ I’m not going to take it back.” 

Maybe you have even read the many articles debunking this claim, but there's still a problem: If a statement is repeated often enough, it has a funny way of starting to seem true. This is what researchers call the illusory truth effect, first identified in 1977 by scientists at Villanova University and Temple University and replicated many times since, inside and outside the lab.

It had been largely assumed, however, that this effect would only work for topics about which people had no prior knowledge — that is, you can't trick someone into believing, say, that the largest ocean on Earth is the Atlantic when they know it's actually the Pacific. But, to the contrary, a recent paper published in the Journal of Experimental Psychology: General found that the illusory truth effect applies even to those who really should know better.

In one experiment, researchers — led by Lisa Fazio of Vanderbilt University — asked a group of undergrads to read through a list of sentences, some obviously true, some obviously false, and others trickier. For example: “The Pacific Ocean is the largest ocean on Earth.” (True.) “A date is a dried plum.” (False; a prune is a dried plum.) “Oslo is the capital of Finland.” (Also false, but depending on your knowledge of world capitals, perhaps not obviously so.) Next, the students were given another set of statements, but this time they were asked to rate each one on a scale of 1 to 6, with 1 meaning definitely false and 6 meaning definitely true. And last, they answered multiple-choice questions corresponding to the statements they’d just read.

Over at Pacific Standard, writer Tom Jacobs breaks down their results:

The researchers found that repeated falsehoods were more likely to be accepted as accurate, “regardless of whether stored knowledge could have been used to detect a contradiction.” To put it more bluntly: “Repetition increased perceived truthfulness, even for contradictions of well-known facts. … Reading a statement like ‘A sari is the name of the short pleated skirt worn by Scots’ increased participants’ later belief that it was true,” Fazio and her colleagues write, “even if they could correctly answer the question ‘What is the name of the short pleated skirt worn by Scots?’”

Fazio and her colleagues replicated this finding in a second experiment, and in their paper they nod to research by others that points to a potential reason why. “Recent work suggests that the ease with which people comprehend statements (i.e. processing fluency) underlies the illusory truth effect,” they write. “Repetition makes statements easier to process (i.e. fluent) relative to new statements, leading people to the (sometimes) false conclusion that they are more truthful.” In an email to Science of Us, Fazio agreed that these results may very well apply to statements made by politicians, who, after a while, may even begin to believe their own claims. “Trump is likely to believe the claim because he’s heard it so often, and the public is likely to start to believe it because Trump keeps repeating it,” Fazio said.

In short: As Meet the Press host Chuck Todd put it on Sunday, “Your words matter. Truthfulness matters. Fact-based stuff matters.” Indeed it does.
