A Really Important Political Science Study About Gay Marriage Used Faked Data

An anti-gay rally on May 18, 2004, in Los Angeles, California. Photo: David McNew/Getty Images

Back in December, Science of Us, along with seemingly every other outlet that covers psychology research, reported that “A 20-Minute Chat With a Gay Person Made People Much More Supportive of Gay Marriage.” It sounds like an overblown headline, but the study in question, published in the top journal Science and co-authored by Michael LaCour, a grad student at UCLA, and Donald Green, a professor at Columbia, backed it up: Researchers found that when canvassers were sent out to engage with California voters on the question of gay marriage, they were able to substantially sway the voters’ opinions toward support for gay marriage, and the effect lasted for (at least) nine months and was contagious within the household. These big effects only took hold, however, when the canvasser him- or herself was gay and revealed as much during the conversation; straight canvassers had far less luck.

This was a striking finding for a number of reasons: Most studies that test ways of swaying people on hot-button political issues do so in contrived lab settings, and even when they find “successful” approaches, the effect sizes tend to be marginal. Here was a real-world study that showed a shockingly effective approach. There’s a reason it got coverage in the New York Times and a segment on “This American Life.”

But everyone was fooled. The data were faked, says Retraction Watch. There was a quick (by academic standards) and brutal snowball effect here: The site reports that two UC Berkeley grad students, David Broockman and Joshua Kalla, who were hoping to extend the original findings, noticed certain irregularities in the data. The duo then tried to get in touch with the survey firm LaCour and Green had used for their study, but the firm “claimed they had no familiarity with the project and that they had never had an employee with the name of the staffer we were asking for,” they write on a timeline of the events that they’ve posted here. Eventually, Green told the grad students “that LaCour has been confronted and has confessed to falsely describing at least some of the details of the data collection.” Green subsequently asked Science to retract the paper. LaCour was slated to start as a professor at Princeton in July, but Retraction Watch notes that mention of him has already been scrubbed from the university’s website.

We’re going to have to wait for more details to find out how LaCour was able to fool so many people, but this is going to go down as a major academic scandal. Part of the problem is that the finding fit rather neatly into other research on persuasion and political psychology: Psychologists and political scientists have long known that simply giving people facts about an issue usually won’t persuade them, and that more personal, emotionally compelling approaches work better. The California canvassing seemed to be exactly this sort of approach: the “This American Life” segment, in which a reporter tagged along with canvassers, contains heartbreaking scenes showing just how intimate these conversations were.

The scandal certainly casts a different light on what Andrew Gelman, a statistician at Columbia, wrote about the results back when the study was released:

A difference of 0.8 on a five-point scale … wow! You rarely see this sort of thing. Just do the math. On a 1-5 scale, the maximum theoretically possible change would be 4. But, considering that lots of people are already at “4” or “5” on the scale, it’s hard to imagine an average change of more than 2. And that would be massive. So we’re talking about a causal effect that’s a full 40% of what is pretty much the maximum change imaginable. Wow, indeed. And, judging by the small standard errors (again, see the graphs above), these effects are real, not obtained by capitalizing on chance or the statistical significance filter or anything like that.

And this got me wondering, how could this happen? After all, it’s hard to change people’s opinions, even if you try really hard. And then these canvassers were getting such amazing results, just by telling a personal story?
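(For anyone who wants to retrace that back-of-the-envelope arithmetic, here is a minimal sketch. The 2-point “realistic ceiling” is Gelman’s rough assumption about how far opinions could plausibly move on average, not a figure from the study itself.)

```python
# Back-of-the-envelope check of the reported effect size, following Gelman's reasoning.
# Assumptions (his, not the study's): opinions sit on a 1-5 scale, and because many
# respondents already start at 4 or 5, an average shift of about 2 points is roughly
# the most one could plausibly expect.
observed_shift = 0.8            # reported average change after one conversation
theoretical_max_shift = 5 - 1   # 4 points: the largest shift the scale allows at all
plausible_ceiling = 2.0         # Gelman's rough cap on any realistic average shift

share_of_ceiling = observed_shift / plausible_ceiling
print(f"{share_of_ceiling:.0%} of the maximum change imaginable")  # -> 40%
```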

Wow, indeed.
