At the moment, we are so deep in the fake-news morass that it’s hard to even envision a way out. One possibility, though, is old-school media literacy: Teach people the importance of only believing and sharing stories from high-quality news sources. After all, while people don’t trust “the media” in general, polls suggest Americans actually do trust major outlets like ABC, NBC, CBS, and The Wall Street Journal — in fact, only a quarter of Americans say they don’t trust those outlets. In theory, if you could better explain to Americans which outlets not to trust, that would help clean things up online.
Unfortunately, that probably wouldn’t work. In a new study, “Who Shared It?: How Americans Decide What News to Trust on Social Media,” a group of researchers show the rather alarming extent to which Americans ignore the source of a given claim, focusing way more on the trustworthiness of the person sharing it. For most people, it seems, that question is way more salient than whether the latest revelation about Hillary Clinton came from CNN versus PrisonPlanet.
For the study, the Associated Press teamed up with the American Press Institute and the independent research organization NORC at the University of Chicago to conduct an experiment on a nationally representative sample of 1,489 Americans. Participants were presented with a Facebook-like social-media post of an article headlined, “Don’t let the scale fool you: Why you could still be at risk for diabetes,” as well as the text of the article itself. Sometimes the story appeared to be published by the AP; sometimes, by a made-up organization called DailyNewsReview.com. The other variable was who shared the article: Sometimes, it appeared to have been shared by a public figure — ranging from Oprah to Dr. Oz to the surgeon general — the participant had previously identified as trustworthy; other times, by one they had previously identified as not trustworthy. After they read the article, the respondents answered a bunch of questions about whether the article got the facts right, expressed diverse points of view, was entertaining, or contained easy-to-find and trustworthy information, and then revealed how likely they would be to share it.
The key finding here is that “people who see an article from a trusted sharer, but one written by an unknown media source, have much more trust in the information than people who see the same article from a reputable media source shared by a person they do not trust.” Overall, the quality of the sharer mattered way more than the quality of the outlet, though there were exceptions, such as when a respondent was already predisposed to distrust CNN.
One clear takeaway from this study has to do with interface design. It could be that social-media platforms genuinely concerned about fake news need to do a better job emphasizing the importance of source rather than sharer. At the moment, many don’t do this. Facebook is a particularly bad culprit, argued the academic and tech writer Mike Caulfield in a blog post last November, after the election. As he wrote, “The headlines that float by you on Facebook for one to two hours a day, dozens at a time, create a sense of familiarity with ideas that are not only wrong, but hateful and dangerous. This is further compounded by the fact that what is highlighted on the cards … is not the source of the article, which is so small and gray as to be effectively invisible, but the friendly smiling face of someone you trust.”
Here, Caulfield includes a screengrab of a Facebook post to show what he means, but I’ll grab one from my own feed to make the same point:
I’m blacking out the names, but you can see that the identity of the sharer gets way more real estate than the source of the article. Facebook, in an attempt to get you to share more while remaining as neutral as possible on that whole “What is truth anyway, man?” thing, is — in effect — saying the loud part quietly and the quiet part loudly: “YOUR FRIEND BOB, WHO YOU TRUST A LOT AND GO TO CHURCH WITH, SAYS HILLARY CLINTON RUNS A SEX-TRAFFICKING OPERATION … (according to PatriotTruthNews.net).”
Which brings us to the broader point. Technically, this is a study about social media, but really it’s a study about how people decide which beliefs strike them as credible. And here it fits in neatly with a long line of research from political psychology, sociology, and other fields that keeps repeating the same lesson, over and over: Social ties matter a lot to how people form their beliefs. As Caulfield put it in his blog post, “for the most part, our brains equate ‘truth’ with ‘things we’ve seen said a lot by people we trust.’”
This is a lesson that is easy to forget if for some reason — maybe because you get paid to write about politics online — you view the world as an endless debate club in which each side is endlessly trying to convince the other through the sheer facty strength of their logic. But in reality, people tend to operate from a more gut-oriented place than that — they believe claims that feel like the sort of claims they should believe, or claims that come from people who are similar to them. These psychological tendencies stretch pretty far — all the way to the study of fringe religious movements and cults, in fact.
Now, in this particular case, the sharers were big-name celebrities, but the research suggests the same logic would apply to news shared by members of one’s social network. That is, if you’re the average American news consumer, and you see your friend Bob — who, again, you have spent a lot of time with, and who you know to be an honest and decent guy who shares your values — post an article on Facebook, it makes sense you’re going to view that article in a favorable light, regardless of the source. It actually isn’t particularly easy to separate credible from questionable news sources, and if you haven’t had any training or experience on that front, a Bob endorsement seems like as reasonable and convenient a proxy for article quality as anything else on social media.
At first glance, this new study would appear to be another strike against fact-checking. If Bob keeps sharing the story long after it’s been debunked — and we know this happens — people will still believe it. They look to the Bobs of the world for guidance, not to some distant arbiter of truth that claims authority but likely has an agenda of its own. (Paul Joseph Watson, just a few days ago: “Snopes is a bias [sic], far-left outfit. It is not a responsible ‘fact-checker.’”)
Luckily, the story is a bit more complicated than that: Recent research — albeit research conducted in fairly controlled settings, rather than the Wild West of the internet — suggests fact-checking isn’t as futile as some have made it out to be.
Overall, though, this study and a bunch of others emphasize the need to hammer home a simple point: When it comes to fake news, one’s identity — and the values and preferences that identity represents — matter way more than just about anything else. Any attempt to fix the fake-news problem needs to take this into account.