The Problems With Facebook’s Polarization Study

Yesterday, Facebook released a study in Science that pushed back on the idea of the “filter bubble”: that social media creates a kind of echo chamber in which users never see arguments from the other side, helping to insulate those users from substantive political debate. Taken to its extreme, the filter bubble might almost completely close users off to new ideas and information, leaving them in a digital world where their viewpoints go ever unchallenged — and contributing to political polarization.

“Facebook Use Polarizing? Site Begs to Differ,” the New York Times headline read. “You would think that if there was an echo chamber, you would not be exposed to any conflicting information,” a data scientist who worked on the study said, “but that’s not the case here.”

But looking at the study, I came to very different conclusions. It absolutely shows that the “filter bubble” exists among some users, and that Facebook and its algorithms play a significant role in creating that bubble. But I can only make that claim about a small group of users who are likely not at all representative of the broader Facebook population, because Facebook relied on such an unusual sample. In other words, despite the buzz this study is getting, we still don’t have a very good sense of how Facebook and other social-media services might or might not contribute to polarization.

As Nathan Jurgenson points out, the study only looks at people who self-identify their political orientation on the site. That means it examines only 9 percent of users, a figure that, he notes, appears only in the study’s appendix. It also looks only at Facebook users who log in four to seven days a week and who meet a few other criteria. That nudges the proportion of users examined in the study down to just 4 percent, or about 10 million users.
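Here is a rough, back-of-the-envelope sketch of how that narrowing plays out. The ~250 million base figure is my own assumption, chosen only so that 4 percent lands near the roughly 10 million users in the study; the article itself cites only the percentages and “at least 200 million Americans.”

```python
# Back-of-the-envelope check of the sample narrowing described above.
# ASSUMPTION: the ~250 million U.S. accounts base is hypothetical, picked so
# that 4 percent comes out near the ~10 million users the study examined.

base_users = 250_000_000                 # assumed U.S. Facebook accounts (hypothetical)
self_identified = 0.09 * base_users      # ~9% list a political affiliation
final_sample = 0.04 * base_users         # ~4% also log in 4-7 days/week, etc.

print(f"Self-identified politically: ~{self_identified / 1e6:.0f} million")
print(f"Final study sample:          ~{final_sample / 1e6:.0f} million")
print(f"Share of base retained:      {final_sample / base_users:.0%}")
```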

Are those 4 percent of users representative of Facebook’s user base as a whole? Well, they aren’t randomly sampled. We know that identifying your political affiliation is a fairly rare behavior, given that fewer than one in ten users bothers to do so. My guess is that those users are much more politically engaged and likely much more partisan than the average user, and that’s probably going to affect what they click on and whom they are friends with. But how and how much that matters, I cannot say.

There could also be something different about those Facebook users who log in frequently — maybe they post more or leave more comments. Again, we don’t know. But the point is that if you want to make any broad conclusions about a big population based on a study, you need a random, representative sample. You can’t survey three rich guys in Greenwich and declare that “America’s” favorite food is caviar.

Social-media experts and data scientists are taking Facebook to task for not making all this clearer. “At first, I read this quickly and I took this to mean that out of the at least 200 million Americans who have used Facebook, the researchers selected a ‘large’ sample that was representative of Facebook users,” writes Christian Sandvig of the University of Michigan. “The ‘limitations’ section discusses the demographics of ‘Facebook’s users,’ as would be the normal thing to do if they were sampled. There is no information about the selection procedure in the article itself.”  

But even setting aside the sample issues, the study clearly does not show that those unusual users are exposed to a diverse set of viewpoints, nudged along by the Facebook algorithm. It shows that they see a fairly skewed set of viewpoints, with the Facebook algorithm contributing to the skew. Facebook filters out about 1 in 20 “cross-cutting” hard-news stories for conservatives and about 1 in 13 “cross-cutting” hard-news stories for liberals.

Facebook attempts to distance itself from filter-bubble accusations by noting that individuals isolate themselves, too: self-identified conservatives clicked on 17 percent fewer “cross-cutting” news stories than would be expected if they clicked at random, and liberals clicked on 6 percent fewer. The company uses that finding to argue that it plays less of a role than individuals themselves. But that’s only true for conservatives, not for liberals: for liberals, the algorithm’s suppression of roughly 1 in 13 cross-cutting stories (about 8 percent) outweighs the 6 percent drop from their own choices.

And it’s not clear that Facebook can or should be arguing that it plays a smaller filtering role than individuals, given how the study was conducted in the first place and given that the two findings do not seem directly comparable. “I cannot remember a worse apples to oranges comparison I’ve seen recently, especially since these two dynamics, algorithmic suppression and individual choice, have cumulative effects,” writes Zeynep Tufekci of the University of North Carolina.
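To see why the “cumulative effects” point matters, here is a minimal sketch. It treats the algorithm’s suppression (roughly 1 in 13 stories for liberals, 1 in 20 for conservatives) and the individual click-through reduction (6 and 17 percent) as independent, multiplicative filters — a simplification of my own for illustration, not something the study models.

```python
# Illustration of how algorithmic suppression and individual choice could compound.
# ASSUMPTION: treating the two reductions as independent, multiplicative filters
# is a simplification for illustration; the study does not model them this way.

def remaining_cross_cutting(algorithmic_cut: float, individual_cut: float) -> float:
    """Fraction of cross-cutting stories still reaching a click,
    if the two reductions stack multiplicatively."""
    return (1 - algorithmic_cut) * (1 - individual_cut)

# Figures cited in the article: the algorithm removes ~1 in 13 stories for
# liberals and ~1 in 20 for conservatives; users then click 6% / 17% less
# often than random clicking would predict.
for label, algo, choice in [("liberals", 1 / 13, 0.06),
                            ("conservatives", 1 / 20, 0.17)]:
    kept = remaining_cross_cutting(algo, choice)
    print(f"{label}: ~{1 - kept:.0%} fewer cross-cutting clicks overall")
```

Under that (assumed) multiplicative framing, the combined reduction is larger than either effect alone for both groups, which is the sense in which the two dynamics are cumulative rather than competing explanations.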

No, the filter bubble feels like a very real phenomenon, and Facebook has just shown that, for some users, the site itself contributes to it. On social media, we hear what we want to hear, see what we want to see, and click what we want to click. Don’t let Facebook tell you otherwise.
