Why People Believe Meaningless Bullsh*t


Bullshit, as the philosopher Harry Frankfurt defined it in his influential 2005 book On Bullshit, isn’t the same thing as a lie. When a lie is told, the speaker knows it to be untrue. When bullshit is spewed, on the other hand, the speaker simply doesn’t care if it’s true. So if I were to tell you, right now, “I’m typing this article up on a typewriter rather than a laptop,” that would be a lie. If I were to say something like, “Some of the components in my computer were manufactured in Taiwan,” that would qualify as bullshit because I didn’t check beforehand and didn’t care about the veracity of the statement before uttering it.

We are currently in a golden age of bullshit. The internet is awash with unchecked claims. One realm of the discourse that’s particularly conducive to this type of nonsense is the world of spirituality, alternative medicine, and the like. Poke around there, and you’ll see a lot of claims — “Quantum consciousness!” “Like cures like!” — that seem to be made without a great deal of regard for their truth value.

In a recent study in Judgment and Decision Making, a team led by Gordon Pennycook, a Ph.D. student in psychology at the University of Waterloo, dug into (sorry) one particularly common type of bullshit: “pseudo-profound bullshit,” or “seemingly impressive assertions that are presented as true and meaningful but are actually vacuous.” Pseudo-profound bullshit is all over the place these days, especially in those less rationality-based corners of the discourse.

Over the course of four studies, Pennycook and his colleagues asked hundreds of people, both students and Amazon Mechanical Turk workers, to rate a variety of statements on a “profundity scale” from 1 (not at all profound) to 5 (very profound). Some of the statements were drawn from wisdomofchopra.com, a humorous website that randomly generates profound-sounding but meaningless insights in the style of the hugely popular and not exactly fact-obsessed spiritual guru Deepak Chopra (“A single particle is the ground of incredible actions”). Others were culled from the vaguest (as judged by the researchers) of Chopra’s actual tweets, and still others were included as examples of statements the researchers considered at least a little bit profound and meaningful, such as “A river cuts through a rock, not because of its power but its persistence.”

The researchers turned the results into a Bullshit Receptivity (BSR) scale that gauges an individual’s propensity to believe in bullshit: the more profound a respondent judged the meaningless statements to be, the higher their BSR. They also collected data about respondents’ intelligence (in the form of brief tests designed to gauge math and verbal skills), beliefs in the paranormal, religiosity, and certain other characteristics.
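For the curious, here is a minimal sketch of how a score like this can be computed. The item texts and ratings below are invented for illustration rather than taken from the study, but the basic idea is that a respondent’s BSR is essentially their average profundity rating across the meaningless items, with ratings of genuinely meaningful statements tracked separately as a point of comparison.

```python
# Minimal sketch (not the study's actual scoring code) of a Bullshit
# Receptivity (BSR) score: the mean profundity rating, on a 1-5 scale,
# that a respondent gives to meaningless, randomly generated statements.
# Item texts and ratings here are illustrative placeholders.

from statistics import mean

# Hypothetical ratings from one respondent
# (1 = not at all profound, 5 = very profound).
bullshit_item_ratings = {
    "Hidden meaning transforms unparalleled abstract beauty": 4,
    "A single particle is the ground of incredible actions": 2,
    "Each of us gives rise to existential bliss": 5,
}

# Genuinely meaningful statements are rated too, but scored separately,
# so a high BSR reflects credulity toward vacuous items specifically.
meaningful_item_ratings = {
    "A river cuts through a rock, not because of its power but its persistence": 4,
}

bsr = mean(bullshit_item_ratings.values())
print(f"BSR (mean rating of meaningless items): {bsr:.2f}")
print(f"Mean rating of meaningful items: {mean(meaningful_item_ratings.values()):.2f}")
```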

Summarizing the results, Pennycook, et al., write that “[people] more receptive to bullshit are less reflective, lower in cognitive ability (i.e., verbal and fluid intelligence, numeracy), are more prone to … conspiratorial ideation, are more likely to hold religious and paranormal beliefs, and are more likely to endorse complementary and alternative medicine.”

This makes some degree of intuitive sense — to the extent all these characteristics can be swept under the same umbrella, that umbrella would be marked “not very skeptical.” Those who don’t instinctively question the information presented to them might see a statement like “Each of us gives rise to existential bliss” — yes, I generated that one using wisdomofchopra.com — and think, Whoa, that’s pretty deep. Those who are more skeptical, on the other hand, are more likely to, well, call bullshit.

Where things get really complicated, and less lighthearted, is the question of how receptivity to pseudo-profound bullshit relates to dangerous or misleading medical and scientific beliefs. It makes sense on its face that someone who believes in the wisdom of Chopra might also be more vulnerable to, say, a viral website propagating an unproven claim about using coconut oil to treat Alzheimer’s, but there’s just no way to know for sure without doing more research.

Pennycook said it’s unclear exactly how to get people to stop believing in pseudo-profound bullshit. “I usually emphasize the importance of critical thinking, but this isn’t a novel insight,” he said in an email. “We can’t do much to increase our intelligence, but we can make an effort to be more reflective about the information that we come across on a day-to-day basis. Naturally, given the sheer quantity of information that we come across nowadays, it might be necessary to be particularly diligent when it comes to things of particular import (e.g., that relate to health and well-being).”

If there is a straightforward way to sway people away from this sort of credulity, it will probably involve nudging them out of so-called “System 1” thinking, which relies on quick, gut-level judgments, and into “System 2” thinking, which involves slower, more deliberate reflection. What if everyone made a habit of identifying the writers or thinkers they most agree with and, every so often, taking a statement or article or book of theirs and asking, “How would I debunk this if I were skeptical of it?”

Basically, the goal here should be to get people to slow down and more carefully examine the information being presented to them. To scan it, in other words, for bullshit.