conspiracy theory

Inside Twitter’s Marathon-Bombing Failure

Residents flee from an area where a suspect is hiding on Franklin St., on April 19, 2013 in Watertown, Massachusetts.

The first time I was ever fully fixated on Twitter came in the wake of the 2013 Boston Marathon bombings. It was a surreal few days — the murder of an MIT cop, a shootout in the quiet streets of Watertown, a fugitive terrorist found hiding under a boat — and substantive information was hard to come by. All of this was occurring in my hometown, and I was stranded hundreds of miles away at grad school, so like a lot of people I was desperate for any sort of news or context.

Twitter delivered, in a sense: It offered up an incessant running stream of information as everyone tried to figure out what the hell was going on. Much of this information was false, though, and the app ended up serving as a mega-megaphone for a number of false rumors ranging from the relatively benign to the darkly and nonsensically conspiratorial.

A paper presented at last March’s iConference in Berlin (hat tip to Brendan Nyhan on, well, Twitter) examined three rumors that flared in the wake of the bombings, combing through a large corpus of tweets to determine, for each rumor, the ratio of tweets (and retweets) spreading it as fact to tweets rebutting it. In each case, the results are bad news for anyone who thinks Twitter doesn’t have a rumor and conspiracy problem:

- The rumor that an 8-year-old girl was killed while running the marathon had 90,668 misinformation tweets and 2,046 correction tweets, for a ratio of 44:1.

- The rumor that Sunil Tripathi, a 22-year-old Brown University student who had gone missing at the time of the marathon, was one of the suspects the FBI identified in the grainy photographs it released to the public (he was later found dead) had 22,819 misinformation tweets and 4,485 correction tweets, for a ratio of 5:1.

- The rumor that mysterious individuals wearing attire with Navy SEAL, Blackwater, or Craft Security insignias were spotted in the vicinity of the explosions, suggesting the blasts were a false-flag attack perpetrated by the U.S. government, had 3,793 misinformation tweets and 212 correction tweets, for a ratio of 18:1.

(Irresistible aside: Whenever I come across that last one, I can’t help but reflect on how impressively insane a conspiracy theory it is. Alex Jones and the others who irresponsibly spread it almost immediately after the attacks are effectively arguing that, on the one hand, the government had the means and the will to carry out a horrific false-flag attack for reasons unknown, but, on the other, lacked the foresight not to send its operatives to the scene of the crime in uniform.)

The lesson, as always, is that Twitter’s interface (particularly the ease of retweeting), combined with humanity’s natural tendencies toward rumor-spreading and conspiracy theorizing, means that even though the platform has many benefits, it can also very quickly and very easily degenerate into a tangled, fetid swamp of misinformation — especially in emotionally charged times of uncertainty.