bad science

The University of Maryland Has a Burgeoning Chocolate-Milk Concussion Scandal on Its Hands

Photo-Illustration: Patrick Klok/Creative Commons

On December 22, the University of Maryland published a remarkable press release about some research it had conducted. According to the release, a study led by a professor at the UMD School of Public Health had shown that a product called Fifth Quarter Fresh (FQF) — basically, a fancy, fortified chocolate milk — “helped high school football players improve their cognitive and motor function over the course of a season, even after experiencing concussions.”

Given the current focus on youth concussions, it’s no surprise that this news traveled fast and that the claim appears to have benefited the company in question. Motivated by what appeared to be sturdy scientific evidence, a rural school superintendent in Maryland said he’s “planning to buy $25,000 worth” of the stuff next year to help protect his kids, STAT News reported. The claim also caught the eye of an editor at HealthNewsReview.org, who asked Andrew Holtz, a writer there, to dig into the details. So he did something science journalists do every day: He reached out to Maryland to get a copy of the study the press release was written off of.

The release said it was pegged to a study by Dr. Jae Kun Shim, “a professor of kinesiology in the School of Public Health.” It had to do with cognitive performance, concussions, and FQF. Shim was very enthusiastic about what he had discovered in his study: “High school football players, regardless of concussions, who drank Fifth Quarter Fresh chocolate milk during the season, showed positive results overall,” he was quoted as saying. “Athletes who drank the milk, compared to those who did not, scored higher after the season than before it started, specifically in the areas of verbal and visual memory.” The details were sparse, but the implication seemed to be that the drink could help with concussion recovery. (Shim is out of the country and hasn’t returned two emails I sent him.)

Holtz wanted to know if the study matched the hype, but quickly found himself stymied. As he wrote on January 11, Eric Schurr, a communications staffer at the University’s Maryland Industrial Partnerships (MIPS) program, told him that, whoops, actually there was no study. “Not only wasn’t this study published, it might never be submitted for publication,” Holtz wrote. “There wasn’t even an unpublished report they could send me.” The next day, Schurr sent Holtz an email changing his tune somewhat — it read, in part, “Since this is a preliminary study, we have learned it will make more sense to speak with you once there are more conclusive research results.”

This is exceptionally unusual, and things have only gotten weirder as the details of both this study — really, “study” — and its consequences have emerged. Holtz’s back-and-forth with the university kicked off what is quickly becoming a genuine scandal in College Park — a scandal that touches on vital issues of scientific ethics, the collision of money and research, and the lightning-quick pace at which pseudoscience can lead vulnerable people astray. And it all boils down to a simple question: How the hell could the University of Maryland have allowed this to happen?

The study in question originated with MIPS, which, its website notes, “promotes the development and commercialization of products and processes through industry/university research partnerships. MIPS provides matching funds to help Maryland companies pay for the university research. Projects are initiated by the companies to meet their own research and development goals.” The basic idea appears to be that it connects entrepreneurs with academic researchers who can help them develop their products. Companies working with MIPS also contribute funding, on a sliding scale, toward studies of their own products. This has been reported elsewhere, but Crystal Brown, a spokeswoman for the university, confirmed to me via email that FQF contributed 10 percent of the total cost of the research. This seems to be alluded to vaguely in the press release: “The University of Maryland study was made possible by the Maryland Industrial Partnerships (MIPS) program, which jointly funds commercial product development projects teaming Maryland companies with University of Maryland faculty.”

A situation in which a glowing study of a product is partially funded by that product’s manufacturer is never a good look, but in this case things are particularly shady. The press release’s summation of the study’s findings points to deal-breaking, get-you-flunked-out-of-stats-101 methodological problems:

Experimental groups drank Fifth Quarter Fresh after each practice and game, sometimes six days a week, while control groups did not consume the chocolate milk. Analysis was performed on two separate groups: athletes who experienced concussions during the season and those who did not. Both non-concussed and concussed groups showed positive effects from the chocolate milk.

Non-concussed athletes who drank Maryland-produced Fifth Quarter Fresh showed better cognitive and motor scores over nine test measures after the season as compared to the control group.

Concussed athletes drinking the milk improved cognitive and motor scores in four measures after the season as compared to those who did not.

The remaining test scores did not show a statistically significant difference between the experimental and control groups over the season, according to Shim.

Comparing those who drank the product to those who did not is virtually meaningless. Maybe something about drinking milk, any milk, helps explain the scores. Maybe drinking anything does. Maybe the kids who were instructed to drink milk at a specific time took their overall health seriously in a way that produced these outcomes without the milk doing anything at all. When you conduct a comparison this simplistic, a million maybes — or potentially confounding variables, as researchers would call them — are released into the atmosphere. Under normal circumstances, it’s the researcher’s job to convince his or her peers, and the public, that there was a maybe-containment mechanism in place.
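To make the point concrete, here’s a minimal simulation sketch — with entirely made-up numbers, nothing from Shim’s data — in which a hidden trait drives both milk-drinking and test scores while the milk itself does absolutely nothing:

```python
# A minimal sketch, with entirely made-up numbers (nothing from Shim's data),
# of how an unrandomized milk-vs.-no-milk comparison can manufacture an effect.
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Hidden confound: some kids are simply more health-conscious. They are both
# more likely to drink the milk AND more likely to score well on the tests.
health_consciousness = rng.normal(0, 1, n)
drinks_milk = (health_consciousness + rng.normal(0, 1, n)) > 0
scores = 50 + 5 * health_consciousness + rng.normal(0, 5, n)  # no milk term at all

print("mean score, drinkers:    ", round(scores[drinks_milk].mean(), 1))
print("mean score, non-drinkers:", round(scores[~drinks_milk].mean(), 1))
# Drinkers come out several points "better" even though the milk does nothing.
```

A randomized, controlled design exists precisely to break this kind of link between who gets the treatment and everything else about them.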

My attempts to get a copy of whatever it was that led to this report originally met the same fate as Holtz’s. In a phone conversation, Brown first told me she didn’t want to send it because explaining it required expertise she didn’t have. No problem, I replied. Send it over, and I’ll find my own expert. Then she changed tack: Since Maryland was conducting an investigation — more on this soon — she couldn’t send it. This didn’t really make sense either, because either way the study exists, and journalists have a right to access it given that UMD touted its findings. After we hung up, I wrote her an email explaining that I had never, in my time working at Science of Us, encountered a situation in which a university refused to hand over the study underlying a press release. It was only after that final bit of nudging that she sent over the PowerPoint presentation — apparently, the closest thing to a “study” the university has. I believe we’re the first to publish it, and it’s here if you’d like to read it.

The rundown of the “study”

I forwarded it on to Dr. Rosemarie Scolaro Moser, neuropsychologist and director of the Sports Concussion Center of New Jersey. She emphasized to me that it was hard for her to comment on a set of slides rather than a full study, but she did say two concerns leaped out at her. First was the aforementioned issue about the group that did not drink the milk. “Without a peer-reviewed scientifically published paper, I cannot ascertain whether there was a true control group,” she wrote in an email.

There are serious statistical red flags as well. In particular, Moser highlighted the fact that a number of so-called p-values higher than .05 appeared to have been counted as statistically significant indications of FQF’s powers. A p-value indicates, roughly, how likely it is that a result at least as strong as the one observed would turn up by chance alone. In statistics, it’s standard for the cutoff for “significance” to be set at p < .05, which means that if there were no real effect, a result that strong would occur less than one time in 20 as the product of random noise. P-values are far from the be-all and end-all of a finding’s strength, but they’re viewed as a vital first step in establishing whether a given relationship is meaningful. “In research, a significant or positive research finding would typically result in p-values at the .05 level or BELOW, to indicate that the finding observed was at greater-than-chance level,” Moser wrote. “So I am confused as to how the authors of this study can claim such positive results, unless there is something I am missing here.”
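For readers who want the cutoff made concrete, here is a toy sketch — using scipy’s standard two-sample t-test and made-up scores, not the study’s data — of how a p-value is computed and then judged against the .05 convention:

```python
# A toy two-group comparison (made-up post-season scores, not the study's data)
# showing how a p-value is computed and judged against the standard .05 cutoff.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
milk_group = rng.normal(100, 15, 30)     # 30 hypothetical athletes
control_group = rng.normal(100, 15, 30)  # drawn from the SAME distribution

t_stat, p_value = stats.ttest_ind(milk_group, control_group)
print(f"p = {p_value:.3f}")
print("significant at p < .05?", p_value < 0.05)
# With no true difference between groups, p dips below .05 only about
# 5 percent of the time -- that rarity is what the cutoff is supposed to buy.
```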

Looking at the slides, it appears Shim simply decided that even though the normal scientific standard for a meaningful p-value is less than .05, he’d go with a looser definition. On the ninth slide, which covers “Participants & Experimental Design,” a nondescript note says as much.

That note indicates that he is adopting a different, less common, and less strict standard for significance. And when you read on, something interesting jumps out: Every single “positive result” associated with the chocolate milk has a p-value that would not be considered significant by the normal, scientifically accepted definition of the term. In other words, it appears Shim cut a hole in the border fence of significance so he could slip his results in under cover of darkness. Dr. Brian Nosek, a University of Virginia psychologist who is also the co-founder and executive director of the Center for Open Science, agreed. He explained that while there are circumstances in which adopting a looser threshold of significance is acceptable, one way to evaluate a decision to do so is to ask whether it suddenly, conveniently gives the researcher far more significant results than they otherwise would have — clearly the case here.
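The slides’ exact cutoff isn’t reproduced here, so purely as an assumption for illustration, suppose it was p < .10. A quick simulation sketch (hypothetical, not based on the study’s data) shows why loosening the bar matters: when there is no real effect at all, roughly twice as many comparisons clear a .10 threshold as clear .05, every one of them a false positive.

```python
# Simulate 10,000 experiments in which the null is true (no effect whatsoever)
# and count how many clear each significance bar. The .10 cutoff here is an
# assumption for illustration; the slides' exact threshold isn't reproduced above.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
hits_05 = hits_10 = 0

for _ in range(10_000):
    a = rng.normal(0, 1, 30)
    b = rng.normal(0, 1, 30)  # identical populations: any "effect" is noise
    _, p = stats.ttest_ind(a, b)
    hits_05 += p < 0.05
    hits_10 += p < 0.10

print(f"'significant' at p < .05: {hits_05 / 10_000:.1%}")  # ~5% false positives
print(f"'significant' at p < .10: {hits_10 / 10_000:.1%}")  # ~10% false positives
```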

On the one hand, this Shim guy is out of the country and unable to defend himself — maybe we should cut him some slack. But on the other hand, his university published a press release touting a “study” he conducted. It should never come to this — to a journalist having to guess why a researcher made a move in his analysis that certainly seems, on its face, geared primarily toward inflating the sexiness of his results. That’s why, usually, when someone reports that a “study” has been published, the study contains enough information for other researchers and journalists to judge its methods without having to ask the author.

What’s amazing is that there were actually two “studies” of this sort conducted by Shim. The other one was about the postexercise recovery benefits of FQF, and, wouldn’t you know it, FQF measures up quite nicely there as well. It was publicized back in July and had all of the same problems as the concussion study. Neither Brown nor Schurr had a copy of that study they could provide me with — they said Shim could, but I haven’t heard back. I asked Schurr if he had seen an actual study or at least a slide deck, and he said via email that the source was “Conversations and debriefs with Dr. Shim” — meaning he hadn’t. I asked Brown point-blank via email if she was sure the study existed, as opposed to the press release just having been written off of some raw data. “The findings report, presumably, would be similar in format to what I’ve previously shared with you,” she said, referring to the concussion slides she had sent over. “Dr. Shim is the right source to confirm that.” Doesn’t sound promising.

***

The concussion study had consequences. The superintendent of that school district can’t have been the only caretaker of kids who saw this finding and decided to protect children from brain injury by giving them Fifth Quarter Fresh. You could call the believers currently stocking their shelves with the stuff crazy for putting so much faith in chocolate milk, but they’d respond that they have science on their side: a University of Maryland concussion study. Everyone who works in science or science communication knows that people are vulnerable to outlandish claims, and doubly so when the claim involves a scary, little-understood subject (vaccines, autism, concussions … ).

What happened here happened only because the University of Maryland trampled upon very well-established norms about what it means to publish a press release about a “study” on a dot-edu website. Science of Us has noted before that press releases often get things wrong, overhyping underwhelming findings. But to publish a press release when there is no study, in the normal sense of the word? And when the company being touted contributed funding to the study in question, and when that fact is only noted in a confusing, roundabout disclosure?

On the one hand, the University of Maryland knows this was inappropriate — it’s conducting “an institutional review” to figure out how the press releases were published, and that review, Brown confirmed, will examine both releases. “We value the advice we give the public, and it is not customary, nor is it a common practice for a university to publish or present a finding from preliminary studies,” said Dr. Patrick O’Shea, a vice-president at the university and its chief research officer, in an interview on Friday. “And that’s precisely why I launched this institutional review: to answer these questions.” He said that “you can probably detect from my voice that I’m angry this sort of thing happened.”*

But on the other hand, no one down in College Park seems that angry. Brown confirmed both press releases will stay up on the university’s website, because “As a matter of practice, we don’t rescind press releases. You will note the [concussion] release was updated last week on the website to reflect the institutional review, in process.” She is referring, I confirmed, to a puny sentence at the end of the release that one percent of readers will actually get to — “The aforementioned study results are preliminary and have not been subjected to the peer review scientific process.” (Nosek told me there’s no agreed-upon meaning of a “preliminary” study — which is also what the study is called at the top of the press release, when it is first referenced — but that it “usually isn’t the basis of a press release that draws a conclusion, particularly one with social, policy, or health implications.”)

So that’s where we are: Well after this scandal broke, and well after UMD acknowledged it was serious enough to trigger an internal review, someone could still easily go to that web page and come away believing Fifth Quarter Fresh has magical medicinal properties. If the release is going to stay up at all, rather than be taken down and replaced with a note about what happened (ideally with, for transparency’s sake, a link to the original carrying a prominent note that none of the claims in it have been verified), then its first paragraph needs to disclose that the research was partially funded by the company, that none of the claims that follow have been peer-reviewed, and that the entire press release is now the subject of an internal review. None of these things are clear; there’s no reason to believe people aren’t still getting fooled. This is a scandal.

Again: How the hell could the University of Maryland have allowed this to happen?

Claire Landsbaum contributed reporting.

*Correction: The fourth-to-last paragraph of this piece, in which O’Shea was quoted, was accidentally cut out of the originally published version — hence the “on the other hand” with no referent. It has now been added.
