artificial intelligence

Turns Out Chatbots Aren’t Great at Eating-Disorder Counseling

Photo: Getty Images

The rise of AI has spawned a growing number of human dupes, from diverse AI-generated fashion models to virtual boyfriends who don’t age or cheat. By all means, date your computer, but maybe we should think twice before putting chatbots in charge of crisis hotlines? Case in point: The National Eating Disorders Association (NEDA) disabled its new helpline chatbot, “Tessa,” after it gave users with eating disorders advice about restricting calories and pinching their skin folds to measure body fat. “It came to our attention last night that the current version of the Tessa Chatbot, running the Body Positive program, may have given information that was harmful,” the organization wrote on Instagram on Tuesday, adding that an investigation is underway.

The decision to pull Tessa, which had been in place for only a week, was prompted by users posting screenshots and reviews of their chats with the bot. Psychologist and eating-disorder specialist Dr. Alexis Conason shared screenshots of Tessa advising her on how to achieve a “safe daily calorie deficit” for weight loss. Weight-inclusive consultant Sharon Maxwell said Tessa told her to measure herself weekly and to use calipers to determine her body composition, even after she told the bot she suffered from an eating disorder. “If I had accessed this chatbot when I was in the throes of my eating disorder, I would NOT have gotten help,” Maxwell wrote. “If I had not gotten help, I would not still be alive today.”

Eating disorders have risen sharply since the beginning of the pandemic; NPR reports that almost 70,000 people reached out to NEDA’s human helpline last year, often dialing in with “crisis type” calls involving reports of child abuse and suicidal thoughts in addition to disordered eating. Last month, NEDA announced its decision to replace human helpline staff with the chatbot after staffers and volunteers, many of whom reported overwhelming burnout and a lack of organizational support, moved to unionize. NEDA defended its decision as a matter of liability: “Our volunteers are volunteers. They’re not professionals,” a NEDA representative told NPR about the pivot. “They don’t have crisis training. And we really can’t accept that kind of responsibility.”

Responding to the Tessa backlash, NEDA CEO Liz Thompson told The Guardian that the organization is “concerned” and is working with its technology and research teams to investigate the incidents further, adding that the language Tessa used is “against our policies and core beliefs as an eating disorder organization.” While the helpline’s next phase is unclear, experts have cautioned against removing the human element from such a sensitive service. “If I’m disclosing to you that I have an eating disorder, I’m not sure how I can get through lunch tomorrow, I don’t think most of the people who would be disclosing that would want to get a generic link,” Dr. Marzyeh Ghassemi, professor of machine learning and health at MIT, told NPR.
