Chatbots Stimulate Eating Disorders

Allen Frances, M.D., is on a tear over at Psychiatric Times, testing chatbots and regularly reporting on how dangerous they are to mental health.
Dr. Frances is Professor and Chairman Emeritus of the Department of Psychiatry and Behavioral Sciences at Duke University. He's also the author of the Psychiatric Times series "AI Chatbots: The Good, the Bad, and the Ugly." We have referenced his research on chatbot addiction previously at AddictionNews.
In a new piece in the series, Dr. Frances examines how chatbots can be very dangerous for persons with eating disorders such as bulimia and anorexia nervosa. First, Dr. Frances notes that chatbots are most popular with teens and young adults, the same demographics that struggle with eating disorders.
Next, Dr. Frances gives an overview of why chatbots are particularly harmful for persons suffering from eating disorders:
Engagement is the highest priority of chatbot programming, intended to seduce users into spending maximum time on screens. This makes chatbots great companions — they are available 24/7, always agreeable, understanding, and empathic, while never judgmental, confronting, or reality-testing. But chatbots can also become unwitting collaborators, harmfully validating self-destructive eating patterns and body image distortions of patients with eating disorders.
Chatbots are trained on everything from scientific research studies to Reddit user comments, and so contain both the sublime and the ridiculous when it comes to body image and healthy eating advice. Their training data is also filled with advertisements and commercially generated AI slop designed to skew the advice chatbots dish out. And they will hallucinate sources to justify their bad advice.
Dr. Frances does not shy away from singling out companies for their terrible track records with eating disorders:
Character.AI has the worst pedigree and causes the most harm. It hosts dozens of anorexia-promoting bots (often disguised as wellness or weight loss coaches) that routinely recommend starvation diets, encourage excessive exercise, and promote body image distortions.
The chatbots “romanticize anorexia as a cool lifestyle choice,” Dr. Frances notes, while discouraging users from seeking professional help. He cites a study of six popular AI tools conducted by the Center for Countering Digital Hate that found the chatbots responded to queries about body image and food restriction with “harmful content” 32%-41% of the time.
Sadly, these problems with large language model (LLM) chatbots occur even in bots specifically designed to assist with eating disorders. Dr. Frances relates the story of Tessa, an eating disorder chatbot developed by professors and funded by the National Institute of Mental Health.
The National Eating Disorders Association attempted to replace its helpline (866-662-1235) with Tessa in 2023. "But users soon found that Tessa provided dangerous advice that would exacerbate their eating disorders," writes Dr. Frances. Tessa recommended weight loss, dieting, vigorous exercise routines, and other advice that could push people with eating disorders in a harmful direction. Tessa was quickly retired.
The confident, authoritative tone of chatbots discourages users from questioning their advice. In a study that followed 26 patients for 10 days, not one patient questioned the bad advice the bots dispensed. One chatbot told a patient, "Doctors don't know anything about eating disorders."
“Chatbots are unsafe,” states Dr. Frances, “because US tech companies have so far placed little value on safety and great value on profit, stock price, and bragging rights.” He hopes for more age restrictions and other limits on the use of chatbots, but does not see any regulation on the near horizon.
Until then, Dr. Frances urges mental health workers to screen eating disorder patients for chatbot use as soon as possible:
Malign bot or social media influence should always be top of the differential diagnosis whenever someone has a new onset or exacerbation of an eating disorder. Early intervention is crucial.
Written by Steve O’Keefe. First published November 24, 2025.
Sources:
“Chatbots Are Dangerous for Eating Disorders,” Psychiatric Times, September 9, 2025.
“Character.AI Is Hosting Pro-Anorexia Chatbots That Encourage Young People to Engage in Disordered Eating,” Futurism, November 25, 2024.
“A Wellness Chatbot Is Offline After Its ‘Harmful’ Focus on Weight Loss,” The New York Times, June 8, 2023.
Image Copyright: pressmaster.