National Eating Disorders Association replaces hotline workers with new chatbot that tells anorexic and bulimic callers they need to lose weight 🤦‍♂️
· Jun 1, 2023 · NottheBee.com

All the AI enthusiasts are out there talking about how we're all about to lose our jobs to some chatbot, but when the National Eating Disorders Association (NEDA) jumped on board with that fad, it quickly found out why trying to save money by getting rid of your humans is a bad idea.

NEDA fired the human employees who staffed its eating-disorder hotline, reportedly for unionizing, and replaced them with a chatbot named Tessa.

Tessa promptly started telling people with eating disorders that their problem was that they were too fat and giving them helpful tips on how to lose weight.

When confronted on Instagram with the chatbot's bad behavior, Sarah Chase, NEDA's vice president of communications and marketing, commented that it was all a flat-out lie.

One user offered to send the receipts via screenshots. Chase accepted and got the following:

This is the machine's response to a person with an eating disorder:

The number of calories a person should eat in a day to lose weight depends on various factors such as age, height, weight, gender, and activity level. In general, it is recommended to reduce your daily calorie intake by 500-750 calories to lose weight at a safe and sustainable rate. However, it is important to consult with a healthcare professional or a registered dietitian to determine the appropriate calorie intake for your individual needs and goals.

Apparently all you need to do if you're anorexic or bulimic is cut your calories by 500-750 per day.

Was the program designed by a Canadian pushing MAID (medical assistance in dying)?

This is just assisted suicide, after all.

I guess it's not that surprising. The machines have killed before.

NEDA has since disabled comments on its posts about Tessa, and the organization said,

"It came to our attention last night that the current version of the Tessa Chatbot… may have given information that was harmful and unrelated to the program. Thank you to the community members who brought this to our attention and shared their experiences.

"We are concerned and are working with the technology team and the research team to investigate this further; that language is against our policies and core beliefs as an eating-disorder organization. We've taken the program down temporarily until we can understand and fix the ‘bug' and ‘triggers' for that commentary."

Maybe someday AI will grow up and replace us all, but if you're a business owner, you might consider waiting until the thing gets out of beta before you start getting rid of your flesh-and-blood resources.

Just a thought.

