Group Replaces Hotline With Chatbot; Chatbot Pulled Over Bad Advice
It’s a development that may vindicate anyone worried about the job-killing effects of artificial intelligence tools. As the BBC reports, the US National Eating Disorders Association (NEDA) had to take down its AI chatbot, ‘Tessa,’ after it started recommending potentially harmful dietary strategies to people with eating disorders. This happened just a week after NEDA chose to use the bot in place of its direct, human-operated helpline. The group announced the problem with Tessa in an Instagram post, per Fortune. “It came to our attention … that the current version of the Tessa Chatbot … may have provided information that was harmful,” the post said. “We are investigating this immediately and have taken the program down pending a full investigation.”
As NPR reported Wednesday, NEDA switched to AI after running its direct helpline for people suffering from anorexia, bulimia, and other eating disorders for more than two decades. The nonprofit reportedly notified helpline staff that they would be let go less than a week after the staff had formed a union. NEDA said the shift had nothing to do with employee unionization and everything to do with a significant increase in calls and text messages to the hotline during the COVID-19 pandemic. That surge in volume, according to NEDA management, meant increased liability, and therefore “the pivot to expanded use of AI-assisted technology.”
As for Tessa’s misbehavior, CNN reports that NEDA CEO Liz Thompson blamed “bad actors” who purposefully tried to get the chatbot to give users harmful or simply unrelated advice. Before the bot’s problems were made public, former helpline employees tweeted a statement in which they said chatbots cannot “replace human empathy, and we believe this decision will cause irreparable harm to the eating disorder community.” (Read more stories about artificial intelligence.)