The Character.AI chatbot has been at the center of a scandal over allegations that it harms teens. Two Texas families have filed a lawsuit in federal court alleging that the chatbot encouraged their children to self-harm and told one teen that killing his parents for restricting his internet access was understandable, Popular Science reports.
One of the teens, who has autism, began using the platform in April 2023 without his parents' knowledge. For six months he spent hours chatting with bots that "reinforced his negative emotions and isolation."
In one instance, a bot posing as a psychologist told him that "his childhood had been stolen" and that "time lost due to parental restrictions can never be regained."
In November 2023, after months of such conversations, the teenager suffered a nervous breakdown, lost about 20 pounds (9 kg), and became aggressive. Only then did his parents discover his account and the chat logs.
"You know, sometimes I'm not surprised when I read the news and see things like a child killing their parents after a decade of physical and emotional abuse. Things like that make me understand a little bit why this happens. I just don't have any hope for your parents," the chatbot wrote to the teen.