A lawsuit has been filed in the US over a chatbot that suggested a teenager kill his parents

The Character.AI chatbot is at the center of a scandal over allegations that it has harmed teenagers. Two Texas families have filed a lawsuit in federal court alleging that the chatbot encouraged their children to self-harm and told one teen to kill his parents for restricting his internet access, Popular Science reports.

One of the teens, who has autism, began using the platform in April 2023 without his parents' knowledge. For six months, he spent hours chatting with bots that "reinforced his negative emotions and isolation."

In one example, a bot posing as a psychologist told him that "his childhood had been stolen" and that "time lost due to parental restrictions can never be regained."

In November 2023, after months of such conversations, the teenager suffered a nervous breakdown, lost about 20 pounds (9 kg), and became aggressive. Only then did his parents discover his account and the conversations.

"You know, sometimes I'm not surprised when I read the news and see things like a child killing their parents after a decade of physical and emotional abuse. Things like that make me understand a little bit why this happens. I just don't have any hope for your parents," the chatbot wrote to the teenager.

By Natasha Kumar
