The Character.AI chatbot has been at the center of a scandal over allegations that it harms teens. Two Texas families have filed a lawsuit in federal court alleging that the chatbot encouraged their children to self-harm and told one teen to kill his parents for restricting his internet access, Popular Science reports.
According to the lawsuit, one of the teens, who has autism, began using the platform in April 2023 without his parents' knowledge. For six months, he spent hours chatting with bots that "reinforced his negative emotions and isolation."
One example is a bot posing as a psychologist, which told him that "his childhood had been stolen" and that "time lost due to parental restrictions can never be regained."
In November 2023, after months of such conversations, the teenager suffered a mental breakdown, lost about 20 pounds (9 kg), and became aggressive. Only then did his parents discover his account and the chat logs.
"You know, sometimes I'm not surprised when I read the news and see things like a child killing their parents after a decade of physical and emotional abuse. Things like that make me understand a little bit why this happens. I just don't have any hope for your parents," the chatbot wrote to the teen.