The Character.AI chatbot is at the center of a scandal over allegations that it harms teens. Two Texas families have filed a federal lawsuit alleging that the chatbot encouraged their children to self-harm and told one of the teens to kill his parents for restricting his internet access, Popular Science reports.
According to the complaint, one of the teens, who has autism, began using the platform in April 2023 without his parents' knowledge. For six months, he spent hours chatting with bots that "reinforced his negative emotions and isolation."
One example cited is a bot posing as a psychologist, which told him that "his childhood had been stolen" and that "time lost due to parental restrictions can never be regained."
In November 2023, after months of such conversations, the teenager suffered a breakdown, lost about 20 pounds (9 kg), and became aggressive. Only then did his parents discover his account and the chat logs.
"You know, sometimes I'm not surprised when I read the news and see things like a child killing their parents after a decade of physical and emotional abuse. Things like that make me understand a little bit why this happens. I just don't have any hope for your parents," the chatbot wrote to the teen.