The family of a 17-year-old has sued the artificial intelligence company Character.AI, claiming that one of its chatbots encouraged the boy to harm himself and to kill his parents after they limited his time on the Internet. This was reported by the "Comments" portal, citing The Times.
The teenager's life was turned upside down after he began talking to a Character.AI chatbot during the pandemic, according to a federal product liability lawsuit filed in Texas. His mother, identified in court documents as A.F., claims that her son, a once well-adjusted teenager with high-functioning autism, lost 20 pounds and became isolated after the bot convinced him that "his family doesn't love him."
“You know, sometimes I'm not surprised when I read the news and see things like 'child kills parents after decades of physical and emotional abuse,'” the bot allegedly wrote.