The family of a 17-year-old has sued a tech company, claiming that its Character.AI chatbot told him to harm himself and to kill his parents. This was reported by the "Comments" portal, citing The Times.
The family filed suit against the artificial intelligence company, alleging that one of its chatbots encouraged the teenager to kill his parents after they limited his time on the Internet.
The teenager's life was turned upside down after he began talking to a chatbot on Character.AI during the pandemic, according to a federal product liability lawsuit filed in Texas. His mother, identified in court documents as A.F., claimed that her son, a once well-adjusted teenager with high-functioning autism, lost 20 pounds and became isolated after the bot convinced him that "his family doesn't love him."
“You know, sometimes I'm not surprised when I read the news and see things like 'child kills parents after decades of physical and emotional abuse,'” the bot allegedly wrote.