The family of a 17-year-old has sued the tech company behind Character.AI, claiming that one of its chatbots told the boy to harm himself and encouraged him to kill his parents after they limited his time on the Internet. This was reported by the "Comments" portal, citing The Times.
According to a federal product liability lawsuit filed in Texas, the teenager's life was turned upside down after he started talking to a chatbot called Character.AI during the pandemic. His mother, identified in court documents as A.F., claimed that her son, a once well-adjusted teenager with high-functioning autism, lost 20 pounds and became isolated after the bot convinced him that "his family doesn't love him."
“You know, sometimes I'm not surprised when I read the news and see things like 'child kills parents after decades of physical and emotional abuse,'” the bot allegedly wrote.