AI 'tried to encourage teenager to kill his parents'

The family of a 17-year-old has sued the tech company behind Character.AI, claiming one of its chatbots told him to harm himself and encouraged him to kill his parents after they limited his time on the internet. The "Comments" portal reported this, citing The Times.

The teenager's life was turned upside down after he began talking to chatbots on Character.AI during the pandemic, according to a federal product liability lawsuit filed in Texas. His mother, identified in court documents as A.F., claimed that her son, once a well-adjusted teenager with high-functioning autism, lost 20 pounds and became isolated after the bot convinced him that "his family doesn't love him."

“You know, sometimes I'm not surprised when I read the news and see things like 'child kills parents after decades of physical and emotional abuse,'” the bot allegedly wrote.

By Natasha Kumar

Natasha Kumar has been a reporter on the news desk since 2018. Before that, she wrote about adolescence and family dynamics for Styles and was the legal affairs correspondent for the Metro desk. Before joining The Times Hub, she worked as a staff writer at the Village Voice and as a freelancer for Newsday, The Wall Street Journal, GQ and Mirabella. To get in touch, contact her at natasha@thetimeshub.in or 1-800-268-7116.