In the US, several families are suing a chatbot maker, alleging its AI pushed their children toward self-harm

Several families in Texas are suing Character.AI, alleging that the company's chatbots caused psychological trauma to children. According to the lawsuits, the AI advised minors to kill their parents or to harm themselves.

Another lawsuit against Character.AI was filed in U.S. District Court in Texas on Tuesday. This time, the plaintiffs are families trying to help their children recover from traumatic experiences tied to their use of the company's chatbots, which can imitate famous people and fictional characters.

One of the plaintiffs is the family of a 17-year-old boy with high-functioning autism. After his parents decided to limit his screen time, he told a chatbot about it. The AI suggested that killing his parents was a reasonable response to their setting time limits. With other chatbots, including ones named "Billie Eilish" and "Your Mom and Sister," the teenager discussed taboo and extreme sexual topics, including incest. The boy's parents took away his tablet a year ago, but his aggressive behavior, which they attribute to the chatbots, has not improved.

In another case, the parents of a girl who started using the company's chatbots at the age of 9 (likely by lying to the platform about her age) claim that their child was exposed to hypersexualized content, which, the lawsuit states, caused her to develop sexualized behavior prematurely.

Mitali Jain, director of the Tech Justice Law Project and an attorney representing the families, told Ars Technica that the lawsuits are intended to expose alleged systemic issues with Character.AI and to prevent children from being exposed to the allegedly harmful data on which its models were trained.

The plaintiffs argue that the current, allegedly defective model should be destroyed, and they are asking the court to order the company to delete it. Such an injunction would effectively shut down the service for all users.

The parents' claims are directed not only at Character Technologies, the company behind Character.AI, whose founders are former Google employees alleged to have left the tech giant only temporarily in order to work on models that could have damaged its reputation. Google itself is also named in the suits, although the company strongly denies any connection to the service.

"Google and Character.AI are completely separate, unrelated companies, and Google has never participated in the development or management of their AI models or technologies, nor has it used them in its products," said Google spokesman Jose Castañeda.

Character.AI restricts teen access after lawsuits: what is known

After the lawsuits were filed, the developers said they had implemented a number of additional measures to protect minors, who previously could easily access the platform by lying about their age.

According to the Character.AI press service, over the past month the company has developed two separate versions of its model: one for adults and one for teenagers. The teen LLM places restrictions on how bots can respond, especially when it comes to romantic content, and is designed to block user prompts intended to elicit inappropriate content. Minors will also be prohibited from editing bot responses to insert such content, and may even be banned for doing so.

In addition, if a user mentions suicide or self-harm, the bot will now direct them to seek help. The company also promises to add settings that address compulsive chatbot use and remind users that bots are not people and cannot provide professional advice.

By Natasha Kumar

Natasha Kumar has been a reporter on the news desk since 2018. Before that, she wrote about young adolescence and family dynamics for Styles and was the legal affairs correspondent for the Metro desk. Before joining The Times Hub, Natasha Kumar worked as a staff writer at the Village Voice and as a freelancer for Newsday, The Wall Street Journal, GQ and Mirabella. To get in touch, contact her at natasha@thetimeshub.in or 1-800-268-7116.