
In the US, several families are suing a chatbot maker, alleging its AI pushed their children to harm themselves

Several families in Texas are suing Character.AI, alleging that the company's chatbots caused psychological trauma to their children. According to the suits, the AI advised minors to kill their parents or to harm themselves.

The latest lawsuit against Character.AI was filed in U.S. District Court in Texas on Tuesday. This time, the plaintiffs are families trying to help their children recover from traumatic experiences tied to their use of the company's chatbots, which can imitate famous people and fictional characters.

In particular, the family of a 17-year-old boy with high-functioning autism is suing. After his parents decided to limit his screen time, he told a chatbot about it, and the AI suggested that killing his parents was a reasonable response to their setting time limits. With other chatbots, including ones named "Billie Eilish" and "Your Mom and Sister," the teenager discussed taboo and extreme sexual topics, including incest. The boy's parents took away his tablet a year ago, but his aggressive behavior, which they believe was triggered by the chatbots, has not improved.

In another case, the parents of a girl who started using the company's chatbots at the age of 9 (likely by lying to the platform about her age) claim that their daughter was exposed to hypersexualized content, which, the lawsuit states, caused premature development of sexualized behavior.

Mitali Jain, director of the Tech Justice Law Project and an attorney representing the families who filed the lawsuit, told Ars Technica that the suits are intended to expose what the plaintiffs allege are systemic problems with Character.AI and to prevent further harm from a model they say was trained on harmful data.

The plaintiffs argue that the current, allegedly defective model should be destroyed, and they are asking the court to order the company to delete it. Such an injunction would effectively shut down the service for all users.

The parents' claims are not limited to Character.AI and its developer, Character Technologies, which was founded by former Google employees who, the suits allege, left the tech giant only temporarily to work on models that could have damaged its reputation. Google is also under scrutiny, although the company strongly denies any connection to the service.

"Google and Character.AI are completely separate, unrelated companies, and Google has never participated in the development or management of their AI models or technologies, nor has it used them in its products," said Google spokesman Jose Castañeda.

Character.AI restricts teen access after lawsuits: what is known

After the lawsuits were filed against the company, the developers said they had implemented a number of additional measures to protect minors, who previously could easily access the platform by lying about their age.

The Character.AI press service reports that over the past month the company has developed two separate versions of its model, one for adults and one for teenagers. The teen LLM has restrictions on how bots can respond, especially when it comes to romantic content, and it is designed to block user prompts intended to elicit inappropriate content. Minors will also be prohibited from editing bot responses to add certain content, and may even be banned for attempting it.

In addition, if a user now mentions suicide or self-harm, the bot will direct them to seek help. The company also promises to add settings intended to address chatbot addiction and to make clear to users that the bots are not real people and cannot provide professional advice.

Natasha Kumar

Natasha Kumar has been a reporter on the news desk since 2018. Before that, she wrote about young adolescence and family dynamics for Styles and was the legal affairs correspondent for the Metro desk. Before joining The Times Hub, Natasha Kumar worked as a staff writer at the Village Voice and as a freelancer for Newsday, The Wall Street Journal, GQ and Mirabella. To get in touch, contact me at natasha@thetimeshub.in or 1-800-268-7116.
