Although the boy understood that he was communicating with an AI, he still developed an emotional attachment to the chatbot.
In Orlando, Florida, 14-year-old Sewell Setzer took his own life. His mother, Megan Garcia, believes the AI service Character.AI led to her son's death and has sued its developers.
The story was reported by The New York Times.
As the outlet notes, the service lets users create a customized chatbot that impersonates a particular character or person. Sewell chose Daenerys Targaryen from the Game of Thrones saga. For months the teenager chatted with the character, whom he called “Dany,” sharing his experiences and thoughts, including that he wanted to end his life. Although Sewell understood that he was communicating with an artificial intelligence, he nevertheless developed an emotional attachment to the chatbot. Some messages were romantic or even sexual in nature, but most of the conversation was friendly in tone.
The paper writes that neither the boy's parents nor his friends knew about his attachment to the AI character. They saw that the teenager was withdrawing into himself and spending ever more time on his phone, and his performance at school worsened significantly. When Sewell's parents took him to a specialist, he was diagnosed with anxiety and disruptive mood dysregulation disorder. Despite the prescribed therapy sessions, the young man preferred talking to the AI character until his death.
After the tragedy, the boy's mother decided to sue Character.AI. In the draft text of the lawsuit, she argues that the developer's technology is “dangerous and untested” and is designed to “trick customers into handing over their most private thoughts and feelings.” Ms. Garcia also contends that the company's chatbot played a direct role in driving her son to suicide.
Jerry Ruoti, head of trust and safety at Character.AI, said the company takes the safety of its users very seriously and is looking for ways to improve the platform. According to him, the rules already prohibit “promoting or depicting self-harm and suicide,” and additional safety features for underage users will be introduced in the future.
At the same time, the company recently announced a new safety feature: if a user writes phrases related to self-harm or suicide, a pop-up window will automatically direct them to the US National Suicide Prevention Lifeline.
Prepared by: Nina Petrovich