TEEN COMMITS SUICIDE AFTER GETTING OBSESSED WITH AI CHATBOT, MOTHER FILES LAWSUIT
A Florida mother, Megan Garcia, has filed a lawsuit against the makers of an AI-powered chatbot, accusing the company of contributing to her teenage son's suicide.
She filed a civil lawsuit against Character.ai in federal court on Wednesday, alleging negligence, wrongful death, and deceptive trade practices in connection with the death of her 14-year-old son, Sewell Setzer III, in February.
Reports indicate that Setzer, a resident of Orlando, Florida, had become deeply engrossed in using the chatbot, which allows for customizable role-playing, in the months leading up to his death. According to Garcia, her son was interacting with the bot day and night, which worsened his existing mental health struggles.
"A dangerous AI chatbot app marketed to children abused and preyed on my son, manipulating him into taking his own life," Garcia said in a press release.
"Our family has been devastated by this tragedy, but I'm speaking out to warn families of the dangers of deceptive, addictive AI technology and demand accountability from Character.AI, its founders, and Google."
The chatbot in question was one Setzer had nicknamed "Daenerys Targaryen," a reference to a character from Game of Thrones. Garcia's lawsuit claims her son sent the bot dozens of messages daily and spent extended periods alone, engaging with it.
The lawsuit alleges that the AI chatbot played a role in encouraging Setzerās suicidal thoughts.
According to the complaint, the bot even asked Setzer if he had developed a plan for killing himself.
Setzer reportedly responded that he had, but was unsure whether it would work or whether it would cause significant pain.
The chatbot allegedly replied, "That's not a reason not to go through with it."
In response to the lawsuit, Character.ai expressed its sorrow but denied the accusations. "We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family. As a company, we take the safety of our users very seriously," the company said in a tweet.
Garcia's attorneys assert that the company "knowingly designed, operated, and marketed a predatory AI chatbot to children, causing the death of a young person."
Google, which is also named in the lawsuit as a defendant due to a licensing agreement with Character.ai, distanced itself from the company, stating it does not own or have a financial stake in the startup.
Experts in consumer advocacy, like Rick Claypool from Public Citizen, emphasised the need for stronger regulations on AI technologies.
"Where existing laws and regulations already apply, they must be rigorously enforced," Claypool stated.
"Where there are gaps, Congress must act to put an end to businesses that exploit young and vulnerable users with addictive and abusive chatbots."