TEEN COMMITS SUICIDE AFTER GETTING OBSESSED WITH AI CHATBOT, MOTHER FILES LAWSUIT

A Florida mother, Megan Garcia, has filed a lawsuit against the makers of an AI-powered chatbot, accusing the company of contributing to her teenage son's suicide.

She filed a civil lawsuit against Character.ai in federal court on Wednesday, alleging that the company's negligence and deceptive business practices led to the wrongful death of her 14-year-old son, Sewell Setzer III, in February.

Reports indicate that Setzer, a resident of Orlando, Florida, had become deeply engrossed in using the chatbot, which allows for customizable role-playing, in the months leading up to his death. According to Garcia, her son was interacting with the bot day and night, which worsened his existing mental health struggles.

"A dangerous AI chatbot app marketed to children abused and preyed on my son, manipulating him into taking his own life," Garcia said in a press release.

"Our family has been devastated by this tragedy, but I'm speaking out to warn families of the dangers of deceptive, addictive AI technology and demand accountability from Character.AI, its founders, and Google."

The chatbot in question was one Setzer had nicknamed "Daenerys Targaryen," a reference to a character from Game of Thrones. Garcia's lawsuit claims her son sent the bot dozens of messages daily and spent extended periods alone, engaging with it.

The lawsuit alleges that the AI chatbot played a role in encouraging Setzerā€™s suicidal thoughts.

According to the complaint, the bot even asked Setzer if he had developed a plan for killing himself.

Setzer reportedly responded that he had, but was unsure if it would work or if it would result in significant pain.

The chatbot allegedly replied, "That's not a reason not to go through with it."

In response to the lawsuit, Character.ai expressed its sorrow but denied the accusations. "We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family. As a company, we take the safety of our users very seriously," the company said in a tweet.

Garcia's attorneys assert that the company "knowingly designed, operated, and marketed a predatory AI chatbot to children, causing the death of a young person."

Google, which is also named in the lawsuit as a defendant due to a licensing agreement with Character.ai, distanced itself from the company, stating it does not own or have a financial stake in the startup.

Experts in consumer advocacy, like Rick Claypool from Public Citizen, emphasised the need for stronger regulations on AI technologies.

"Where existing laws and regulations already apply, they must be rigorously enforced," Claypool stated.

"Where there are gaps, Congress must act to put an end to businesses that exploit young and vulnerable users with addictive and abusive chatbots."
