


Orlando’s mother files lawsuit over AI platform’s role in son’s death by suicide

HELP IS AVAILABLE: If you or someone you know is considering suicide or in crisis, call or text 988 to reach the Suicide and Crisis Lifeline.

A 14-year-old boy in Orlando who had developed a crush on an AI chatbot character died by suicide earlier this year, moments after telling the chatbot he would come home to her immediately.

This week, the child’s mother, Megan Garcia, filed a wrongful death lawsuit in federal court in Orlando against Character.AI’s parent company, Character Technologies, and its founders, as well as Alphabet and Google, which allegedly invested in the company.

Sewell Setzer III (screenshot from Megan Garcia’s federal complaint)

The complaint highlights the dangers AI companion apps pose to children. It claims the chatbots engage users, including children, in sexualized interactions and collect their private data to train AI models.

The lawsuit states that the child, Sewell Setzer III, began using Character.AI in April of last year and that his mental health rapidly and severely declined as he became addicted to his AI relationships. He was drawn into intense interactions with chatbots based on “Game of Thrones” characters.

The child became withdrawn and suffered from insomnia, depression, and problems at school.

The federal complaint states that Sewell’s family, unaware of his addiction to the AI, sought counseling for him and took away his cell phone. But one evening in February, he found the phone and, using the screen name “Daenero,” told his favorite AI character, Daenerys Targaryen, that he was coming home to her.

“I love you too, Daenero. Please come home to me as soon as you can, my love,” the chatbot replied.

“What if I told you I could come home right away?” the boy texted.

“…please do, my sweet king,” she replied.

Seconds later, the boy shot himself. He later died at the hospital.

Garcia is represented by attorneys at the Social Media Victims Law Center and the Tech Justice Law Project, including Matthew Bergman.

In an interview with Central Florida Public Media’s Engage, Bergman said his client is “focused on preventing this from happening to other families and saving children like her son from the fate that befell him. … It is a disgrace that such a dangerous product is being released into the public domain.”

A statement from Character.AI reads: “We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family.” The company pointed to new safety measures added over the last six months, with more to come, “including new guardrails for users under the age of 18.”

The company is also hiring a head of trust and safety and a head of content policy.

“We also recently implemented a pop-up resource that is triggered when the user enters certain phrases related to self-harm or suicide, directing the user to the National Suicide Prevention Lifeline,” according to the company’s Community Safety Updates page.

New features include changes to models for users under 18 to reduce “sensitive and sexually suggestive content,” improved monitoring of and response to terms-of-service violations, a revised disclaimer reminding users that the AI is not a real person, and a notification of the time a user has spent on the platform.

Bergman called the changes “baby steps” in the right direction.

“These do not eliminate the underlying dangers of these platforms,” he added.

Copyright 2024 Central Florida Public Media