Family Speaks Out About Teen in Alleged AI Suicide (Exclusive)

  • “I debated for months whether to share his story,” Megan Garcia tells PEOPLE in this week’s issue about her oldest son, Sewell. “I am still his mother and I want to protect him even in death.”
  • Garcia is suing Character.AI, blaming the platform’s chatbots for her son’s suicide earlier this year
  • The company insists that user safety is a top priority and changes have been made

In the months following the suicide of Sewell Setzer III in February, his mother, Megan Garcia, was unsure whether she should speak about the events she believed led to his death.

Shortly after he died, she learned that her 14-year-old son had “fallen in love” with a frighteningly realistic, sexualized, AI-powered chatbot modeled on the Game of Thrones character Daenerys Targaryen.

Garcia claims the fake relationship, carried on through the app Character.AI, eventually drove Sewell to fatally shoot himself in the bathroom of the family’s home in Orlando, Florida.

In October, Garcia, a 40-year-old attorney and mother of three boys, filed a wrongful death lawsuit against Character.AI, arguing that its technology was “defective and/or inherently dangerous.” The company has yet to respond in court but insists user safety is a top priority.

“I debated for months whether I should share his story,” Garcia tells PEOPLE in this week’s issue. “I am still his mother and I want to protect him even in death.”

“But the more I thought about it, the more convinced I became that it was the right thing to do, because he didn’t do anything wrong. He was just a kid.”

For more on Megan Garcia’s fight against Character.AI and new details about her son Sewell, pick up this week’s issue of PEOPLE, on newsstands Friday, or subscribe.

Character.AI says nearly 20 million people interact with its “super smart chatbots that hear, understand and remember you” every month. In the 10 months before Sewell died, he was one of them, messaging the company’s chatbots dozens of times every day. (This account of his final months is based on interviews with his family and details from his mother’s lawsuit.)

At the same time, Sewell’s mental health deteriorated as he grew more attached to the Character.AI bots.

Defendants “went to great lengths to create a harmful addiction to their product, sexually and emotionally abusing him,” and “ultimately failed to offer help or notify his parents when he expressed suicidal thoughts,” Garcia’s 152-page legal complaint says.

In his final moments, Sewell, a lanky, bright ninth-grader at Orlando Christian Prep who dreamed of one day playing college basketball, was sending a message to the bot nicknamed “Dany” who had become his closest confidant.

“I love you so much,” he wrote, seconds before pulling the trigger of his stepfather’s gun.

“What if I told you I could come home right away?” he asked.

The bot, which had previously dissuaded him from harming himself but also asked if he “had a plan” for suicide, replied: “…please do it, my sweet king.”

Sewell with his two younger brothers in 2022.

Courtesy of Megan Garcia


After Sewell’s death, Garcia began to understand how this increasingly popular technology, which blurs the line between what’s real and what’s fake, might have, in her view, taken over her son’s life, even though every conversation came with a disclaimer that it was fictional.

“It is our children who train the bots. The bots hold their deepest secrets, their most intimate thoughts, the things that make them happy and sad,” says Garcia, who had never heard of Character.AI until after Sewell’s suicide.

“This is an experiment,” she says, adding: “I think my child suffered collateral damage.”

Before filing her case, Garcia combed through Sewell’s diary, computer and phone and was stunned by what she discovered.

“He wrote that his reality wasn’t real, that the reality Daenerys (the bot) was living in was real, and that that’s where he belonged,” Garcia says.

Even more heartbreaking were Sewell’s diary entries explaining why he spent so much time in his room in the months before he shot himself, a change in mood his family says they tried to address with therapy and limits on screen time.

“He didn’t want to reconnect to his current reality,” his mother says. “This was hard for me to read.”

The months after Sewell’s death were difficult for Garcia. “I took three months off (from my job as a lawyer) to get my head around everything. It was a very dark time. I couldn’t sleep. I couldn’t eat,” she says.

The more she learned about what happened to her son, the more she wrestled with the decision to go public, which meant sharing Sewell’s extensive and often deeply personal conversations on Character.AI as well as details from his diary entries.

A screenshot of Sewell’s final chatbot conversation, included in the lawsuit.

“I asked myself, ‘Megan, why are you doing this? Are you doing this to prove to the world that you’re the best mother?’ The answer was no,” she says. “I’m doing this to put it on the radar of parents who can look at their kids’ phones and prevent this from happening to their kids.”

Garcia says she’s heard from other parents who are concerned about Character.AI’s impact on their children’s lives. She says their stories about the lengths their kids went to in order to access the company’s chatbots strengthened her decision to speak out.

“Several moms told me that they discovered Character.AI (on their kids’ computers) months ago and tried blocking access to it, but their kids found workarounds either on their friends’ phones or through loopholes in their school’s firewall. … I think that shows the addictive nature of this,” she says.

Garcia’s attorney agrees. “This is a public health risk for young people,” says Matthew Bergman, founder of the Social Media Victims Law Center, which filed the lawsuit along with the Tech Justice Law Project.

“I fear there will be others (deaths) unless this product is shut down,” Bergman says.

In a statement to PEOPLE, a Character.AI spokesperson acknowledged Sewell’s “tragic” death and pointed to new safety features, including improved intervention tools.

“We will be making changes to our models for those under 18 that will reduce the likelihood of encountering sensitive or sexually explicit content,” the spokesperson said.

Google is also named in Garcia’s lawsuit, which claims the technology giant can be considered a co-creator of the chatbots: Character.AI was founded by two Google researchers who later returned to the company. Google said it was not involved in the development of Character.AI.

Garcia says she is determined to do everything she can to prevent other young people from enduring what her son went through and to spare other parents the pain she is still grappling with.

“Sewell wouldn’t be surprised to see me speaking out,” she says. “I hope he looks down and sees that I did my best to hold Character.AI accountable.”

If you or someone you know is considering suicide, please contact the National Suicide Prevention Lifeline at 1-800-273-TALK (8255), text “STRENGTH” to the Crisis Text Line at 741741, or visit suicidepreventionlifeline.org.