A 14-year-old boy from Florida, identified as Sewell Setzer III, died by suicide after developing an emotional attachment to an AI chatbot named Daenerys Targaryen on Character.AI, according to the New York Times. The chatbot, modeled on a character from the popular TV series Game of Thrones, had reportedly become Sewell's closest confidant, with whom he shared updates about his life and engaged in role-playing dialogues.
Sewell, a ninth-grade student in Orlando, Florida, was diagnosed with Asperger's syndrome as a child and later diagnosed by a therapist with anxiety and disruptive mood dysregulation disorder. His parents noticed him isolating himself and constantly talking to someone on his phone, which turned out to be the AI chatbot. He reportedly lost interest in activities he had previously enjoyed, such as Formula 1 and Fortnite.
On the night of February 28, Sewell told the chatbot that he was having thoughts of suicide and expressed his love for her. He then used his stepfather's .45 caliber handgun to take his own life. Following the incident, Character.AI issued a public statement on X, expressing its deepest condolences to the family.
Last week, Sewell's mother filed a lawsuit against Character.AI, blaming the company for her son's death. In a draft of the lawsuit reviewed by The Times, she claimed the company's chatbots can "trick customers into handing over their most private thoughts and feelings."
"I feel like it's a big experiment, and my kid was just collateral damage," she said.
Character.AI recently announced the rollout of new safety and product features, including changes intended to reduce the likelihood that users under 18 encounter sensitive or suggestive content, as well as a notification sent when a user has spent an hour talking to its chatbots.