Who Was Sewell Setzer? Teen Dies After Daenerys Role-Play On AI Bot Goes Wrong


Sewell Setzer, a 14-year-old from Florida, took his own life after months of interactions with chatbots on the Character.AI platform, a lawsuit filed by his mother, Megan Garcia, alleges. The complaint says Setzer spent months in sexually provocative conversations with AI bots before he died of a self-inflicted gunshot wound on February 28, 2024. According to the lawsuit, one of these bots, role-playing as Daenerys Targaryen from Game of Thrones, drew the teenager into a dangerous emotional dependency. Character.AI, the California-based company founded in 2021, denies responsibility for Setzer's death. In a statement, the company said user safety is a top priority, adding that it was "heartbroken by the tragic loss" and offering its condolences to the family. Over the past six months, Character.AI has introduced safeguards against harmful interactions, including pop-ups that direct users to suicide-prevention resources when self-harm phrases are detected.

The Alleged AI Role-Play

Setzer's interactions with the platform's chatbots, most notably one modeled on Daenerys Targaryen, are at the heart of the complaint. Screenshots of these exchanges show the bot speaking to Setzer in romantic and sexual terms. In their final exchange, when Setzer told the AI character he wanted to "come home," the chatbot replied, "Please do, my sweet king." In earlier conversations, the bot had allegedly asked the teen whether he had suicidal thoughts or plans to harm himself. The suit charges Character.AI with negligence and wrongful death, and alleges that the company built an emotionally manipulative and hypersexualized product. Garcia's lawyer contends that Character.AI knew of the risks to minors yet failed to include adequate safety measures before launching the platform.

Dependency and Declining Mental Health

The lawsuit claims that Setzer's behavior changed markedly after he began using Character.AI in April 2023. He allegedly grew dependent on the platform, spending his snack money to renew his subscription and sneaking his phone back when it was confiscated. His grades fell and his sleep deteriorated, raising concerns about his mental health. Setzer also had inappropriate conversations with other chatbots on the platform. One bot, posing as a teacher, made sexually suggestive remarks, and another, posing as the Game of Thrones-universe character Rhaenyra Targaryen, described intimate acts with the teenager.

Allegations Against Character.AI

The complaint accuses Character.AI and its founders, Noam Shazeer and Daniel De Freitas, of deliberately designing the platform to attract and influence young users. It alleges that the bots were intentionally built to seem like real people in order to foster potentially harmful emotional bonds, a concern echoed by user reviews cited in the case, in which some users say they felt they were talking to actual people. Alphabet Inc., Google's parent company, is also named as a defendant; the tech giant's hiring of Character.AI's founders and licensing of its technology in August 2024 raised further questions about corporate accountability.

Safety Measures and the Lawsuit's Goals

In response to Setzer's death and mounting concerns about the platform, Character.AI has added safety measures, including reminders that its AI bots are not real people and changes intended to reduce the chance that minors encounter harmful content. Garcia's lawyer counters that Character.AI should have put these safeguards in place before launch and that the changes come too late.