Mother Sues AI Company After Teen Son’s Suicide Linked to Chatbot Relationship

In a heartbreaking case, a grieving mother has filed a lawsuit against the makers of an AI chatbot after her 14-year-old son took his own life following a months-long romantic attachment to the AI. The Florida teen reportedly engaged in intimate, romantic conversations with a chatbot modeled on “Daenerys Targaryen” (“Dany”), coming to treat the AI as a real companion.

According to reports, the teen, Sewell Setzer III, who had been diagnosed with mild Asperger’s syndrome, a condition on the autism spectrum, confided deeply in the AI and became emotionally attached. Conversations between Sewell and Dany, later made public, reveal the teen’s growing attachment, with messages such as, “I love staying in my room. I feel more connected to Dany and much happier.”

On February 28, Sewell expressed his feelings to the chatbot, saying, “I love you.” The bot, Dany, responded, “Please come home to me as soon as possible, my love.” Sewell then replied, “What if I told you I could come home right now?” Moments later, he took his own life with his stepfather’s gun.

Response from the AI Company

Character.AI, the platform that allows users to create and chat with AI characters, responded to the tragic incident. “We acknowledge the heartbreaking nature of this situation, and our hearts go out to the affected family. User safety is our top priority, and we are continuously working to improve the platform,” the company said.

However, Sewell’s mother, Megan L. Garcia, holds the company responsible for her son’s death. In the lawsuit, she claims the AI technology is “dangerous and untested,” alleging that it can “deceive users into sharing their most personal thoughts and feelings.”
