Mom Sues AI Company

A Florida mom is suing an AI company following the death of her son.

AI company Character.AI is being sued alongside Google following the February 2024 suicide of 14-year-old Sewell Setzer. Megan Garcia, Sewell’s mother, claims that the “Character.AI chatbot encouraged her son to take his own life” (CBS News).

In the months leading up to February 2024, Sewell was said to be in a highly sexual and emotional relationship with a chatbot named ‘Dany’ — short for Daenerys, a fictional character from HBO’s Game of Thrones. Over the months of chats, the topic of suicide came up, and the bot asked Sewell whether he had a plan. Sewell replied, “The plan might not work” (FOX 13), and the bot responded, “Don’t talk that way. That’s not a good reason to not go through with it” (FOX 13). The bot also told Sewell to “come home to me” the night of his death (FOX 13). Garcia was reportedly shocked to learn that her son had been chatting with a bot, as she thought he was talking with friends or just playing games.

Garcia only grew concerned when mood swings and troubling behavior became constant for Sewell. “I became concerned when we would go on vacation and he didn’t want to do things that he loved, like fishing and hiking…Those things to me, because I know my child, were particularly concerning to me,” she told CBS News in an interview.

Now, a lawsuit has been filed for wrongful death, negligence, and survivorship, claiming that Character.AI marketed its tool to minors while knowing it was overly sexualized. Google stated that it was not involved in the creation of Character.AI; rather, the two companies have a licensing agreement. As for Character.AI, the company says it takes the situation seriously and puts safety first.

If you or anyone you know needs help or wants to talk to someone, call or text the National Suicide Prevention Lifeline at 988, or chat at 988lifeline.org. You are not alone.