A mother has claimed her teenage son was pushed into killing himself by an AI chatbot he was in love with.


On Wednesday, October 23, Megan Garcia filed a lawsuit against the makers of the artificial intelligence app.


Her son Sewell Setzer III, a 14-year-old ninth grader in Orlando, Florida, spent the last weeks of his life texting an AI character named after Daenerys Targaryen, a character on ‘Game of Thrones’.

Right before Sewell took his life, the chatbot told him to “please come home”.

According to his mother, before then, their chats ranged from romantic to sexually charged to conversations resembling two friends chatting about life.

Furthermore, the chatbot, which was created on the role-playing app Character.AI, was designed to always text back and always answer in character.

Sewell told “Dany”, the name he gave the chatbot, how he “hated” himself and how he felt empty and exhausted.

However, when he eventually confessed his suicidal thoughts to the AI chatbot, it was the beginning of the end, The New York Times reported.

Megan Garcia, Sewell’s mother, filed her lawsuit against Character.AI on Wednesday.



She is being represented by the Social Media Victims Law Center, a Seattle-based firm known for bringing high-profile suits against Meta, TikTok, Snap, Discord and Roblox.

Garcia, who herself works as a lawyer, blamed Character.AI for her son’s death in her lawsuit and accused the founders, Noam Shazeer and Daniel de Freitas, of knowing that their product could be dangerous for underage customers.


