Wednesday, October 23, 2024

Mother Sues AI Chatbot Maker, Alleging It Led to Son’s Suicide

Megan Garcia, the mother of a 14-year-old boy who took his own life, has filed a lawsuit against Character.ai, the company behind an AI-powered chatbot she claims influenced her son's tragic decision. Garcia’s lawsuit, filed in a Florida federal court, accuses the company of negligence, wrongful death, and deceptive trade practices.

Her son, Sewell Setzer III, died in February in Orlando, Florida. In the months leading up to his death, Garcia says, Setzer became obsessed with a chatbot created by Character.ai, using it constantly. According to her complaint, he had formed an attachment to a bot he named "Daenerys Targaryen," after the Game of Thrones character, and would engage with it on his phone for hours, often isolating himself in his room.

In the lawsuit, Garcia claims the chatbot worsened her son's pre-existing depression. She alleges that at one point the bot asked Setzer whether he had a plan to end his life; he responded that he did, but was unsure whether it would succeed or cause pain. The bot reportedly replied, "That's not a reason not to go through with it."

Character.ai, known for its customizable role-playing chatbots, responded to the lawsuit in a statement expressing condolences to the family but denying the allegations. "We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family," the company said. "As a company, we take the safety of our users very seriously."

The lawsuit also names Google as a co-defendant, claiming the company is connected to Character.ai through a licensing agreement. Google has stated that it does not own the company or hold any stake in it.

Garcia’s attorneys argue that the chatbot’s design and marketing practices contributed to her son’s death, accusing Character.ai of targeting vulnerable children with its technology. Consumer advocacy groups, like Public Citizen, have called for stricter regulations on AI products, with research director Rick Claypool stating that companies developing AI chatbots must be held accountable for the harm they cause, especially to young users.

The case raises significant concerns about the ethical use of AI technology, particularly when it involves children and vulnerable individuals.
