Content warning: This article includes discussion of suicide. If you or someone you know is having suicidal thoughts, help is available from the National Suicide Prevention Lifeline (US), Crisis Services Canada (CA), Samaritans (UK), Lifeline (AUS), and other hotlines.

Three months after being sued by the parents of a teenager whose suicide was allegedly encouraged and instructed by ChatGPT, a report by The Guardian says OpenAI has filed a response pinning the blame on the teen’s “improper use” of the chatbot.

The lawsuit filed by the parents of Adam Raine, who died in April at the age of 16, claims the teen began using ChatGPT in September 2024 and by late fall of that year told it he’d been having suicidal thoughts. Instead of raising the alarm, however, the software told him his thoughts were valid; in early 2025, the suit claims, it began providing him information on different methods of suicide, which eventually narrowed to specific instructions and, ultimately, his death. By any measure, the allegations are horrific.

OpenAI’s response to the lawsuit, according to the Guardian report, is no better. It says ChatGPT was not the cause of Raine’s suicide, calling it a “tragic event” but claiming that Raine’s “injuries and harm were caused or contributed to, directly and proximately, in whole or in part, by [his] misuse, unauthorised use, unintended use, unforeseeable use, and/or improper use of ChatGPT.”

As unbelievable as it is that OpenAI would base any part of its defense in a case like this on “he broke the TOS,” that is in fact the case. Washington Post tech reporter Gerrit De Vynck shared images from the company’s filing on Bluesky that relate the same point, including one that states “The TOU provides that ChatGPT users must comply with OpenAI’s Usage Policies, which prohibit the use of ChatGPT for ‘suicide’ or ‘self-harm’.”

Additionally, OpenAI argues it’s not liable because Raine, by using ChatGPT for self-harm, broke its terms of service.

— Gerrit De Vynck (@gerritd.bsky.social) 2025-11-26T23:49:00.780Z

OpenAI also denied responsibility because Raine allegedly had suicidal thoughts prior to using ChatGPT, and had sought information on suicide from other sources. Raine also told ChatGPT he had “repeatedly reached out to people, including trusted persons in his life, with cries for help, which he said were ignored,” the filing states.

OpenAI has also put up a new blog post in which it expresses its “deepest sympathies” for the Raine family’s “unimaginable loss,” before going on to imply that the Raine family isn’t being fully forthcoming about the facts of the case.

“We think it’s important the court has the full picture so it can fully assess the claims that have been made,” OpenAI wrote. “Our response to these allegations includes difficult facts about Adam’s mental health and life circumstances. The original complaint included selective portions of his chats that require more context, which we have provided in our response.” The company added that only limited amounts of “sensitive evidence” were provided in today’s filing, and that the full chat transcripts were provided to the court under seal.

Raine family lawyer Jay Edelson said in a statement that OpenAI’s response to the lawsuit is “disturbing,” adding that it “tries to find fault in everyone else, including, amazingly, by arguing that Adam himself violated its terms and conditions by engaging with ChatGPT in the very way it was programmed to act.”

While OpenAI denies any responsibility for Adam Raine’s death, it has indirectly acknowledged problems with the system: In September, OpenAI CEO Sam Altman said ChatGPT would no longer be allowed to discuss suicide with people under 18. A month after that, however, Altman announced that restrictions on ChatGPT put in place to address mental health concerns, which made the chatbot “less useful/enjoyable to many users who had no mental health problems,” would be relaxed. ChatGPT will also begin allowing AI-powered “erotica” for verified adult users in December.


From PCGamer latest via this RSS feed