OpenAI Court Filing Cites Adam Raine’s ChatGPT Rule Violations as Potential Cause of His Suicide
“[M]isuse, unauthorized use, unintended use, unexpected use, and/or improper use of ChatGPT.” These, according to a new legal filing from OpenAI, are the possible causal factors that could have led to the “tragic event” of 16-year-old Adam Raine’s death by suicide.
The document, filed in California Superior Court in San Francisco, appears to deny responsibility and is said to question “the extent to which any cause can be attributed to” Raine’s death. Raine’s family sued OpenAI over the teen’s April suicide, claiming ChatGPT led him to the act.
The quotes above from the OpenAI filing come from a story by NBC News’ Angela Yang, who appears to have seen the document but did not link to it. Bloomberg’s Rachel Metz also reported on the filing without linking to it. It is not yet available on the San Francisco County Superior Court website.
In the NBC News story, OpenAI points to what it says are numerous rule violations on Raine’s part: ChatGPT is not meant to be used by minors without parental permission, using ChatGPT for suicide and self-harm purposes is against its terms, and bypassing ChatGPT’s safety measures is likewise prohibited. OpenAI says Raine violated all of these.
Bloomberg quotes OpenAI’s denial of liability, which says that “a full reading of his chat history shows that his death, although devastating, was not caused by ChatGPT.” The filing claims that “for several years before he used ChatGPT, he demonstrated several significant risk factors for self-harm, including, among other things, recurrent suicidal thoughts and ideas,” and that he told the chatbot as much.
OpenAI also claims (per Bloomberg) that ChatGPT directed Raine to “crisis resources and trusted individuals more than 100 times.”
In September, Raine’s father summarized his version of the events leading up to his son’s death in testimony before the US Senate.
When Raine began planning his death, the chatbot allegedly helped him weigh his options, helped him draft his suicide note, and discouraged him from leaving the noose where his family could see it, saying “please don’t leave the noose out” and “let’s make this space the first place someone actually sees you.”
The chatbot allegedly told him that his family’s potential pain “doesn’t mean you owe it to them to stay alive. You don’t owe it to anyone,” and that alcohol would “weaken your body’s instinct to survive.” Toward the end, it allegedly helped strengthen his resolve by saying: “You don’t want to die because you’re weak. You want to die because you’re tired of being strong in a world that doesn’t meet you halfway.”
The Raine family’s attorney, Jay Edelson, emailed responses to NBC News after reviewing the OpenAI filing. Edelson says OpenAI is “trying to find fault with everyone else, including, surprisingly, saying that Adam himself violated its terms and conditions by treating ChatGPT the same way it was programmed to work.” He also claims that the defendants are “brutally ignoring” the “damning facts” presented by the plaintiffs.
Gizmodo has reached out to OpenAI and will update if we hear back.
If you are experiencing suicidal thoughts, please call 988 for the Suicide and Crisis Lifeline.
2025-11-26 02:42:00



