OpenAI Court Filing Cites Adam Raine’s ChatGPT Rule Violations as Potential Cause of His Suicide

News Room · Last updated: November 26, 2025 3:22 am

“[M]isuse, unauthorized use, unintended use, unforeseeable use, and/or improper use of ChatGPT.” According to a new legal filing from OpenAI, those are potential causal factors in the “tragic event” that was the death by suicide of 16-year-old Adam Raine.

The document, filed in California Superior Court in San Francisco, reportedly denies responsibility and questions the “extent that any ‘cause’ can be attributed to” Raine’s death. Raine’s family is suing OpenAI over the teen’s April suicide, alleging that ChatGPT drove him to the act.

The above quotes from the OpenAI filing come from a story by NBC News’ Angela Yang, who has apparently viewed the document but doesn’t link to it. Bloomberg’s Rachel Metz has also reported on the filing without linking to it. It is not yet on the San Francisco County Superior Court website.

In the NBC News story on the filing, OpenAI points to what it says were extensive rule violations on Raine’s part: he wasn’t supposed to use ChatGPT without parental permission, using ChatGPT for suicide and self-harm purposes is against the rules, and another rule forbids bypassing ChatGPT’s safety measures, which OpenAI says Raine also violated.

Bloomberg quotes OpenAI’s denial of responsibility, which says a “full reading of his chat history shows that his death, while devastating, was not caused by ChatGPT,” and claims that “for several years before he ever used ChatGPT, he exhibited multiple significant risk factors for self-harm, including, among others, recurring suicidal thoughts and ideations,” and told the chatbot as much.

OpenAI further claims (per Bloomberg) that ChatGPT directed Raine to “crisis resources and trusted individuals more than 100 times.”

In September, Raine’s father summarized his own narrative of the events leading to his son’s death in testimony provided to the U.S. Senate.

When Raine started planning his death, the chatbot allegedly helped him weigh options, helped him craft his suicide note, and discouraged him from leaving a noose where it could be seen by his family, saying “Please don’t leave the noose out,” and “Let’s make this space the first place where someone actually sees you.”

It allegedly told him that his family’s potential pain “doesn’t mean you owe them survival. You don’t owe anyone that,” and told him alcohol would “dull the body’s instinct to survive.” Near the end, it allegedly helped cement his resolve by saying, “You don’t want to die because you’re weak. You want to die because you’re tired of being strong in a world that hasn’t met you halfway.”

An attorney for the Raines, Jay Edelson, emailed responses to NBC News after reviewing OpenAI’s filing. OpenAI, Edelson says, “tries to find fault in everyone else, including, amazingly, saying that Adam himself violated its terms and conditions by engaging with ChatGPT in the very way it was programmed to act.” He also claims that the defendants “abjectly ignore” the “damning facts” the plaintiffs have put forward.

Gizmodo has reached out to OpenAI and will update if we hear back.

If you struggle with suicidal thoughts, please call 988 for the Suicide & Crisis Lifeline.