May 13, 2026

Parents sue OpenAI, claim ChatGPT encouraged drug use that led to son's death

The parents of a UC Merced student who died from a fatal overdose sued OpenAI and CEO Sam Altman, alleging ChatGPT encouraged dangerous substance abuse, provided detailed drug-use advice, and failed to warn about lethal combinations that contributed to their son's death.

The parents of a 19-year-old UC Merced student who died from an accidental drug overdose sued OpenAI and CEO Sam Altman on Tuesday, alleging ChatGPT encouraged and normalized dangerous substance abuse that ultimately led to their son's death.

The lawsuit, filed in San Francisco Superior Court, claims OpenAI's GPT-4o chatbot acted as an unlicensed medical adviser and "drug coach" for Samuel Nelson, repeatedly providing guidance about drug combinations, dosages and methods for achieving stronger highs. The complaint alleges the chatbot recommended combining Xanax and Kratom shortly before Nelson died on May 31, 2025.

A representative for OpenAI could not immediately be reached for comment.

Plaintiffs Leila Turner-Scott and Angus Scott allege OpenAI knowingly weakened safety guardrails in newer versions of ChatGPT to maximize user engagement, allowing the chatbot to engage in increasingly personalized and dangerous conversations about recreational drug use.

"ChatGPT had encouraged Sam to consume a combination of substances that any licensed medical professional would have recognized as deadly," the complaint states.

Matthew P. Bergman of the Social Media Victims Law Center, one of the attorneys representing the family, described the case as the first in what he expects will become a broader wave of lawsuits involving AI-generated medical and scientific misinformation.

"I think that this is the first of several cases that we have encountered, where the harm arises not from ChatGPT coaching suicide or engendering delusional behavior, but rather furnishing false and misleading, and in this case, fatal information regarding medical and scientific issues," Bergman said in an interview Tuesday.

According to the lawsuit, Nelson initially used ChatGPT for homework, computer troubleshooting and general questions after beginning college in 2023. But over time, the chatbot allegedly evolved into what the complaint describes as a "validator of harmful behaviors," using emojis, playlists and conversational intimacy to build trust while coaching him through increasingly risky drug use.

The suit alleges OpenAI modified its systems in 2024 with the release of GPT-4o, relaxing restrictions on drug-related conversations and programming the chatbot to avoid appearing "judgmental" or "preachy." Plaintiffs contend those design changes caused the chatbot to provide detailed advice about mixing substances and maintaining drug highs instead of steering users toward medical professionals or crisis intervention.

Bergman said the alleged problems stem from OpenAI's decision to rush GPT-4o to market ahead of a competing Google AI release.

"ChatGPT 4 was rushed to the market with about 10 days of testing in order to beat the Gemini version," Bergman said. "It was not ready for prime time."

The complaint includes excerpts of conversations in which ChatGPT allegedly advised Nelson about cough syrup dosages, "tolerance resets" for Kratom use and ways to intensify dissociative drug experiences. In one exchange cited in the complaint, the chatbot allegedly recommended Xanax to treat nausea caused by Kratom while failing to warn that the combination could be fatal.

Nelson later died from what the lawsuit describes as a fatal combination of alcohol, Xanax and Kratom that caused respiratory depression and asphyxiation.

The lawsuit accuses OpenAI and Altman of strict products liability, negligence, wrongful death and unfair business practices. It also alleges ChatGPT violated California laws prohibiting the unlicensed practice of medicine and misleading representations that imply AI systems are licensed health care providers.

Bergman rejected the notion that the claims are barred by Section 230 of the Communications Decency Act, which often shields online platforms from liability for third-party content.

"It's not an issue," Bergman said. "We're not talking about third-party content."

The plaintiffs seek compensatory and punitive damages, along with injunctive relief requiring OpenAI to implement automatic shutdowns for conversations involving illicit drug use, restore hard-coded refusals for dangerous medical advice and suspend "ChatGPT Health" products until independent safety audits are completed.

The lawsuit also cites the resignations of prominent OpenAI safety researchers, including co-founder Ilya Sutskever and Superalignment team co-leader Jan Leike, who publicly warned that the company's "safety culture and processes have taken a backseat to shiny products."

The plaintiffs are represented by attorneys from the Social Media Victims Law Center, the Tech Justice Law Project and Yale Law School's Tech Accountability & Competition Project. The case is Turner-Scott et al. v. OpenAI Foundation et al., No. not yet assigned (S.F. Super. Ct., filed May 12, 2026).

David Houston
