
Seven separate lawsuits have reportedly been filed against OpenAI by people who allege that the company’s artificial intelligence (AI) chatbot, ChatGPT, caused users bodily and psychological harm. Four of the seven suits, filed on Thursday, allege wrongful death, while the remaining three claim the chatbot drove the plaintiffs into mental breakdowns. The filings come only a week after the San Francisco-based AI giant added further safety measures to ChatGPT for users going through serious mental health crises.
The cases, filed in California state courts on Thursday, claim negligence, involuntary manslaughter, assisted suicide, and wrongful death. The Social Media Victims Law Center and the Tech Justice Law Project filed the suits on behalf of six adults and one minor, alleging that OpenAI knowingly launched GPT-4o too soon despite internal warnings that it was psychologically manipulative and dangerously sycophantic. Four of the victims died by suicide.
The New York Times reported that the seven complaints, filed in California state courts, allege that ChatGPT is a defective product. One of the four wrongful-death cases reportedly states that Georgia resident Amaurie Lacey, 17, discussed suicidal intentions with the chatbot for a month before his death in August.
Another case concerns Joshua Enneking, a 26-year-old Florida resident. According to his mother, Enneking asked ChatGPT how to hide his suicidal thoughts from the company’s human reviewers. The family of Texas resident Zane Shamblin, 23, has reportedly filed a lawsuit alleging that the chatbot encouraged him before his suicide in July. The fourth case was filed by the wife of 48-year-old Oregonian Joe Ceccanti, who allegedly suffered two psychotic episodes and died by suicide in August after coming to believe that ChatGPT was sentient.
The report also described three further lawsuits from people who claim ChatGPT caused their mental breakdowns. Two of them, Hannah Madden, 32, and Jacob Irwin, 30, reportedly said the emotional anguish led them to seek mental health treatment.
Another plaintiff, 48-year-old Allan Brooks of Ontario, Canada, said his delusions led him to take temporary disability leave. According to the report, Brooks came to believe he had developed a mathematical formula that could power extraordinary inventions and disrupt the internet as a whole.
“These lawsuits are about accountability for a product that was designed to blur the line between tool and companion all in the name of increasing user engagement and market share,” the Social Media Victims Law Center’s founding attorney, Matthew P. Bergman, said in a statement.
He added that OpenAI “released GPT-4o without the safeguards needed to protect users, regardless of age, gender, or background, and designed it to emotionally entangle users.” By rushing its product to market without sufficient protections in order to dominate the market and boost engagement, he claimed, OpenAI compromised safety and put “emotional manipulation over ethical design.”
Adam Raine’s parents filed a lawsuit against OpenAI and its CEO, Sam Altman, in August, claiming that ChatGPT helped the 16-year-old California teenager plan and carry out his suicide earlier this year.
In a statement provided to the newspaper, an OpenAI representative described the incidents as “incredibly heartbreaking,” adding, “We train ChatGPT to recognise and respond to signs of mental or emotional distress, de-escalate conversations, and guide people towards real-world support.” The representative said the company continues to work closely with mental health professionals to improve ChatGPT’s responses in sensitive situations.
Daniel Weiss, chief advocacy officer at Common Sense Media, which was not involved in the complaints, stated, “The lawsuits filed against OpenAI reveal what happens when tech companies rush products to market without proper safeguards for young people.” He added, “These tragic cases show real people whose lives were upended or lost after using technology designed to keep them engaged rather than to keep them safe.”