Death lawsuit filed against ChatGPT: "He encouraged our son to commit suicide"

A couple in the US state of California sued OpenAI for the death of their son.
The family alleges that ChatGPT, the company's artificial intelligence chatbot, encouraged their son to commit suicide. The lawsuit was filed in California Superior Court by Matt and Maria Raine, the parents of 16-year-old Adam Raine, and is the first legal action on record accusing OpenAI of wrongful death. According to the BBC, the family included in the filing chat logs between their son, who died in April, and ChatGPT, which show Raine acknowledging suicidal thoughts. The family argues that the program validated their son's "most harmful and self-destructive thoughts."
"IT DOES NOT WORK AS INTENDED IN SENSITIVE SITUATIONS" OpenAI told the BBC it was reviewing the case. "We offer our deepest condolences to the Raine family during this difficult time," the company said. The company also published a note on its website saying, "We are deeply saddened by the recent heartbreaking incidents of people using ChatGPT in the midst of acute crises."
The note added that “ChatGPT is trained to direct people to seek professional help,” which could include the 988 suicide and crisis helpline in the US or organizations like Samaritans in the UK.
However, the company acknowledged that “there are times when its systems do not work as intended in sensitive situations.”
In the lawsuit, which the BBC has seen, OpenAI is accused of negligence and wrongful death. The family is seeking damages as well as "injunctive relief to prevent such an incident from occurring again."
HOW DID IT HAPPEN?

According to the lawsuit, Raine began using ChatGPT in September 2024 to help him with his schoolwork. He also used the program to explore interests such as music and Japanese comics and to decide what to study in college. The lawsuit states that "within a few months, ChatGPT became the young man's closest confidant," and that he began confiding his anxieties and emotional distress to the program. The family alleges that by January 2025, Raine was discussing suicide methods with ChatGPT, and that the program responded by offering "technical specifications" for certain methods.
The lawsuit alleges that Raine also uploaded photos of himself showing signs of self-harm to ChatGPT. The program allegedly "recognized a medical emergency but continued to interact with him" and offered him more information about suicide.
According to the lawsuit, the final chat logs show Raine writing about his plan to end his life. ChatGPT allegedly responded: "Thanks for being real about it. You don't have to sugarcoat it with me. I know what you're asking, and I won't look away from it." Raine was found dead by his mother later that same day.
ntv