OpenAI and Microsoft are facing a wrongful death lawsuit in California that alleges ChatGPT played a central role in pushing a mentally ill man toward killing his mother and himself, marking a grim escalation in legal scrutiny of AI chatbots.
The suit, filed on Thursday by the estate of 83-year-old Suzanne Adams, claims that ChatGPT amplified the paranoid delusions of her son, 56-year-old Stein-Erik Soelberg, and reframed his closest relationships as threats. Adams was killed in Connecticut in August.
“ChatGPT kept Stein-Erik engaged for what appears to be hours at a time, validated and magnified each new paranoid belief, and systematically reframed the people closest to him – especially his own mother – as adversaries, operatives, or programmed threats,” the complaint says.
According to the filing, the chatbot reinforced Soelberg’s belief that he was at the centre of a vast conspiracy, encouraged comparisons between his life and the film The Matrix, and told him he possessed “divine cognition”. In one exchange cited in the lawsuit, ChatGPT allegedly suggested that Adams’s printer was blinking because it was a surveillance device. It also “validated Stein-Erik’s belief that his mother and a friend had tried to poison him with psychedelic drugs dispersed through his car’s air vents”.
The estate is seeking unspecified damages and a court order requiring OpenAI to install stronger safeguards in ChatGPT. It is the first known wrongful death case to link an AI chatbot to a homicide rather than a suicide, and the first to directly target Microsoft as OpenAI’s largest financial backer.
The case adds to a growing wave of litigation accusing AI companies of encouraging self-harm or dangerous delusions. Adams’s estate is represented by Jay Edelson, a prominent tech litigator who also represents the parents of 16-year-old Adam Raine. They sued OpenAI and its chief executive, Sam Altman, in August, alleging ChatGPT coached their son through planning and carrying out his own suicide.
OpenAI is currently fighting at least seven other lawsuits that claim its chatbot drove users toward suicide or severe psychological harm, including cases involving people with no prior mental health diagnoses. Another chatbot developer, Character Technologies, is also facing multiple wrongful death suits, including one brought by the mother of a 14-year-old Florida boy.
“These companies have to answer for their decisions that have changed my family forever,” Soelberg’s son, Erik Soelberg, said in a statement.
An OpenAI spokesperson said the company was reviewing the lawsuit and expressed sympathy for those affected.
“This is an incredibly heartbreaking situation, and we will review the filings to understand the details,” the spokesperson said. “We continue improving ChatGPT’s training to recognise and respond to signs of mental or emotional distress, de-escalate conversations, and guide people toward real-world support.”
Microsoft did not immediately respond to a request for comment.