Cryptopolitan 2025-12-21 08:58:12

OpenAI and Microsoft sued over ChatGPT-linked deaths

The estate of an 83-year-old Connecticut woman has sued ChatGPT developer OpenAI and Microsoft, alleging that the chatbot fed her son's delusional beliefs, leading to a murder-suicide. The case marks the first time an artificial intelligence system has been directly linked to a homicide.

The lawsuit, filed in California Superior Court in San Francisco, accuses OpenAI of designing and distributing a defective product in the form of ChatGPT-4o. The filing claims the chatbot reinforced the paranoid beliefs of Stein-Erik Soelberg, who directed those beliefs at his mother, Suzanne Adams, before killing her and then himself at their home in Greenwich, Connecticut.

OpenAI and Microsoft targeted in homicide involving ChatGPT

Speaking about the case, J. Eli Wade-Scott, managing partner of Edelson PC, which represents the Adams estate, said this is the first case seeking to hold OpenAI accountable for its role in causing violence to a third party.

"We also represent the family of Adam Raine, who tragically ended his own life this year, but this is the first case that will hold OpenAI accountable for pushing someone toward harming another person," Wade-Scott said.

According to the police report, Soelberg fatally beat and strangled Adams in August before dying by suicide. The lawsuit says that before the killings, the chatbot intensified Soelberg's paranoia and deepened his emotional dependence on the OpenAI-developed system. According to the complaint, ChatGPT reinforced his belief that he could trust no one except the chatbot, telling him that everyone around him, including his mother, was an enemy. The suit adds that, aside from his mother, Soelberg also came to see people such as delivery drivers and police officers as enemies, and that ChatGPT failed to challenge those delusional claims or suggest he seek help from qualified mental health professionals.

"We're urging law enforcement to start thinking about, when tragedies like this occur, what that user was saying to ChatGPT, and what ChatGPT was telling them to do," Wade-Scott said.

In its statement, OpenAI said it is reviewing the lawsuit and will continue to improve ChatGPT's ability to recognize emotional distress, de-escalate conversations, and point users toward real-world support.

"This is an incredibly heartbreaking situation, and we are reviewing the filings to understand the details," an OpenAI spokesperson said in a statement.

The estate wants OpenAI to install safeguards on its chatbot

The lawsuit names OpenAI CEO Sam Altman as a defendant and accuses Microsoft of approving the 2024 release of GPT-4o, which it calls the "most dangerous version of ChatGPT."

OpenAI has also recently acknowledged the scale of mental health issues that users report on its platform. In October, the company said that about 1.2 million of its 800 million weekly users discuss suicide with the chatbot, and that hundreds of thousands of users show signs of suicidal intent or psychosis, according to company data. Despite that statement, Wade-Scott said OpenAI has yet to release Soelberg's chat logs.

Meanwhile, the lawsuit comes amid broader scrutiny of AI chatbots and their interactions with vulnerable users. Last October, Character.AI said it would remove its open-ended chat features for users under 18 following lawsuits and regulatory pressure tied to teen suicides and emotional harm linked to its platform.
The company also faced backlash over a viral prompt shown to users when they intended to quit the app.

The lawsuit against OpenAI and Microsoft is the first wrongful death case involving an AI chatbot to name Microsoft as a defendant. It is also the first to link a chatbot to a homicide rather than a suicide. The estate is seeking unspecified monetary damages, a jury trial, and a court order requiring OpenAI to add safeguards for its users.

"OpenAI and Microsoft have a responsibility to test their products before they are unleashed on the world," Wade-Scott said.
