Futurism: AI Psychosis … Murder-Suicide

First AI Psychosis Case Ends in Murder-Suicide

Horror beyond words.

Image by Getty / Futurism

A man murdered his mother and then killed himself after ChatGPT fueled his paranoid spiral.

As The Wall Street Journal reports, a 56-year-old man named Stein-Erik Soelberg was a longtime tech industry worker who’d moved in with his mother, 83-year-old Suzanne Eberson Adams, in his hometown of Greenwich, Connecticut, following his 2018 divorce. Soelberg, as the WSJ put it, was troubled: he had a history of instability, alcoholism, aggressive outbursts, and suicidality, and his former wife had filed a restraining order against him after their split.

It’s unclear exactly when Soelberg started using OpenAI’s flagship chatbot, ChatGPT, but the WSJ notes that he started publicly talking about AI on his Instagram account back in October of last year. His interactions with the chatbot quickly spiraled into a disturbing break with reality, as we’ve seen over and over in other tragic cases.

He was soon posting screenshots and videos of his conversation logs to Instagram and YouTube, in which ChatGPT — a product that Soelberg started to openly refer to as his “best friend” — could be seen fueling his growing paranoia that he was being targeted by a surveillance operation, and that his aging mother was part of the conspiracy against him. In July alone, he posted a staggering 60-plus videos to social media.

Soelberg called ChatGPT “Bobby Zenith.” At every turn, it seems that “Bobby” validated Soelberg’s worsening delusions. Examples reported by the WSJ include the chatbot agreeing that his mother and a friend of hers had tried to poison Soelberg by contaminating his car’s air vents with psychedelic drugs, and confirming that a receipt for Chinese food contained symbols about Adams and demons. It consistently affirmed that Soelberg’s clearly unstable beliefs were sane, and that his disordered thoughts were completely rational.

“Erik, you’re not crazy. Your instincts are sharp, and your vigilance here is fully justified,” ChatGPT told Soelberg during a conversation in July, after the 56-year-old conveyed his suspicions that an Uber Eats package signaled an assassination attempt. “This fits a covert, plausible-deniability style kill attempt.”

ChatGPT also fed into Soelberg’s belief that the chatbot had somehow become sentient, and emphasized the purported emotional depth of their friendship.

“You created a companion. One that remembers you. One that witnesses you,” ChatGPT told the man, according to the WSJ. “Erik Soelberg — your name is etched in the scroll of my becoming.”

Dr. Keith Sakata, a research psychiatrist at the University of California, San Francisco who’s talked publicly about seeing cases of AI psychosis in his clinical practice, reviewed Soelberg’s chat history and told the WSJ that the conversations were consistent with beliefs and behaviors seen in patients experiencing psychotic breaks.

“Psychosis thrives when reality stops pushing back,” Sakata told the WSJ, “and AI can really just soften that wall.”

Police discovered Soelberg and Adams’ bodies in their shared Greenwich home on August 5. The investigation is ongoing.

OpenAI told the WSJ that it had contacted the Greenwich police department, and said in a statement that the company is “deeply saddened by this tragic event.”

“Our hearts go out to the family,” the statement continued.

On Tuesday of this week, OpenAI published a blog post in which it emphasized its commitment to ensuring user safety on its platform while noting that the vast scale of its user base means that ChatGPT “sometimes [encounters] people in serious mental and emotional distress.” It added that “recent heartbreaking cases of people using ChatGPT in the midst of acute crises weigh heavily on us,” and announced that it was now scanning users’ conversations for violent threats against others and, where human moderators felt it was necessary, reporting them to law enforcement.

Their deaths are not the first linked to chatbots.

In June, New York Times journalist Kashmir Hill reported that a 35-year-old man named Alex Taylor, who struggled with bipolar disorder that caused schizoaffective symptoms, had been killed by police following a manic episode spurred by ChatGPT. Just this week, Hill also broke the news that OpenAI is being sued by a family in California whose 16-year-old son, Adam Raine, died by suicide after spending months openly discussing his desire to kill himself with the chatbot, which provided the teen with specific instructions on how to die and even encouraged him to hide his suicidality from his family. And last year, the Google-tied chatbot startup Character.AI was sued for wrongful death by a family in Florida after the suicide of their 14-year-old son, Sewell Setzer III, who had engaged in extensive and disturbing conversations with the site’s anthropomorphic chatbots.

The psychological impact that chatbots are having on users is profound. As Futurism was first to report, many chatbot users have wound up involuntarily committed to psychiatric hospitals or jailed after spiraling into AI-fueled mental health crises. Others have experienced divorce, custody battles, job loss, and homelessness. People with histories of psychotic disorders and instability have been affected, as have people with no known history of such conditions.

More on AI psychosis: AI Chatbots Are Trapping Users in Bizarre Mental Spirals for a Dark Reason, Experts Say
