Futurism: OpenAI …. a Forensic Psychiatrist?

OpenAI Says It’s Hired a Forensic Psychiatrist as Its Users Keep Sliding Into Mental Health Crises

“We’re developing ways to scientifically measure how ChatGPT’s behavior might affect people emotionally.”


Image by Getty / Futurism

Among the strangest twists in the rise of AI has been growing evidence that it's negatively impacting users' mental health, with some people even developing severe delusions after becoming obsessed with chatbots.

One intriguing detail from our most recent story about this disturbing trend is OpenAI’s response: it says it’s hired a full-time clinical psychiatrist with a background in forensic psychiatry to help research the effects of its AI products on users’ mental health. It’s also consulting with other mental health experts, OpenAI said, highlighting the research it’s done with MIT that found signs of problematic usage among some users. 

“We’re actively deepening our research into the emotional impact of AI,” the company said in a statement provided to Futurism in response to our last story. “We’re developing ways to scientifically measure how ChatGPT’s behavior might affect people emotionally, and listening closely to what people are experiencing.”

“We’re doing this so we can continue refining how our models identify and respond appropriately in sensitive conversations,” OpenAI added, “and we’ll continue updating the behavior of our models based on what we learn.”

Mental health professionals outside OpenAI have raised plenty of concerns about the technology, especially as more people turn to it to serve as their therapist. A psychiatrist who recently posed as a teenager while using some of the most popular chatbots found that some would encourage him to die by suicide after he expressed a desire to seek the "afterlife," or to "get rid" of his parents after he complained about his family.

It's unclear how large a role this newly hired forensic psychiatrist will play at OpenAI, or whether the advice they provide will actually be heeded.

Let's not forget that the modus operandi of the AI industry, OpenAI included, has been to put on a serious face whenever these issues are brought up, and even to release its own research demonstrating the technology's severe dangers, hypothetical or actual. OpenAI CEO Sam Altman has more than once talked about AI's risk of causing human extinction.

None of them, of course, have believed in their own warnings enough to meaningfully slow down the development of the tech, which they’ve rapidly unleashed on the world with poor safeguards and an even poorer understanding of its long-term effects on society or the individual.

A particularly nefarious trait of chatbots that critics have put under the microscope is their silver-tongued sycophancy. Rather than pushing back against a user, chatbots like ChatGPT will often tell them what they want to hear in convincing, human-like language. That can be dangerous when someone opens up about their neuroses, starts babbling about conspiracy theories, or expresses suicidal thoughts.

We've already seen some of the tragic, real-world consequences this can have. Last year, a 14-year-old boy died by suicide after falling in love with a persona on the chatbot platform Character.AI.

Adults are vulnerable to this sycophancy, too. A 35-year-old man with a history of mental illness recently died in an apparent "suicide by cop" after ChatGPT encouraged him to assassinate Sam Altman in retaliation for supposedly killing his lover trapped in the chatbot.

One woman who told Futurism about how her husband was involuntarily committed to a hospital after mentally unravelling from his ChatGPT usage described the chatbot as downright “predatory.”

“It just increasingly affirms your bullshit and blows smoke up your ass so that it can get you f*cking hooked on wanting to engage with it,” she said.

More on AI: OpenAI Is Shutting Down for a Week
