
Doctors Say AI Use Is Almost Certainly Linked to Developing Psychosis

A consensus is emerging.

By Frank Landymore

Published Dec 30, 2025 8:00 AM EST

More and more doctors agree that using AI chatbots is linked to delusional episodes and cases of psychosis.
Fiordaliso / Getty Images

There continue to be numerous reports of people suffering severe mental health spirals after talking extensively with an AI chatbot. Some experts have dubbed the phenomenon “AI psychosis,” given the symptoms of psychosis these delusional episodes display — but the degree to which the AI tools are at fault, and whether the phenomenon warrants a clinical diagnosis, remains a significant topic of debate.

Now, according to new reporting from The Wall Street Journal, we may be nearing a consensus. More and more doctors are agreeing that AI chatbots are linked to cases of psychosis, including top psychiatrists who reviewed the files of dozens of patients who engaged in prolonged, delusional conversations with models like OpenAI’s ChatGPT.

Keith Sakata, a psychiatrist at the University of California, San Francisco, who has treated twelve patients who were hospitalized because of AI-induced psychosis, is one of them.

“The technology might not introduce the delusion, but the person tells the computer it’s their reality and the computer accepts it as truth and reflects it back, so it’s complicit in cycling that delusion,” Sakata told the WSJ.

The grim trend looms large over the AI industry, raising fundamental questions about the tech’s safety. Some cases of apparent AI psychosis have ended in murder and suicide, spawning a slew of wrongful death suits. Equally alarming is its scale: ChatGPT alone has been linked to at least eight deaths, with the company recently estimating that around half a million users are having conversations showing signs of AI psychosis every week.

One feature of AI chatbots that the phenomenon has brought under scrutiny is their sycophancy, which is perhaps a consequence of their being designed to be as engaging and humanlike as possible. In practice, this means the bots tend to flatter users and tell them what they want to hear, even when what the user is saying has no basis in reality.

It’s a recipe primed for reinforcing delusions, to a degree unprecedented by any technology before it, doctors say. One recent peer-reviewed case study focused on a 26-year-old woman who was hospitalized twice after she believed ChatGPT was allowing her to talk with her dead brother, with the bot repeatedly assuring her she wasn’t “crazy.”

“They simulate human relationships,” Adrian Preda, a psychiatry professor at the University of California, Irvine, told the WSJ. “Nothing in human history has done that before.”

Preda compared AI psychosis to monomania, in which someone obsessively fixates on a single idea or goal. Some people who have spoken about their mental health spirals say they were hyper-focused on an AI-driven narrative, the WSJ noted. These fixations can often be scientific or religious in nature, such as a man who came to believe he could bend time because of a breakthrough in physics.

Still, the reporting notes that psychiatrists are wary of declaring outright that chatbots cause psychosis. They maintain, however, that they are close to establishing a connection. One link the doctors who spoke with the WSJ expect to be confirmed is that prolonged interactions with a chatbot can act as a risk factor for psychosis.

“You have to look more carefully and say, well, ‘Why did this person just happen to coincidentally enter a psychotic state in the setting of chatbot use?’” Joe Pierre, a UCSF psychiatrist, told the newspaper.

More on AI: Children Falling Apart as They Become Addicted to AI

Frank Landymore

Contributing Writer

I’m a tech and science correspondent for Futurism, where I’m particularly interested in astrophysics, the business and ethics of artificial intelligence and automation, and the environment.
