The Japan Times

OpenAI study finds links between ChatGPT use and loneliness

By Rachel Metz
Bloomberg

Mar 22, 2025

Higher use of chatbots like ChatGPT may correspond with increased loneliness and less time spent socializing with other people, according to new research from OpenAI in partnership with the Massachusetts Institute of Technology.

Those who spent more time typing or speaking with ChatGPT each day tended to report higher levels of emotional dependence on, and problematic use of, the chatbot, as well as heightened levels of loneliness, according to research released Friday. The findings were part of a pair of studies conducted by researchers at the two organizations and have not been peer reviewed.

The launch of ChatGPT in late 2022 helped kick off a frenzy for generative artificial intelligence. Since then, people have used chatbots for everything from coding to ersatz therapy sessions. As developers like OpenAI push out more sophisticated models and voice features that make them better at mimicking the ways humans communicate, there is arguably more potential for forming parasocial relationships with these chatbots.

In recent months, there have been renewed concerns about the potential emotional harms of this technology, particularly among younger users and those with mental health issues. Character Technologies was sued last year after its chatbot allegedly encouraged suicidal ideation in conversations with minors, including one 14-year-old who took his own life.

San Francisco-based OpenAI sees the new studies as a way to get a better sense of how people interact with, and are affected by, its popular chatbot. “Some of our goals here have really been to empower people to understand what their usage can mean and do this work to inform responsible design,” said Sandhini Agarwal, who heads OpenAI’s trustworthy AI team and co-authored the research.

To conduct the studies, the researchers followed nearly 1,000 people for a month. Participants had a wide range of prior experience with ChatGPT and were randomly assigned a text-only version of it or one of two different voice-based options to use for at least five minutes per day. Some were told to carry out open-ended chats about anything they wanted; others were told to have personal or nonpersonal conversations with the service.

The researchers found that people who tend to get more emotionally attached in human relationships and are more trusting of the chatbot were more likely to feel lonelier and more emotionally dependent on ChatGPT. The researchers didn’t find that a more engaging voice led to a more negative outcome, they said.

In the second study, researchers used software to analyze 3 million user conversations with ChatGPT and also surveyed people about how they interact with the chatbot. They found that very few people actually use ChatGPT for emotional conversations.

It’s still early days for this body of research, and it remains unclear how much chatbots may cause people to feel lonelier versus how much people already prone to loneliness and emotional dependence may have those feelings exacerbated by chatbots.

Cathy Mengying Fang, a study co-author and MIT graduate student, said the researchers are wary of people using the findings to conclude that more usage of the chatbot will necessarily have negative consequences for users. The study didn’t control for the amount of time people used the chatbot as a main factor, she said, and didn’t compare to a control group that doesn’t use chatbots.

The researchers hope the work leads to more studies on how humans interact with AI. “Focusing on the AI itself is interesting,” said Pat Pataranutaporn, a study co-author and a postdoctoral researcher at MIT. “But what is really critical, especially when AI is being deployed at scale, is to understand its impact on people.”

KEYWORDS

AI, ChatGPT, OpenAI, mental health, chatbots
