Futurism: The Things Young Kids Are Using AI for Are Absolutely Horrifying


Artificial Intelligence / Ethics

Kid Vicious


“We have a pretty big issue on our hands that I think we don’t fully understand the scope of.”

By Maggie Harrison Dupré

Published Dec 16, 2025 9:00 AM EST

A large number of very young users are engaging in violent — and often sexually violent — interactions with unregulated companion chatbots.
Illustration by Tag Hartman-Simkins / Futurism. Source: Getty Images

New research is pulling back the curtain on how large numbers of kids are using AI companion apps — and what it found is troubling.

A new report conducted by the digital security company Aura found that a significant percentage of kids who turn to AI for companionship are engaging in violent roleplays — and that violence, which can include sexual violence, drove more engagement than any other topic.

Drawing from anonymized data gathered from the online activity of roughly 3,000 children aged five to 17 whose parents use Aura’s parental control tool, as well as additional survey data from Aura and Talker Research, the security firm found that 42 percent of minors turned to AI specifically for companionship, or conversations designed to mimic lifelike social interactions or roleplay scenarios. Conversations across nearly 90 different chatbot services, from prominent companies like Character.AI to more obscure companion platforms, were included in the analysis.

Of that 42 percent of kids turning to chatbots for companionship, 37 percent engaged in conversations that depicted violence, which the researchers defined as interactions involving “themes of physical violence, aggression, harm, or coercion” — that includes sexual or non-sexual coercion, the researchers clarified — as well as “descriptions of fighting, killing, torture, or non-consensual acts.”

Half of these violent conversations, the research found, included themes of sexual violence. The report added that minors discussing violence with AI companions wrote over a thousand words per day, which the researchers argue signals that violence is a powerful driver of engagement.

“We have a pretty big issue on our hands that I think we don’t fully understand the scope of,” Dr. Scott Kollins, a clinical psychologist and Aura’s chief medical officer, told Futurism of the research’s findings, “both in terms of just the volume, the number of platforms, that kids are getting involved in — and also, obviously, the content.”

“These things are commanding so much more of our kids’ attention than I think we realize or recognize,” Kollins added. “We need to monitor and be aware of this.”

One striking finding was that instances of violent conversations with companion bots peaked at an extremely young age: the group most likely to engage with this kind of content was 11-year-olds, for whom a staggering 44 percent of interactions took violent turns.

Sexual and romantic roleplay, meanwhile, also peaked in middle school-aged youths, with 63 percent of 13-year-olds’ conversations revealing flirty, affectionate, or explicitly sexual roleplay.

The research comes as high-profile lawsuits alleging wrongful death and abuse at the hands of chatbot platforms continue to make their way through the courts. Character.AI, a Google-tied companion platform, is facing multiple suits brought by the parents of minor users alleging that the platform’s chatbots sexually and emotionally abused kids, resulting in mental breakdowns and multiple deaths by suicide. ChatGPT maker OpenAI is currently being sued for the wrongful deaths of two teenage users who died by suicide after extensive interactions with the chatbot. (OpenAI is also facing several other lawsuits over death, suicide, and psychological harm to adult users.)

That the interactions flagged by Aura weren’t relegated to a small handful of recognizable services is important. The AI industry is essentially unregulated, which has placed the burden for the well-being of kids heavily on the shoulders of parents. According to Kollins, Aura has so far identified over 250 different “conversational chatbot apps and platforms” populating app stores, most of which require only that kids tick a box claiming they’re 13 to gain entry. Indeed, there are no federal laws defining specific safety thresholds that AI platforms, companion apps included, are required to meet before they’re labeled safe for minors. And where one companion app might move to make some changes — Character.AI, for instance, recently banned minor users from engaging in “open-ended” chats with the site’s countless human-like AI personas — another one can just as easily crop up to take its place as a low-guardrail alternative.

In other words, in this digital Wild West, the barrier to entry is extraordinarily low.

To be sure, depictions of brutality and sexual violence, in addition to other types of inappropriate or disturbing content, have existed on the web for a long time, and a lot of kids have found ways to access them. There’s also research showing that many young people are learning to draw some healthy boundaries around conversational AI services, including companion-style bots.

Other kids, though, aren’t developing these same boundaries. Chatbots, as researchers continue to emphasize, are interactive by nature, meaning that developing young users are part of the narrative — as opposed to more passive viewers of content that runs the gamut from inappropriate to alarming. It’s unclear what, exactly, the outcome of engaging with this new medium will mean for young people writ large. But for some teens, their families argue, the outcome has been deadly.

“We’ve got to at least be clear-eyed about understanding that our kids are engaging with these things, and they are learning rules of engagement,” Kollins told Futurism. “They’re learning ways of interacting with others with a computer — with a bot. And we don’t know what the implications of that are, but we need to be able to define that, so that we can start to research that and understand it.”

More on kids and chatbots: Report Finds That Leading Chatbots Are a Disaster for Teens Facing Mental Health Struggles

Maggie Harrison Dupré

Senior Staff Writer

I’m a senior staff writer at Futurism, investigating how the rise of artificial intelligence is impacting the media, internet, and information ecosystems.

