The Deep View: Data … the liquid Gold. “Meta’s monetization of AI companionship”

Meta’s monetization of AI companionship
Source: ChatGPT 4o Image Generation
Mark Zuckerberg envisions a future where AI chatbots become your friends – a proposal he touts as a cure for the “loneliness epidemic.” In a recent interview, the Meta CEO pointed out that “the average American… has fewer than three friends” and that “the average person has demand for meaningfully more… like 15 friends or something”.
Zuckerberg’s grand vision of AI companionship comes on the heels of Meta’s pivot away from its last big social experiment, the “metaverse.” In 2021, Facebook rebranded as Meta Platforms, betting that virtual worlds would become the next frontier of interaction. Meta staked billions on VR, having bought Oculus for $2 billion back in 2014. Yet Reality Labs has now racked up $60 billion in losses, and Zuckerberg’s January promise that 2025 would be the metaverse’s “pivotal year” vanished from Meta’s latest earnings call, replaced by five AI talking points.
Zuckerberg’s recent roadshow of podcast appearances, developer conferences, and interviews with other tech leaders made his position clear: most people crave more connection than they currently have, and they’re not getting it from other people. “I think people are going to want a system that knows them well – that understands them the way their feed algorithms do,” Zuckerberg said during an onstage interview with Stripe co-founder John Collison at Stripe’s annual conference. That system now exists: Meta AI, launched May 1, 2025, on iOS, Android, and the web, powered by Meta’s Llama 3 model.
What’s next: Meta has been aggressively pushing out new AI features across Instagram, Facebook, and WhatsApp. 
Meta rolled out a new Meta AI assistant for Facebook Messenger and WhatsApp, “an assistant that gets to know your preferences,” and positions Meta AI as a direct ChatGPT competitor. At first glance, it offers what you’d expect from an AI assistant: you can type or talk to it, ask questions, generate images in real time, and even get web results. But Meta’s twist is to make the experience social. The app includes a “Discover” feed that displays conversations other users have had with the chatbot – posts that friends opt to share, turning private AI musings into content for others to like and comment on.
The bigger picture: It’s a compelling sell, particularly after a pandemic that underscored how many people feel alone (the U.S. Surgeon General declared a “loneliness epidemic” in 2023). Little wonder Meta isn’t alone in this arena – Snap has given its millions of teen users a chatbot “friend” by default, and countless startups offer AI companions for everything from romance to therapy.
Between the lines: This isn’t altruism. Instead, it looks like a business model designed to monetize emotional need. The more time users spend confiding in their chatbot, the more data Meta gathers – emotional patterns, preferences, vulnerabilities – all of which can feed its advertising engine. Just like social media engagement, emotional engagement becomes a metric. And as with apps like Replika, a premium tier is almost inevitable: want deeper conversations, personalized memories, or role-play features? That’ll cost extra. Loneliness isn’t just a problem to solve — it’s a growth market.
As Meta rolls out these increasingly realistic AI personas, the risks for teenage users surface. Startups like Character.ai have already gone down this path, raising hundreds of millions in VC funding to give users “personalized AI for every moment of your day.” The result: lawsuits after young users suffered mental health declines.
Emotionally responsive chatbots without clear age filters or usage boundaries open the door to unintended consequences. Lawmakers have called for stronger safeguards, warning that AI systems designed to simulate friendship and emotional support shouldn’t be left to evolve in the wild, especially on platforms widely used by children. Meta has said it is working on moderation tools, but for many, that raises a broader question: should AI companions be marketed to everyone, or do some groups need extra protection?
Meta’s AI friendship vision, if realized, could entrench isolation rather than cure it, all while harvesting our emotions for profit. Social media was supposed to connect us, yet here we are, being steered toward simulated companionship. Not a triumph of innovation, but a concession that we can’t organize ourselves to care for each other, leaving it to Big Tech to fill the void. Simulated companionship isn’t connection; it’s a business model. And we’ll all pay for it, one confession at a time.