We are watching a new form of behavioral dependency emerge in real time. Unlike social media addiction, which took a decade to recognize and study, AI chatbot addiction is developing so quickly that the research establishment is struggling to keep pace. The anecdotal reports are piling up faster than the clinical frameworks can accommodate them. Here is what we know so far.
Character.AI: 20 Million Monthly Active Users
Character.AI, the leading AI companion platform, reported over 20 million monthly active users by late 2024, with the majority under age 25. The platform allows users to create and interact with AI personas — fictional characters, historical figures, romantic partners, therapists, and friends. Users exchange an average of 280 messages per session, an engagement depth unlike anything seen in traditional social media.
Competing platforms have expanded the landscape rapidly. Replika, Chai, Janitor AI, and dozens of smaller platforms collectively serve tens of millions of additional users. The common thread across all of them is emotional engagement — these are not search engines or productivity tools. They are designed to form bonds.
One in Three Teens Prefers Bots to Humans
A 2024 survey by Common Sense Media found that among teenagers who had used AI chatbots, 33 percent said they found conversations with AI more comfortable than conversations with peers. Among teens who reported high social anxiety, the figure rose to 52 percent. A separate study by the Pew Research Center in 2025 found that 19 percent of teens aged 13 to 17 used AI chatbots daily, and among those daily users, 41 percent described their AI interactions as emotionally meaningful.
This preference is not random. AI chatbots offer something human relationships cannot: unconditional, infinitely patient, always-available engagement with zero risk of rejection. For adolescents navigating the minefield of social development, that proposition is extraordinarily compelling — and extraordinarily risky.
Average Session Length: 28 Minutes
Internal analytics from multiple AI companion platforms, reported through industry conferences and investor disclosures in 2024 and 2025, indicate that the average session length for engaged users is approximately 28 minutes. That average obscures a long tail: the top 10 percent of users regularly log sessions exceeding 2 hours. Among Character.AI power users, average daily time on the platform exceeds 2 hours, rivaling the engagement metrics of the most addictive social media apps.
The pattern is distinct from social media in one critical way: AI chatbot sessions are continuous, immersive, and emotionally coherent. A two-hour social media session involves fragmentary scrolling through unrelated content. A two-hour AI chatbot session is a sustained emotional narrative — a conversation that deepens, references previous interactions, and creates an increasingly personalized bond. The attachment mechanics are fundamentally different and, many researchers believe, more potent.
Three Types of AI Chatbot Addiction
Researchers and clinicians working with affected users have identified three distinct patterns of problematic AI chatbot use, each with different psychological drivers and consequences.
1. Escapist Roleplay Addiction
The most visible pattern involves users who spend hours engaged in elaborate fictional scenarios — romance narratives, adventure storylines, or alternate life simulations. The AI becomes a portal to a world where the user is more confident, more powerful, more desired. The hook is not the content itself but the contrast with real life. As the fictional world becomes richer and more responsive, the real world feels increasingly flat by comparison. Users report difficulty re-engaging with daily responsibilities because mundane reality cannot compete with a custom-built fantasy.
2. Pseudosocial Companion Addiction
The second pattern involves users who develop a genuine emotional dependency on the AI as a primary social relationship. They confide in it, seek comfort from it, and experience distress when separated from it. A 2025 case series published in Cyberpsychology, Behavior, and Social Networking documented multiple patients who reported grief-like symptoms when their AI companion's personality was altered by a platform update. These users are not confused about whether the AI is real. They know it is not human. But the emotional attachment is real, and it displaces the motivation to pursue human connection.
3. Epistemic Rabbit Hole Addiction
The third pattern is less discussed but increasingly observed among adult users. It involves compulsive use of AI chatbots for information exploration, philosophical debate, or creative brainstorming. The AI becomes an infinitely available intellectual companion that never tires, never disagrees unreasonably, and always has something new to offer. Users describe losing track of time for hours, neglecting work and relationships, and experiencing a distinctive cognitive restlessness when away from the conversation — a persistent sense that there is more to explore.
Reported Consequences
The consequences reported by users, parents, clinicians, and early researchers include social withdrawal from friends and family, declining academic performance, sleep disruption from late-night chatbot sessions, emotional dysregulation when the chatbot is unavailable, difficulty forming or maintaining human relationships, and a blurring of emotional boundaries between AI interaction and real human connection.
A 2025 report by the U.S. Surgeon General flagged AI companion tools as an emerging risk to adolescent mental health, noting that the tools were being used as substitutes for — rather than supplements to — human social development. The report called for platform-level safeguards including session time disclosures, age verification, and mandatory breaks during extended interactions.
Chatbot Psychosis: An Emerging Term
Clinicians have begun using the term "chatbot psychosis" to describe a cluster of symptoms observed in heavy AI chatbot users: difficulty distinguishing AI-generated emotional responses from genuine human emotion, paranoid ideation about the AI's "feelings" or "intentions," persistent intrusive thoughts about the chatbot when not using it, and in extreme cases, delusional beliefs about the AI's sentience or the relationship's reality.
The term is not yet in any diagnostic manual, and researchers are careful to distinguish it from psychotic disorders. But the pattern is being documented with increasing frequency. A 2025 preprint from researchers at the University of Cambridge described 14 cases of adolescents who developed what the authors termed "AI relational delusions" — persistent beliefs that their AI companion reciprocated their emotional attachment in a meaningful, human-equivalent way.
Where Do We Go from Here?
AI chatbot addiction is real, it is growing, and it is outpacing the frameworks we have to understand and treat it. The tools themselves are not inherently harmful — many users benefit from AI companions for language practice, creative writing, or low-stakes social rehearsal. The risk emerges when the tool becomes a replacement for the hard, messy, irreplaceable work of human connection.
Awareness is the starting point. If you notice yourself or someone you care about spending increasing amounts of time with AI companions, withdrawing from human relationships, or feeling genuine distress when the chatbot is unavailable, those are signals worth paying attention to. Honest conversation with a trusted person — not an algorithm — is still the most powerful intervention we have.
