Is your teen using a chatbot for companionship? If you don’t know, you might want to ask. Common Sense Media released a study on Wednesday finding that more than half of teenagers regularly use AI companions. Nearly a third of the teens surveyed reported that conversations with AI were as satisfying as conversations with actual humans, if not more so.
Researchers also found that 33% of teens use AI companions such as Character.AI, Nomi and Replika “for social interaction and relationships, including conversation practice, emotional support, role-playing, friendship or romantic interactions.” The study distinguished between anthropomorphic AI bots and more assistance-oriented AI tools such as ChatGPT, Microsoft Copilot or Google’s Gemini.
Considering the widespread and growing use of AI companions among teens, the Common Sense Media researchers concluded that their findings supported limiting the use of AI among young people. “Our earlier recommendation stands: Given the current state of AI platforms, no one younger than 18 should use AI companions,” the team said, after surveying 1,060 teens aged 13 to 17 from across the US over the past year.
For the past few years, generative AI has evolved at lightning speed, with new tools regularly becoming available around the world and disrupting business models, social practices and cultural norms. This, combined with an epidemic of social isolation exacerbated by the COVID-19 pandemic, puts teens at risk from technology that their young brains might not be equipped to handle.
Why experts are worried about teens and AI
Amid the growing use of chatbots by people to discuss personal problems and get advice, it’s important to remember that, while they might seem confident and reassuring, they’re not mental health professionals.
A.G. Noble, a mental health therapist specializing in adolescents at Youth Eastside Services in Bellevue, Washington, says she isn’t surprised by the Common Sense Media study. She pointed to a growing number of adolescents struggling with social skills and with feeling connected to their peers, which she calls a “perfect recipe for loneliness.”
“What AI companions offer are low-risk ‘social’ interaction: privacy, no bullying, no worries about the awkwardness of ghosting the AI companion if the kids don’t want to talk anymore,” Noble said. “And I think everyone can empathize — who wouldn’t want a ‘social relationship’ without the minefield, especially in their teens?”
Debbi Halela, director of behavioral health services at Youth Eastside Services, says teens need to interact with humans in real life, especially in the aftermath of the COVID-19 pandemic.
“Overreliance on technology runs the risk of hindering the healthy development of social skills in young people,” Halela said. “Youth are also still developing the ability to make decisions and think critically, therefore they may be vulnerable to manipulation and influence from information sources that are not always reliable, and this could inhibit the development of critical thinking skills.”
The American Psychological Association warned earlier this year that “we have already seen instances where adolescents developed unhealthy and even dangerous ‘relationships’ with chatbots.” The APA issued several recommendations, including teaching AI literacy to kids and AI developers creating systems that regularly remind teen users that AI companions are not actual humans.
Noble says virtual interactions “can trigger the dopamine and oxytocin responses of a real social interaction — but without the resulting social bond. Like empty calories coming from diet soda, it seems great in the moment but ultimately doesn’t nourish.”
Parents should encourage real-world activities that involve teens with other people, Noble said. “Real social interaction is the best buffer against the negative impacts of empty AI interactions.”