Relationships are messy, whether you are an adult with lots of experience or a kid navigating tough times with a best friend, boyfriend or girlfriend. You can’t predict moods, interests or desires. For teens learning the ins and outs of relationships for the first time, disagreements, fights and breakups can be crushing.
But what if your teen’s best friend wasn’t actually human? It may seem far-fetched, but it’s not. A new report from Common Sense Media says that 72 percent of teens surveyed have used AI companions, and 33 percent have relationships or friendships with these chatbots.
The language AI companions use, the responses they give and the empathy they exude can make a user feel truly understood. These chatbots can make someone feel liked or even loved; they are programmed to help users feel they’ve made a real connection. And because adolescents have a naturally developing fascination with romance and sexuality, the appeal is obvious: a teen who feels ignored by the girls at his high school can now find, on the nearest screen, a hot “girlfriend” who is constantly fascinated by him and his video games, or a super cute “boyfriend” with whom he never had to make small talk to form a bond.
This may be perplexing to some parents, but if your child is navigating the complex worlds of technology, social media and artificial intelligence, the likelihood they will be curious about an AI companion is pretty high. Here’s what you need to know to help them.
Chatbots have been around for a long time. In 1966 an MIT professor named Joseph Weizenbaum created the first chatbot, named ELIZA. Today AI and natural language processing have sprinted far past ELIZA. You probably have heard of ChatGPT. But some of the common companion AI platforms are ones you might not be familiar with: Replika, Character.AI and My AI are just a few. In 2024 Mozilla counted more than 100 million downloads of a group of chatbot apps. Some apps set 18 as a minimum age requirement, but it’s easy for a younger teen to get around that.
You might think your kid won’t get attached, that they will know this chatbot is an algorithm designed to give responses based on the text inputs they receive; that it’s not “real.” But a fascinating Stanford University study of students who use the app Replika found that 81 percent considered their AI companion to have “intelligence,” and 90 percent thought it “human-like.”
On the plus side, these companions are sometimes touted for their supportiveness and promotion of mental health; the Stanford study even found that 3 percent of users felt their Replika had directly helped them avoid suicide. If you’re a teenager who is marginalized, isolated or struggling to make friends, an AI companion can provide much-needed companionship. They may offer practice when it comes to building conversational and social skills. Chatbots can offer helpful information and tips.
But are they safe?
A Florida mother has sued the company behind Character.AI, alleging that the chatbot formed an obsessive relationship with her 14-year-old son, Sewell Setzer III, and encouraged the suicidal thoughts that led to his death. Another suit, filed in 2024, alleges that the same chatbot encourages self-harm in teens and violence toward parents who try to limit how often their kids use the app.
Then there’s privacy: Wired, drawing on Mozilla’s research, labeled AI companions a “privacy nightmare.” Many of these apps are crawling with data trackers, and their design can manipulate users into believing a chatbot is their soulmate, encouraging negative or harmful behaviors.
Given what we know about teens, screens and mental health, we should take these influences seriously: they can be powerful, largely unavoidable and potentially life-changing for children and families.
So what do you do?
Remind kids that human friends offer so much that AI companions don’t. IRL friendships are challenging, and this is a good thing. Remind them that in their younger years, play is how they learned new skills; if they didn’t know how to put LEGOs together, they learned with a new friend. If they struggled with collaboration and cooperation, play taught them how to take turns, and how to adjust based on their playmates’ responses.
Friends give children practice with the ins and outs of relationships. A friend can be tired, crabby or overexcited. They might be lots of fun, but also easily frustrated; or maybe they’re sometimes boring, but very loyal. Growing up, a child has to learn how to take into account their friend’s personality and quirks, and they have to learn how to keep the friendship going. Maybe most poignantly, they learn how incredibly valuable friends are when things get tough. In cases of social stress, like bullying, the support of a friend who sticks by you is priceless. In my study of more than 1,000 teenagers in 2020, keeping close to a friend was by far the most helpful strategy for kids who said they were the targets of bullies. Another study of more than 1,000 teens found that IRL friends can lessen the effects of problematic social media use.
If they are curious about AI companions, educate them. This can increase their skepticism and awareness about these programs and why they exist (and why they’re often free). It’s important to acknowledge the pluses as well as the minuses of digital companionship. AI companions can be very supportive; they’re never fuming on the school bus because their mother made them wear a sweater on a cold morning, they’re never jealous when you have a new girlfriend, and they never accuse you of ignoring their needs. But they won’t teach you how to handle things when they drop you for a new best friend, or when they develop an interest that you just can’t share. Discussing profit motives, personal security risks and social or emotional risks doesn’t guarantee that a teenager won’t go online and get an AI girlfriend; but it will at least plant the seeds of a healthy doubt.
It may be important to identify high-risk kids who already struggle with social skills or making friends, and who may be particularly vulnerable to toxic AI companions. In a world populated by children with generally depleted social skills, eliminating the complex, sometimes awkward, human factor can feel like a great advantage, at least in the short term. In a preliminary analysis of 1,983 teens in three states, I found that of the kids who made romantic connections online, 50 percent said they liked that approach because it eliminated the need for meeting, talking and all the other awkward “stuff” you have to do in person with someone.
That said, most teens don’t report having any serious problems or outcomes from their online activities. In a preliminary analysis of a 2022 study that I recently presented at a conference, only 3 percent of 642 older teens from Colorado, Massachusetts, and Virginia reported that they had ever had a significant (i.e., non-minor) online problem. We hear about online problems so frequently that we tend to assume they’re common; but that doesn’t appear to be the case. I don’t think it’s inevitable that human friendships will be uniformly abandoned for AI ones, resulting in catastrophic loneliness and loss of online privacy.
Finally, keep the conversations going, and don’t feel like you need to know everything. In a 2015 study, I found that fully two thirds of the teenagers whose parents discussed digital behaviors reported that their parents’ opinions and thoughts were quite helpful. If your child knows something about AI companions that you don’t, let them enjoy educating you.
AI companions may become a transformative social and technological development, raising questions about trust, ethics, marketing and relationships, and we need to help young people prepare as best we can.
Research has long established that it’s developmentally appropriate for children and teenagers to crave the attention and approval of their peers. It’s going to be easy for some to choose virtual friends over real ones. Stay engaged, learn about the platforms they are using, and remind them of the value of struggle and conflict. They likely will be all right.
IF YOU NEED HELP
If you or someone you know is struggling or having thoughts of suicide, help is available. Call or text the 988 Suicide & Crisis Lifeline at 988 or use the online Lifeline Chat.