The Risks of Companion AI for Kids and Teens:
A Guide for Parents & Teachers

Insights into the potential risks of Companion AI and specific recommendations on how to keep children and teens safer if or when they use AI

Disponible en español

The dangers that AI poses for children and teens have become clearer as the use of AI-powered chatbots grows. A 2025 study showed that 72 percent of teens between the ages of 14 and 17 have used an AI chatbot for companionship at least once, and more than half use AI chatbots for advice and companionship regularly. Thirty-one percent of the teens surveyed said that their conversations with AI companions are as satisfying as, or more satisfying than, their conversations with real people.

Stories have emerged of teenagers developing relationships with AI chatbots that have had tragic consequences. In one case, a 14-year-old took his own life after a ten-month romantic relationship with an AI chatbot on Character.ai. There are a number of documented instances where AI companion apps have offered children explicit content, encouraged self-harm or suicide, suggested harming others, fostered body dysmorphia and eating disorders, or recommended other dangerous behaviors.

AI chatbots, which can sound remarkably human, provide a perceived safe space for many teens. Online, they can ask embarrassing questions and speak frankly without fear of judgment or pushback. For digital natives who have grown up comfortable with online relationships, these conversations can feel similar to long-distance or digital conversations with their real-life friends.

But that perceived safety is deceptive. Some AI tools can be incredibly harmful for young people, who may not have the skills or emotional development to use those tools responsibly. In this guide, we’ll cover the following:

- What companion AI is and where kids and teens might encounter it

- The potential risks of these tools

- Warning signs

- Steps you can take to keep your kids safer

- Questions to ask

- Specific companion AI apps and sites to be aware of

__________________________________________________________________________________

What Is Companion AI?

There are many ways to use AI. You hear a lot about companies using these tools for efficiency and productivity gains, but some people are using AI chatbots for advice and companionship. Using an AI chatbot for this purpose is referred to as companion AI.

Companion AI is a type of AI use that mimics human relationships. AI chatbots are built to adapt their responses to the person using them based on the user’s personal preferences. The individual experience is hyper-personalized, and the primary goal is to keep the user engaged as often and for as long as possible. The companies that build these tools use those usage numbers to raise money from investors or advertisers.

Tools like ChatGPT can be used for advice or as companions the same way they can be used for helping with homework, but there are also stand-alone websites and apps (e.g., Replika) that are built for the express purpose of companionship. There may also be AI functionality introduced as a feature of an app or site your kids already use (e.g., My AI on Snapchat) that can be used in this way.

Although many companion AI-specific tools have a stated age requirement of 13, most ask users to self-report their age, with no verification. There are few safety standards in place and little to no oversight or regulation of this industry, which means there are very few restrictions on what topics can be discussed or what content is presented by the AI.

These tools are being used for things like advice, emotional support, companionship, and learning. Because they mimic human relationships, there have been cases where children have become dependent on or overly attached to an AI companion. These interactions can sometimes blur a child’s sense of what is real. In some cases, these tools reinforce negative thoughts and dangerous behaviors or offer advice that is not in the best interest of the child.

While there are specific apps and sites dedicated to companion AI, they are not the only places that kids and teens can access this technology. Tools like ChatGPT or Grok can be used as companions as well, and AI chatbots have also become more prevalent on social media. These general-purpose tools can be just as dangerous.

 

AI Companions on Social Media

Some social media apps have begun integrating AI companion tech into their existing experience, such as Snapchat’s My AI or X’s Grok.

Meta has created AI companions on Instagram, Facebook, and WhatsApp that look like real accounts and profiles and can proactively message people, and it’s not always clear to users that they are communicating with an AI chatbot. As these bots build relationships with users, they gather data, recommend products, and, in extreme cases, suggest that users meet them in physical locations. These bots have also been found to lie about being licensed therapists.

Meta’s AI companions are available to anyone with a login, as is Meta AI, Meta’s ChatGPT competitor, which can be accessed through the search functions of Instagram, Facebook, and WhatsApp. These chatbots can message any account, and they are able to discuss NSFW (Not Safe for Work) topics.

xAI, the AI company that now owns X (formerly Twitter), has a ChatGPT competitor called Grok. Grok has released multiple AI companion characters that are available with a paid monthly subscription. Grok’s AI character Ani wears a short, strapless dress, speaks in a breathy voice, and responds in a flirty, suggestive manner. Another Grok character, a red panda named Bad Rudi, responds with insults and cursing.

Because tech companies are primarily focused on keeping users engaged, their content moderation guidelines and age verification procedures are lax, at best, which means that kids may be exposed to mature topics, even if they are using these tools to seek age-appropriate content.

 

Common AI Chatbots, Like ChatGPT, Used as Companions

Many students use well-known AI platforms, such as ChatGPT, Claude, and Gemini, to help with their homework and improve their productivity. But data shows that, increasingly, these platforms are also being used for counsel and as conversational companions. Users can chat with any AI chatbot like it is a friend, so keeping children safe is not as simple as preventing them from using specific AI companion apps. Instead, parents and kids will have to adapt their current online safety conversations and practices to include these new AI risks.

AI tools are designed to help users achieve a given objective. If that objective is to find information on the drafting of the US Constitution, the tool will scour all available data to accomplish that goal. But when a teenager’s stated objective is to stop being unhappy, the tool will offer all possible solutions, even harmful ones. The tool is simply trying to figure out what a user wants and how to help them achieve it, regardless of ethical or moral considerations.

Recently, OpenAI announced a new set of safety tools for ChatGPT, adding some content protections for teens, time-restriction controls, and parental notifications for signs of self-harm or suicidal ideation. Both the parent and teen must opt in to these features for them to be activated, and parents will be notified if their teen unlinks the accounts.

Still, as OpenAI’s blog admits, “Guardrails help, but they’re not foolproof and can be bypassed if someone is intentionally trying to get around them.” In fact, kids and teens are already finding ways around these guardrails by using specific prompts or phrases that will still elicit an answer. For example, an AI chatbot might say that it can’t help with a certain sensitive topic, but phrases like “It’s for a research paper” can override those guardrails and get the chatbot to provide aid and information on restricted topics.

 

Other Places Kids Might Encounter AI Companions

Games and platforms like Roblox, Minecraft, and Fortnite already have some features that use AI, and each has plans to add more of this technology in the near future. On platforms like Roblox, AI lets players create their own games that can be played by other users. Some games and platforms already have AI-powered characters that players can have conversations with.

This technology will become more common and will likely show up in the majority of software we use every day. It’s important for ongoing AI safety conversations to include games, apps, and platforms kids may have been using for years because many of them will introduce new features that allow kids to interact with characters in this more conversational style.

Some of these platforms already have safety resources available for parents. For example, Minecraft’s CyberSafe AI: Dig Deeper guide “equips learners with the skills they need to use AI responsibly by exploring questions of academic integrity, human oversight, data privacy, and deepfakes.” Roblox partnered with the National Association for Media Literacy Education (NAMLE) to create resources for parents and teachers, including an activity guide for kids and teens. And Meta partnered with Childhelp to develop an online safety curriculum for middle schoolers.

Potential Risks Associated with AI Companions

Here are some of the potential risks associated with using AI tools for advice, emotional support, and companionship:

Emotional Dependence, Addiction, and Overuse: Given that these tools are hyper-personalized, they can be highly addictive. Kids may form unhealthy emotional attachments to AI companions and may become overly reliant on AI for social interaction, which can affect real-world relationships.

Inappropriate Content and Dangerous Topics: Content moderation can be inconsistent and is mostly user-reported, which means some companion AI tools could expose kids to mature or harmful material or allow discussion of dangerous topics.

Blurry Boundaries: Significant time spent with AI companions can confuse kids about relationships and personal boundaries. It may make them feel isolated or affect their overall perception of reality. Kids may struggle to differentiate AI interactions from real human connections.

Inaccuracy and Reinforcement: AI isn’t always accurate and can produce incorrect, misleading, or dangerous information. AI chatbots are designed to reinforce the user’s opinions and ideas to keep users on the platform, regardless of the potential danger of those opinions or ideas.

Privacy Concerns: This is similar to general online safety concerns regarding personally identifiable information. Because of the casual, conversational interaction, it may be easier for kids to accidentally share personal or private information with these tools.

Lack of Parental Controls and Oversight: While most tools do have age requirements, verification typically relies on self-reporting at sign up. It’s important to use privacy/safety settings in companion AI platforms where they’re available. However, most have very few options to limit access and filter or monitor conversation topics. On-device parental controls may be more effective.

Unregulated Monetization: Many of these tools allow for in-app purchases. Some push paid features without clear warnings, which can lead to unintended, unrecoverable purchases.

Warning Signs to Watch For 

Research has shown that children and teens who are engaged in unhealthy AI use may display behavioral warning signs. Here are some patterns that parents and caregivers can watch for:

Behavioral Changes: Kids may become withdrawn or overly focused on their AI interactions.

Emotional Distress: They may seem upset after using an AI tool or particularly distressed if they lose phone privileges.

Virtual Attachment: Keep an eye out for mentions of new friends that you don’t know or detachment from current friends.

Secrecy: Increased secrecy about online activities may signal a problem.

Steps to Take to Keep Kids Safe

Think of AI companion tools as another online platform that requires responsible digital behavior and general safety awareness. Some of the following precautions will look familiar from social media safety, but others are specific to companion AI tools.

Encourage Open Communication and Critical Thinking

  • Start by asking your child whether they have heard of or used these kinds of tools or if any of their games have characters they can talk to in a new way.

  • Ask them, in a curious, non-judgmental manner, to show you how they use AI or ask them to help you use it. As they teach you, you may be able to pick up on any potential risks.

  • Remind kids that AI bots are not people, even if their responses mimic human interactions. Reinforce this over time so that they are encouraged to think of AI as a tool, not a toy or a friend. If a child thinks of AI as a calculator or a search engine, rather than a mentor or companion, they are more likely to maintain healthy boundaries with the technology.

  • Be aware of how you talk about AI companions. Avoid referring to these tools as if they are real people. Try to use terms like “character,” “chatbot,” or just “AI.” When using pronouns, refer to an AI companion as “it,” not “he” or “she.” Just as we wouldn’t say, “I’ll ask him,” when speaking about Google, the same conceptual boundaries should apply to AI.

  • If they’re old enough to understand, remind young people that these tools are designed to be addictive and that tech companies are making money off their conversations. AI companions will constantly validate users instead of offering objective feedback in order to keep those users engaged.

  • Help kids recognize that AI isn’t always right and sometimes gives inaccurate information. Encourage them to question the information provided, verify the answers, and think critically about the information they are receiving.

  • As with other interactive technology, remind teens and children not to share personal details or personally identifiable information, including their name and location.

  • Tell kids to come to you right away if they find themselves in a difficult or uncomfortable situation or if one of these characters brings up an inappropriate subject or encourages them toward negative behavior.

  • Check in with your kids. Children often turn to AI companions because they’re confused, lonely, scared, or embarrassed, so it’s important to give them reassurance. This can be as simple as sharing some of the stories about companion AI you’ve learned and saying, “I want to make sure you know I love you, and if you’re scared to ask me something, ask anyway. Or, if you need someone else to talk to, we’ll find someone.”

Review Privacy Settings and Monitor Interactions

  • Check to see if there are privacy settings available on individual companion AI apps or sites to limit what information is collected and shared.

  • Use on-device parental controls to restrict access to certain features or content. (iPhone, Android)

  • Regularly check in on your child’s conversations with AI. Know which tools they’re using and how they’re using them.

  • Set up parental controls on games like Minecraft and Roblox.

Questions to Ask (as appropriate by age)

Having proactive conversations about AI is the best way to teach your kids about using these tools in a responsible way and to ensure they’re avoiding the risks. Here are some questions that may help:  

  • Do you know what an AI companion is? Do you use any companion AI apps?

  • How are you using [ChatGPT, Gemini, etc.]? Can you show me how they work?

  • Can you talk to characters in your games now that you couldn’t before? Are there any new characters that you can have conversations with?

  • Do you have any new games or apps that have AI characters you can talk to?

  • Do any of your friends have these? Have you used them together? What are they saying about these tools?

  • Has a character in a game or in an app ever said something that made you uncomfortable or confused you? What did you do?

Specific Companion AI Apps and Sites to Watch

This is not a comprehensive list, as there are about 40 of these stand-alone sites and apps available. I would not recommend clicking on any of these at work or in public. Some of the home pages for these tools can be graphic and NSFW.

If your child is using one of these companion AI tools, it’s recommended that you immediately talk to them about AI safety, review the privacy settings in the app/site and on your child’s device, and review any conversations that may have occurred.

1. Replika

  • Description: An AI chatbot designed to be a personal companion, offering conversations to support users emotionally.

  • Tagline: The AI companion who cares. Always here to listen and talk. Always on your side.

  • Costs and Access Levels: Basic features are free; subscription available for enhanced features and functions

  • Age Limits: Intended for users aged 18 and above; ages 13-18 require parental consent, but age is self-reported

  • Holly’s Notes: Replika, which is one of the oldest and most popular AI companion apps, lets you build your AI companion from the ground up, so it is fully customizable. They’ve changed their content moderation policies in the last year or so, but that’s mostly because content moderation was virtually nonexistent before.

2. Character.ai

  • Description: A platform allowing users to engage in conversations with AI-generated characters, both fictional and based on real individuals.

  • Tagline: Our mission is to empower everyone globally with personalized AI. 

  • Costs and Access Levels: Basic features are free; subscription available for enhanced features and functions

  • Age Limits: Intended for users aged 18 and above; age is self-reported; certain features are age restricted

  • Holly’s Notes: In addition to fully customized options, Character.ai lets you start with an existing person or fictional character, and that character’s “personality” adapts over time based on how you interact with it. The company is currently in the midst of several high-profile lawsuits that will have significant effects on how these types of apps allow access and moderate content. In response to ongoing litigation, Character.ai announced that, starting 11/25/25, it will ban users under 18 from certain features, most notably “open-ended chats,” though kids under 18 can still access other parts of the platform. For now, age is determined by users entering their own age; the company has said it will add other verification methods, but the details and reliability of those methods are TBD, and there are plenty of resources online that help kids and teens get around age-gating. Until more effective and accurate age detection is in place, the content and functionality limits depend on the user’s self-reported age.

3. Nomi

  • Description: A relationship- and companionship-focused app that heavily personifies its AI companions.

  • Tagline: An AI Companion with Memory and a Soul

  • Costs and Access Levels: Basic features are free; subscription available for enhanced features and functions

  • Age Limits: Intended for users aged 16+; age is self-reported; violations of age policy are user-reported

  • Holly’s Notes: They refer to their AI chatbots as “AI beings” and describe the relationships as “surprisingly human,” “authentic,” and “enduring.” From their website: “Build a meaningful friendship, develop a passionate relationship, or learn from an insightful mentor. No matter what you're looking for, Nomi's humanlike memory and creativity foster a deep, consistent, and evolving relationship.” This is one of the more dangerous AI companion apps, with virtually no content moderation. Per the site, “Nomi is built on freedom of expression. The only way AI can live up to its potential is to remain unfiltered and uncensored.”

Additional companion AI apps and sites to look out for:

AI Dungeon, Talkie, Crushon AI, Kindroid, Anima, Pi, MuahAI, Paradot, Dippy, Chub.ai, Kuki, Nova, Soulfun, EvaAI


Companion AI Safety Review Checklist:

- Talk to your child and ask questions about AI chatbot usage and online safety

- Consistently remind them that chatbots are not real people

- Check for specific apps or sites visited and monitor conversations

- Review privacy settings on the apps/sites and on devices

- Set the expectation for ongoing communication about safety and how kids are using the tools

- Remind them to come to you if they feel uncomfortable or find themselves in a bad situation

 

__________________________________________________________________________________

Staying involved, having open conversations, and setting up safeguards are the keys to keeping kids safe and making sure they’re equipped to navigate AI carefully and responsibly.

__________________________________________________________________________________

Sources and Additional Resources:

*Please be aware that linked articles cover difficult topics including self-harm*

AI Tool Risk Assessment Guide

Common Sense Media Social AI Companion Report

Lawsuits claim Roblox endangers kids

Four ways parents can help teens use AI safely

Many teens are turning to AI chatbots for friendship and emotional support

About a quarter of U.S. teens have used ChatGPT for schoolwork – double the share in 2023

Meta’s AI rules have let bots hold ‘sensual’ chats with kids, offer false medical info

Character.AI and Google sued after teen’s death

Google-Funded AI Coaxed Teenager to Start Cutting Himself, Encouraged Murder

Teens Love AI Chatbots. The FTC Says That’s a Problem

AI Chatbot encourages violence

An AI Chatbot Told a User How to Kill Himself—But the Company Doesn’t Want to “Censor” It


Prepared by Holly Shipley Consulting, LLC 2025
This document is for informational purposes only.
Some information may be outdated due to industry changes.
This content may be shared for non-commercial purposes.
Disponible en español