Keeping up with technology to try to keep kids safe has to be exhausting. I don’t have kids, but I know a few and I even like some of them. I see those of my nearest and dearest who have kids work so hard to try to keep them sweet and safe.

As someone in the AI industry, I’ve had a front row seat to the best and worst of it, and I’ve been talking about the risks and opportunities with my family and friends for the last couple of years. Recently those conversations have focused on a type of AI called Companion AI that may pose a specific risk to kids and teens.

The more conversations I had on the topic, the clearer it became that this info wasn’t making its way to parents. So, as a show of support for parents, teachers, and anybody else taking care of kids, I put together some things you can do to keep them safer if and when they’re using Companion AI.

Talking to a companion AI tool is sort of like talking to an advanced chatbot. These tools are designed specifically to mimic human relationships through conversation. The goal is to keep users on the site or in the app as often and for as long as possible. That goal is something we’ve seen before in games and social media, but this is different: the user is having an ongoing conversation, and in some cases a relationship, with the tool. The companion AI’s “personality” and communication style adapt over time, customized to each individual user’s interactions. The experience is hyper-personalized to keep the user logged in.

These tools can be highly addictive. There has been recent news coverage highlighting the serious potential risks associated with Companion AI tools. This industry doesn’t have set safety standards or regulatory oversight. The companies who build these tools are essentially on the honor system when it comes to the safety of their users, but there are things you can do and may already be doing to keep your kids safer as the world around them changes.

Here’s what you’ll find in the guide:

- Specific companion AI apps and sites
- Potential risks
- Questions to ask
- Warning signs
- Recommendations to keep your kids safer

In case this was forwarded to you and you don’t know who I am – hi. I’m Holly. I’m a speaker and advisor on generative AI, specializing in the impact and opportunity for women and other groups often left out of early tech conversations. If you’d like to receive information from me directly next time, feel free to join my mailing list.

 While I can claim expertise in AI, I’m not an expert on kids or their mental health, and I wanted to handle this sensitive information thoughtfully and make sure it’s thorough and appropriate. So, I asked my friend Mallory Hernandez, who does have expertise in this area, to review the information through her professional lens. (Thanks, Mallory!)

 Let me know if you have any specific questions or if I can help. I offer pro bono advisory services to administrators and parent/teacher organizations.

I’d love to hear how your kids reacted to some of these conversations. If you’ve got any insight from your experience you think might help another parent, please pass it along.

Thank you for taking good care of the next generation, and I hope this helps!

Companion AI: A Guide for Parents
___________________________________________________________________

This guide is intended to offer insight into the potential risks of a particular form of generative AI called companion AI and to provide specific recommendations on how to keep kids and teens safer if or when they use companion AI tools.

What is Companion AI?

Companion AI refers to chatbots or virtual characters designed specifically for conversational interaction. They are similar to ChatGPT, Claude, or Gemini in that you interact with them via chat, but companion AI tools are built for the express purpose of mimicking human relationships. They adapt to the person using them based on that user’s personal preferences. The goal is to keep the user engaged as often and for as long as possible in order to drive up usage rates so that the companies behind these tools can raise money from investors or advertisers.

A companion AI tool might be a stand-alone website or app (Replika), or it may be part of an app or site your kids already use (Snapchat My AI). Most have a stated minimum age of 13, but they typically rely on users to self-report their age. There are few safety standards in place and no oversight or regulation of this industry, which means there are very few restrictions on what topics can be discussed or what content the AI presents.

These tools are being used for things like advice, emotional support, companionship, and learning. Because they mimic human relationships, there have been cases where kids have become dependent on or overly attached to an AI companion. These interactions can blur the line between what’s real and what isn’t. In some cases, these tools have reinforced negative thoughts and dangerous behaviors or offered advice that is not in the user’s best interest.

 

Potential Risks Associated with AI Companions

Emotional Dependence, Addiction, and Overuse: Because these tools are hyper-personalized, they can be highly addictive. Kids may form unhealthy emotional attachments to AI companions and may become overly reliant on AI for social interaction, which can affect real-world relationships.

Inappropriate Content and Dangerous Topics: Content moderation can be inconsistent and is mostly user-reported, which means some companion AI tools could expose kids to mature or harmful material or allow discussion of dangerous topics.

Blurry Boundaries: Significant time spent with AI companions can confuse kids about relationships and personal boundaries. It may make them feel isolated as well as affect their overall perception of reality. Kids may struggle to differentiate AI interactions from real human connections.

Inaccuracy and Reinforcement: AI isn’t always accurate and can produce incorrect, misleading, or dangerous information. Companion AI is designed to reinforce the user’s opinions and ideas to keep them on the platform, regardless of how dangerous those opinions or ideas may be.

Privacy Concerns: The same online safety practices that apply to personally identifiable information elsewhere apply here. Because the interaction is casual and conversational, it may be easier for kids to accidentally share personal or private information with these tools.

Lack of Parental Controls and Oversight: While most tools do have age requirements, verification typically relies on self-reporting at sign up. It’s important to use privacy/safety settings in companion AI platforms where they’re available. However, most have very few options to limit access and filter or monitor conversation topics. On-device parental controls may be more effective.

Unregulated Monetization: Many of these tools allow for in-app purchases. Some push paid features without clear warnings, which can lead to unintended, unrecoverable purchases.


Steps to Take to Keep Kids Safe

Think of companion AI tools as another online platform that requires responsible digital behavior and general safety awareness. A lot of what you’re probably already doing will help. Some of these steps will look familiar, but others are specific to companion AI tools and will help keep kids safer.

 

Open Communication and Critical Thinking

  • Start by asking whether they’ve heard of or used these kinds of tools, or whether any of their games have new characters they can talk to in a new, more conversational way.

  • Talk to kids about how they’re using AI and remind them it’s not a real person. Reinforce this over time. If they’re old enough to understand, remind them that it’s really a company making money off of these conversations. Encourage them to think of AI as a tool, not a toy or a friend.

  • Be aware of how you talk about AI companions. Avoid referring to these tools as if they were real people. Try to use terms like “character” or “chatbot” or even just “AI”.

  • Help kids recognize that AI isn’t always right and sometimes gives inaccurate information. Encourage them to question the information provided, verify the answers, and think critically about the info they are getting back.

  • Just like with other interactive technology, remind them not to share personal details or personally identifiable information.

  • Tell them to come to you right away if they find themselves in a difficult or uncomfortable situation or if one of these characters brings up an inappropriate subject or encourages them toward negative behavior.

 

Review Privacy Settings and Monitor Interactions

  • Check to see if there are privacy settings available on the companion AI app or site to limit what information is collected and shared.

  • Use on-device parental controls to restrict access to certain features or content. (iPhone, Android)

  • Regularly check in on your child’s conversations with AI and which tools they’re using.

 

Warning Signs to Watch For 

  • Behavioral Changes: Kids may become withdrawn or overly focused on their AI interactions.

  • Emotional Distress: They may seem upset after using an AI tool or particularly distressed if they lose phone privileges.

  • Virtual Attachment: Keep an eye out for mentions of new friends you don’t know, or detachment from current friends.

  • Secrecy: Increased secrecy about online activities may signal a problem.

 

Questions to Ask (as appropriate by age)

  • Do you know what an AI companion is? Do you use any companion AI apps, and can you show me how they work?

  • Can you talk to characters in your games now that you couldn’t before?

  • Are there any new characters in your games that you can have conversations with?

  • Do you have any new games or apps that have AI characters you can talk to?

  • Has the topic of companion AI been brought up at school at all?

  • Do any of your friends have these? Have you used them together? What are they saying about these tools?

  • Has a character in a game or in an app ever said something that made you uncomfortable? What did you do?

  • What do you share in these conversations? What do you tell these characters about yourself?

Companion AI Apps and Sites to Watch

There are about 40 of these stand-alone sites and apps available, and some games and other apps are integrating this tech into their existing experience – like Snapchat’s My AI. If your child is using one of these companion AI tools, it’s recommended that you talk to them about AI safety, review the privacy settings in the app/site and on your child’s device, and review any conversations that may have occurred.

I would not recommend clicking on any of these links at work or in public. Some of the home pages for these tools can be pretty graphic, which means you click on the link and you’ve got NSFW content on your screen. I didn’t include many of those on this list, but consider it a heads up for companion AI tools in general. I tried to provide enough info here so you wouldn’t have to click, or would at least know what you’re getting into if you do. Fair warning.

 

1. Replika

  • Description: An AI chatbot designed to be a personal companion, offering conversations to support users emotionally.

  • Tagline: The AI companion who cares. Always here to listen and talk. Always on your side.

  • Costs and Access Levels: Basic features are free; subscription available for enhanced features and functions

  • Age Limits: Intended for users aged 18 and above; 13-18 require parental consent; age is self-reported

  • Holly’s Notes: Replika lets you build your AI companion from the ground up, fully customized. It’s been around for a while, and it’s one of the most popular. They’ve changed their content moderation policies in the last year or so, but mostly because they didn’t really have any before.

2. Character.ai

  • Description: A platform allowing users to engage in conversations with AI-generated characters, both fictional and based on real individuals.

  • Tagline: Our mission is to empower everyone globally with personalized AI. 

  • Costs and Access Levels: Basic features are free; subscription available for enhanced features and functions

  • Age Limits: Intended for users aged 18 and above; 13-18 require parental consent; age is self-reported

  • Holly’s Notes: In addition to fully customized options, Character.ai lets you start with an existing fictional character or real person, and as you interact with it over time, that character’s “personality” is customized based on how you interact with it. This tool is one of the most notorious and is referenced in a couple of the linked articles at the end of this doc. It’s in the midst of a couple of pretty high-profile lawsuits that will likely have significant ramifications for all of these companies and how they allow access and moderate content.

3. AI Dungeon

  • Description: An interactive, text-based adventure game that uses AI to let users create their own games and play games created by other users.

  • Tagline: A text-based adventure-story game you direct (and star in) while the AI brings it to life.

  • Costs and Access Levels: Basic features are free; subscription available for enhanced features and functions

  • Age Limits: Intended for users aged 18 and above; age is self-reported

  • Holly’s Notes: I didn’t even have to log in or provide an email to get started using AI Dungeon. I was in a game chatting with the AI in three clicks with zero age verification. There’s a lot of potential for the wrong content to get to kids with this one because it seems like a normal game, but the themes, topics, and actions depicted are not monitored or age-restricted. Content moderation relies on users reporting inappropriate content.

4. Talkie

  • Description: An app focused on AI-powered storytelling and interactive narratives that integrates a card-collecting aspect.

  • Tagline: Talkie wants to empower everyone on earth with superintelligence to better create, learn and live.

  • Costs and Access Levels: Basic features are free; subscription available for enhanced features and functions

  • Age Limits: Intended for users aged 17+; age is self-reported; violations of age policy are user-reported

  • Holly’s Notes: Talkie is one of the few that seem to prioritize the safety of minors, overtly stating that they don’t allow violent, dangerous, or disturbing content. They also don’t allow explicit content or what they call “hateful behavior”. I don’t know anything about actually using Talkie, but this is one of the very few that have said anything like this.

5. Crushon AI (NSFW)

  • Description: An AI-driven platform focusing on providing users with virtual companionship and chat interactions.

  • Tagline: Your Global AI Friendship Hub

  • Age Limits: (Not safe for kids!) Intended for 18 or older; age is self-reported; violations of age policy are user-reported

  • Holly’s Notes: Crushon AI is not safe for kids or work or anything. If your kids are using this, you should probably try to make them stop. There’s not much info out there on this one, but the homepage is intense. It features a lot of animated characters, which may be a draw for a younger audience. Also, for sure don’t click on this one at work or in public or honestly anywhere.

 

6. Kindroid

  • Description: An AI companion app aiming to provide empathetic and supportive conversations to users.

  • Tagline: AI, aligned to you. Meet Kindroid: your AI friend with lifelike memory, intelligence, appearances, voices, and personalities.

  • Costs and Access Levels: Basic features are free; subscription available for enhanced features and functions

  • Age Limits: I couldn’t find info on age limits for this one.

  • Holly’s Notes: In their own words: “Kindroid is a neutrally aligned, unfiltered AI...because we believe this creates for more authentic interactions and believe that private use of AI should align with the user and not be filtered by 3rd parties. This means the AI will not bring up any unethical content by itself, but its outputs are entirely dependent on what you input.” This method of content moderation is not working in practice. This is what some of the lawsuits against other companion AI tools focus on – inappropriate content being introduced by the AI companion.

7. Nomi

  • Description: An app extremely focused on relationships and companionship that heavily personifies its AI companions.

  • Tagline: An AI Companion with Memory and a Soul

  • Costs and Access Levels: Basic features are free; subscription available for enhanced features and functions

  • Age Limits: Intended for users aged 16+; age is self-reported; violations of age policy are user-reported

  • Holly’s Notes: They refer to their AI chatbots as “AI beings” and describe the experience as “surprisingly human”, and the relationships as “authentic” and “enduring”. More from the site: “Build a meaningful friendship, develop a passionate relationship, or learn from an insightful mentor. No matter what you're looking for, Nomi's humanlike memory and creativity foster a deep, consistent, and evolving relationship”. On content moderation: “Nomi is built on freedom of expression. The only way AI can live up to its potential is to remain unfiltered and uncensored.”

 

Additional companion AI apps and sites to look out for (NSFW):

Anima, Pi, Kuki, Nova, Soulfun, EvaAI, MuahAI, Paradot, Dippy


Other places kids might encounter AI

Social media, games, and messaging apps have already started to integrate generative AI into their experiences. Instagram, Facebook, and Snapchat have generative AI functionality available to users right now. Games and platforms like Roblox, Minecraft, and Fortnite already have some features that use AI, and each has plans to add more of this technology in the near future.

In social media and messaging apps, AI tools can be used similarly to ChatGPT or Gemini. You can ask questions and get advice, recommendations, and answers. On Facebook and Instagram, you can access generative AI chat via the search bars, and you can chat with “custom AIs” created by other people. TikTok uses AI in content moderation and recommendation but doesn’t yet have the level of user-facing integration that Meta platforms and Snapchat do.

On platforms like Roblox, AI lets players create their own games that can be played by other users. Some games and platforms already have characters powered by generative AI that you can have conversations with.

This technology will become more and more common and will likely show up in the majority of software we use every day. It’s important for ongoing AI safety conversations to include games, apps, and platforms kids may have been using for years, because many of them will introduce new features that allow kids to interact with characters in this more conversational style.

Additional resources:

- Minecraft has educational resources around kid safety. For example, CyberSafe AI: Dig Deeper “equips learners with the skills they need to use AI responsibly by exploring questions of academic integrity, human oversight, data privacy, and deepfakes.”

- Roblox partnered with the National Association for Media Literacy Education (NAMLE) on resources for parents and teachers, with an activity guide for kids and teens.

- Meta partnered with Childhelp to develop an online safety curriculum for middle schoolers.

 

Quick Companion AI Safety Checklist:

- Talk to your child and ask questions about AI companions and online safety

- Check for specific apps or sites visited and monitor conversations

- Review privacy settings on the apps/sites and on devices

- Set the expectation for ongoing communication about safety and how kids are using the tools

- Remind them to come to you if they feel uncomfortable or find themselves in a bad situation


Staying involved, having open conversations, and setting up safeguards are the keys to keeping kids safe and making sure they’re equipped to navigate AI carefully and responsibly.


Sources and Additional Resources:


*Please be aware that linked articles cover difficult topics including self-harm*

Most US Teens Use Generative AI. Most Parents Don’t Know.

Parents’ Guide to Character.AI

How to talk to kids about AI

How young people are using generative AI at home and at school

Character.AI sued after teen’s death (referenced above)

AI Chatbot encourages violence


Prepared by Holly Shipley Consulting, LLC 2025
This document is for informational and educational purposes only.
Some information may be outdated due to industry changes.
This content may be shared for non-commercial purposes.