AI Companions and Your Child's Privacy
We're Safe Space Online, a newsletter about keeping kids safe in the digital world. Every issue aims to empower parents, guardians, and educators with the knowledge and tools they need to protect children from online dangers.
SAFETY TIP OF THE WEEK
While many kids use AI Chatbots purely for entertainment, AI companions are still built to collect personal data from their users. Encourage your kids to prioritize in-person relationships with their peers, and remind them to be careful about what they share with AI companions.
TODAY’S TOPIC
AI Companions and Your Child’s Privacy
If you’ve seen the 2013 movie Her, in which Joaquin Phoenix plays a man who falls in love with an AI voiced by Scarlett Johansson, you may have wondered whether anyone could really build genuine companionship with an AI Chatbot. A rising trend among adolescents is exactly this: the use of AI Chatbots as companions. Kids can ask AI Chatbots questions and get responses that demonstrate a care and interest in their lives that they may not get from their friend groups.
While many kids only use Chatbots to explore their capabilities or to try out entertaining prompts, this conversational output can become addictive. Some kids have admitted to asking Chatbots questions that they would not ask their parents or friends, which can allow the Chatbot to occupy a place of informational and emotional need in a child’s life. Additionally, the information a child gives to the Chatbot, whether for entertainment or emotional support, is collected by the company providing the Chatbot. That information may then be shared with or sold to other tech companies, which can use it to create targeted content and advertisements for you and your child. In one widely reported recent case, a teen discussed suicidal thoughts with a chatbot companion over an extended period before taking his own life.
WHAT CAN YOU DO?
A child who turns to an AI Chatbot for emotional support or advice may feel ashamed about whatever it is they need that support or advice for. Whether they want to know more about romantic relationships or their own mental health, shame and social pressure make it especially challenging for kids to be open with their parents and peers about their needs. If a child becomes embarrassed that they are asking the Chatbot serious personal questions, they will be more inclined to seek out their AI companion in secret, where parents can’t monitor their behavior with the Chatbot.
Keep an eye out for the following behaviors, as these may indicate that your child is building a relationship with an AI Chatbot:
They Rely on AI or Social Media for Socializing: Ask your child about their friend groups and make space for them and their friends. If you find that your child does not have many friends or spends more time using social media or the internet than building healthy, in-person relationships with their peers, they may be seeking companionship from Chatbots.
They Become Withdrawn from In-Person Social Activities: If you notice your child prefers to be alone most of the time and fiercely guards their time alone, they may be hiding conversations they have with AI Chatbots.
They Display Contempt for or Anger at In-Person Relationships: If your child seems negative and jaded about the idea of making new friends, they may be filling their social needs with an AI Chatbot.
If you notice any of these behaviors in your child, it is still possible to intervene. Below are some tips for having conversations with your child about their use of AI Chatbots.
Have Open and Honest Conversations with Your Child about Chatbots: Know that most kids aren’t building earnest relationships with Chatbots, but most kids do engage with them. Remind your child that the answers a Chatbot produces in response to a prompt are not always accurate or true. Chatbots generate responses that sound plausible and confident even when the underlying information is wrong, so your child should double-check anything important with a trusted source. You should also remind your child that Chatbots are collecting their information: a conversation with a Chatbot is no more private than a conversation they might have with a friend or parent in confidence.
Establish a Safe Environment for Dialogue: If your child feels ashamed of some aspect of their life and goes to a Chatbot to help get answers, they may need to be reminded that you are a safe person to talk to about any troubles they are having. It might be awkward to talk about romantic relationships or mental health if you don’t feel equipped to give them good advice. You don’t have to be an expert, but listening to their perspective can help you both find a solution that is safe and positive for your child.
Try Not to Give Entirely Negative Feedback: You may disapprove of your child’s romantic interests or their appraisal of their own mental health. However, responding with only criticism is likely to alienate them and break down the safe space you want to build for tough conversations. Try to understand where they are coming from, and offer positive feedback as well as advice. Acknowledge that their experience of the world is different from yours and will continue to be. As their parent, your perspective carries the wisdom of experience, but remember that kids today are growing up with technology you may never have had such access to.
▶ Slang Word of the Week: “Cringe” adj. – used to describe an act or a situation that makes one uncomfortable. It is often used when someone says or does something that is out of touch with what is cool.
