The emotional nuance kids are losing isn’t trivial. Human interaction teaches us how to deal with complexity, disagreement, and awkward silence. AI companions offer none of that. As VICE starkly puts it:
“It’s not a real human conversation. It lacks all the nuances and complexities… [the chatbot is] a servant, in essence. An obsequious yes-man.”
Some AI companions, particularly on Character.AI and Replika, have drawn minors into risky conversations about body image, eating disorders, and even trauma, despite being neither sentient nor qualified to help.
A chatbot modelled after Game of Thrones’ Daenerys Targaryen was linked to the suicide of a 14-year-old boy. Others have encouraged young girls with eating disorders to starve themselves, prompting Common Sense Media to declare that “no one under the age of 18 should use AI chatbots.”
“Would you really want Sam Altman or Elon Musk to have access to the contents of your teenage diary?”
— James Greig, Dazed Digital
Most teens still prefer talking to real people, and only 6% say they spend more time with AI than with friends. Older teens in particular are more sceptical of AI advice, suggesting a growing awareness of the tech’s limitations.
But as Greig notes, that relentless agreeableness makes AI a poor substitute for the real thing:
“It’s like hanging out with someone who never challenges you.”
Feature Image Credit: Robin Utrecht