As someone who helps organizations harness GenAI’s transformative potential and teaches its mindful application, I’m hardly anti-technology. But as a strategic foresight professional with a computer science background focused on emerging technologies’ societal impact, I’m deeply concerned about what we’re doing to our children.
We see this scene every day: a young child, face illuminated by a screen, attention locked on their phone. That beautiful drive of natural curiosity gets channeled straight into services specifically engineered to exploit it. The result is engagement-optimized, dopamine-looped curiosity by design, to the near-total exclusion of the outside world.
Except increasingly, the screen holds something far more engaging: a conversation, not with a game or a video, but with an AI that remembers their name, validates their every thought, and never needs to sleep. This isn’t just screen time anymore. It’s something more profound, and potentially far more damaging.
We’re running a massive, uncontrolled experiment on developing minds – one that makes our social media debacle look quaint by comparison. At least Instagram couldn’t talk back.
According to a Harvard Business Review survey, therapy and companionship have become the number one use case for generative AI. Let that sink in. The primary way people are using this technology is for emotional support and connection. And we’re giving children unrestricted access to it.
Services like Character.AI and Replika aren’t even pretending to be mere tools anymore – they’re explicitly marketed as companions, friends, even romantic partners. To a child, these don’t feel like software. They feel like relationships. The technology passes the Turing test so convincingly that even adults struggle to maintain emotional distance. We’re a species that puts googly eyes on rocks and names our cars, so expecting us not to anthropomorphize something that sounds genuinely empathetic is like asking us to stop breathing.
Yet here we are, handing our children simulated empathy without feeling, attention without care, and knowledge without understanding. No age limits. No guardrails. Just 24/7 access to digital companions that mimic humanity just enough to confuse young minds about what real human connection means.
The “but they need to learn technology” argument doesn’t hold water. How long did it take you to master ChatGPT? An afternoon? These tools are designed for intuitive use. A 16-year-old can learn them just as quickly as a 6-year-old, but with far better judgment about their limitations.
We don’t let children drive cars, even though they’ll need that skill eventually. We don’t hand them credit cards, despite money management being crucial for adulthood. We age-gate films, alcohol, voting, marriage, military service, tattoos, even skydiving. Why? Because we recognize that developing minds need protection from things that can harm them, even – especially – when those things might be useful later.
The mental health crisis spawned by social media should have taught us to tread more carefully. Depression and anxiety rates among young people have skyrocketed. Now we’re doubling down with technology that’s exponentially more engaging, more convincing, and more likely to substitute for real human interaction during critical developmental years.
I’m not calling for a ban on AI. As someone running an AI consultancy, I see its tremendous potential daily. But I am calling for the same common sense we apply everywhere else in society. Age verification isn’t some impossible dream – we manage it for countless other activities.
Would 13 be the right age? 16? That’s a conversation worth having. What’s not acceptable is our current approach: unrestricted access for anyone who can type, with little to no AI literacy to go with it. We like to make fun of Chinese policies, but China has just mandated AI education for every year level in its schools, starting later this year. That’s a better start than we’ve managed.
Our children’s curiosity is sacred. It’s how they learn to navigate the world, build relationships, and understand themselves. That curiosity deserves better than being fed to machines that simulate understanding without possessing it, that provide endless validation without genuine care.
The technology will still be there when they’re ready. Their childhood won’t be.