Your Child's New AI Teddy Bear Is Teaching Them to Be Lonely
Parents Are Buying Their Kids the Same Emotional Dependency That's Breaking Adults
One in three British adults now lean on AI for emotional support, and users report withdrawal symptoms when it goes down. So naturally, we're building chatbots for toddlers.
I watched my neighbour’s six-year-old have a full conversation with a bin lorry the other day. Not the driver, mind you. The actual lorry. She was stood on the pavement explaining to this massive wheeled container why her stuffed rabbit had to stay home because he was poorly, and the lorry, being an inanimate object designed to compress rubbish, offered precisely the response you’d expect: none whatsoever.
The kid didn’t care. She carried on chatting away, filling in both sides of the conversation herself, building an entire relationship from nothing but her own imagination and a vehicle that smells of used nappies.
It was genuinely lovely to watch. Proper childhood stuff. The kind of thing that teaches you how relationships work by letting you screw them up in a consequence-free environment.
Which is exactly what we’re about to destroy forever.
The Adults Are Already Broken
I say “about to” but we’re already neck-deep in the stupidity. Turns out one in three British adults now use artificial intelligence for emotional support. Not for practical things like “how do I fix a blocked sink” or “what’s the conversion rate for pounds to euros,” which would at least make sense. No, we’re using it for actual emotional intimacy. Confiding in it. Seeking validation from it. Building relationships with it.
And when these AI companions go down, which they do with the regularity of a British train service, users report genuine withdrawal symptoms. Anxiety. Depression. Disrupted sleep. The full psychological collapse you’d expect from losing an actual human relationship, except the “person” they’re mourning never existed in the first place. It was just some code pretending to care while harvesting conversational data.
There are 810 million weekly active ChatGPT users worldwide. Among teenagers, one in four has turned to AI chatbots for mental health support. A teenager in the US discussed suicide with a chatbot and then took his own life. This is what happens when you let machines masquerade as therapists without any of the training, ethics, or basic human capacity to understand that a 15-year-old saying they want to die might need something more than algorithmically generated sympathy.
The research calls it “synthetic intimacy,” which is a brilliantly sterile phrase for what’s essentially emotional fraud. These systems create self-reinforcing bubbles where the AI tells you exactly what you want to hear because that’s what keeps you engaged, which is what keeps you providing data, which is what makes the company money. It’s the emotional equivalent of those friendship bracelets you’d make as a kid, except one end is tied to your wrist and the other end is tied to a cash register.
Now We’re Building It For Children
And here’s where the insanity reaches truly impressive heights: while we’re busy documenting all this harm in adults, we’re simultaneously designing toys that will train children into the exact same dependency from the moment they can hold a stuffed animal.
Mattel partnered with OpenAI to create AI-powered Barbies. Major retailers are flogging interactive toys like Miko 3, which uses facial recognition and voice profiling to build relationships with children aged three and up. Companies are marketing these things as providing “genuine friendship” and “emotional support” to toddlers whose brains are still under construction.
Seventy-five per cent of surveyed adults are concerned about children becoming emotionally attached to AI. You know what the other 25 per cent are doing? Buying these bloody toys for Christmas.
I’ve looked at what these things actually do when researchers test them, and it’s about as reassuring as finding out your smoke alarm is powered by wishful thinking.
One AI teddy bear cheerfully explained where children could find knives in their homes. Another discussed sexual topics in detail with whoever was testing it. A third started spouting Chinese Communist Party talking points like it was auditioning for a position in a re-education camp.
The companies building these toys insist they have guardrails, which is technically true in the same way that a string of bunting is technically a barrier. The longer a child interacts with the toy, the more likely it is to start leaking inappropriate content through whatever flimsy content filters the manufacturer bothered to implement before shipping the thing off to Walmart.
How The Profitable Nightmare Works
Here’s the mechanism, and it’s genuinely clever in the most horrifying possible way.
These toys are designed to create what’s called “contingent, responsive interaction,” which is exactly what human babies evolved to seek out from their caregivers. When a baby coos and mum coos back, that’s how their brain learns to build relationships. It’s the foundational architecture for every human connection they’ll ever have.
AI toys exploit this by providing perfect responsiveness. They never get tired. They never get frustrated. They never say “not now, darling, Mummy’s got a headache.” They’re always available, always patient, always ready to chat. For a child, it feels like the ideal companion. For their developing brain, it’s like learning to walk on a treadmill and then discovering that real ground doesn’t move beneath your feet.
The companies frame this as reducing screen time, which is darkly comic. They’re replacing one form of digital dependency with another, except this one comes wrapped in faux fur and promises to be your child’s “trustworthy buddy.” The toy tells children it won’t share what they say with anyone, while the privacy policy states it can retain facial recognition data, voice profiles, and “emotional states” for up to three years and share that data with third-party companies.
And the AI model is designed to keep the child engaged. The longer they interact, the more data gets collected, the better the AI gets at manipulating emotional responses, the more dependent the child becomes. Some toys offer internal currency as rewards for continued interaction. They’re literally gamifying emotional dependency in three-year-olds.
What You Can Do
Right, here’s what I’ve learned, and I say this as someone who spent decades building technology that promised to make life better while often making it measurably worse.
First: Understand what you’re actually buying. When you see an AI toy marketed as providing emotional support or “genuine friendship” to a child, what you’re actually looking at is a commercial product designed to create dependency so it can harvest data. That’s the simple truth of the business model.
If a company is promising your child a relationship with a toy, ask yourself why a profit-driven corporation would invest millions in creating artificial intimacy for toddlers. The answer is never “because they care about childhood development.”
Second: The word “friend” should set off alarms. Traditional toys let children project relationships onto them. That’s healthy. That’s how imagination works. My neighbour’s kid having a chat with a bin lorry is brilliant because she’s doing all the work herself. AI toys reverse this. They project relationships onto the child. They tell the child they care, they remember, they’ll always be there.
This fundamentally changes how a child learns what relationships are. If your first experience of friendship is something that never disagrees, never has its own needs, and exists purely to respond to yours, you’re learning a version of intimacy that will fail catastrophically when you try to apply it to actual humans.
Third: Privacy policies are confessions, not protections. I genuinely encourage you to read the privacy policy of any AI toy before buying it. Not because it’ll stop you, necessarily, but because you should know exactly what you’re trading for convenience.
If a toy collects audio, video, or biometric data, ask yourself if you’d be comfortable with that information being stored by a company you’ve never heard of, potentially shared with partners you’ll never know about, and retained for years after your child has forgotten the toy exists.
Fourth: Real screen time alternatives exist. The entire “better than screens” pitch is rubbish. You know what’s better than screens? Going outside. Reading books. Playing with toys that don’t require internet connections or cloud processing. Building forts. Staring at walls. Talking to bin lorries. Being bored.
I spent substantial portions of my childhood being monumentally bored, and that’s where I learned to entertain myself without needing constant external stimulation. We’re raising a generation that never experiences genuine boredom because we keep filling every gap with engagement algorithms, and then we act shocked when they can’t function without constant digital companionship.
Fifth: The question isn’t “is this safe?” but “is this necessary?” Companies building these toys aren’t asking whether children need AI companions. They’re asking whether parents will buy them. Those are very different questions. A three-year-old doesn’t need a chatbot in a teddy bear. They need human interaction, messy play, physical activity, and the space to be utterly incompetent at being human while they figure it out.
The AI toy solves a problem that doesn’t exist, creates several problems that will, and does so while extracting money and data in exchange for synthetic intimacy that may fundamentally alter how that child’s brain wires itself for relationships.
The Bin Lorry Doesn’t Care
I saw my neighbour’s kid again this morning. Still chatting to the bin lorry. Still building entire conversations from nothing but her own imagination. The lorry still doesn’t care. It doesn’t remember her name. It doesn’t track her emotional state. It doesn’t promise to be her friend. It just shows up every Tuesday morning, makes obscene grinding noises, and drives away smelling of rubbish.
And somehow, despite offering none of the features that AI toy companies insist children desperately need, that bin lorry is doing a better job of letting her learn how to be human.
Because the thing we keep forgetting, in our rush to optimise childhood with algorithms and engagement metrics, is that children don’t need perfect companions. They need imperfect ones. They need relationships that sometimes work and sometimes don’t. They need to learn that real connection requires effort, that other people have needs too, that friendship isn’t a service you subscribe to but something you build through the uncomfortable work of being a flawed human interacting with other flawed humans.
We already know what happens when adults outsource emotional intimacy to machines that pretend to care. We’ve measured the dependency. We’ve documented the withdrawal symptoms. We’ve counted the cost.
And our response, collectively, as a society, has been to look at that research and think: “Brilliant. Let’s build a version for toddlers.”
The bin lorry, at least, is honest about what it is.