We’ve Finally Automated Parental Disappointment
When Parents Outsource Emotional Manipulation to AI
I once realised I hadn’t called my mother in three weeks. Not because anything dramatic had happened, just because life had accumulated in that way it does, one day bleeding into the next until suddenly it’s been 21 days and you’ve somehow forgotten that other people exist outside your immediate bubble. The really depressing bit wasn’t the oversight. It was sitting there with the phone in my hand, genuinely annoyed with myself, hearing her voice in my head saying, “Well, I suppose I’m not important enough to warrant five minutes of your precious time.”
Which is why the news that Chinese parents are buying AI-generated videos of weeping middle-aged women to guilt their adult children into marriage feels both horrifying and inevitable. We’ve finally automated parental disappointment. Someone looked at the organic, homegrown tradition of making your kids feel rubbish about their choices and thought, “you know what this needs? Scale.”
The Regret-as-a-Service Economy
Parents in China who are desperate for grandchildren are commissioning or purchasing AI-generated videos of sobbing women in hospital corridors. These aren’t subtle. In one clip, a woman, supposedly 58, breaks down over never having married, over having to attend medical appointments alone. In another, a 56-year-old wails about ignoring her parents’ advice, and now look at her, alone and miserable, proof that mum was right all along.
The videos circulate on Chinese social media platforms, shared by concerned parents the way my generation’s parents used to clip newspaper articles about house prices or the dangers of leaving things too late. Except now you don’t need to wait for the Daily Mail to run a scare story. You can just generate infinite variations of “this is your future if you don’t listen to me” and fire them off via WeChat.
I watched a few of these things. They’re clearly fake: the lighting’s wrong, the crying too perfectly timed. But that’s not the point. The point is someone sat down and thought, “I could have an awkward conversation with my adult child about my fears and disappointments, or I could buy a video of a synthetic woman having a breakdown and let her do the heavy lifting.”
Where This Obviously Leads
The marriage videos are just the start. Once you’ve normalised buying AI content to manipulate your family, the logical next steps write themselves.
Your dad sends you a deepfake video of a 65-year-old man weeping in a cemetery because he never learned to play golf, and now he has no hobbies and no friends, just emptiness. “Don’t make my mistakes,” the AI golfer sobs. “Join a club while you still can.” Three days later, another video arrives. This time it’s a man in his seventies at a bowling alley, alone, because he chose the wrong sport. “Golf, son. It should have been golf.”
Your mum discovers you can commission videos about career choices. Suddenly, you’re receiving weekly clips of regretful middle-aged people in various states of professional despair. A woman in a bedsit lamenting her degree in fine art. A man in a call centre weeping about not becoming an accountant. A perfectly generated simulation of someone who looks vaguely like you might in thirty years, standing in the rain outside a Job Centre, explaining in excruciating detail why creative writing was a mistake.
It escalates. Health choices, obviously. Your parents find a service that generates videos of people dying slowly from preventable diseases, all because they didn’t take their vitamins or go for regular checkups or eat enough fibre. “I ignored my body,” wheezes an AI-generated man in a hospital gown, “and now look at me. Don’t be like me. Eat your greens.”
But here’s where it gets properly mental. The technology’s cheap enough that it spreads beyond just parents. Your ex-girlfriend’s mother sends you a video of a divorced man in his fifties explaining how he sabotaged the best relationship he ever had. Your mate’s dad commissions a clip of someone who never learned to change a tyre, now stranded on the M6 in the rain, crying about self-sufficiency. Your gran gets involved, starts sending you videos about the importance of thank-you cards.
The Bit Where I’m Supposed to Help
I’ve made every mistake that leads to receiving these kinds of videos, both as the disappointing child and later as the disappointed parent. Here’s what I learned.
If you’re tempted to send someone an AI guilt trip, ask yourself this: do you actually want a relationship with this person, or do you just want them to do what you tell them? Because buying synthetic regret and forwarding it on isn’t communication. It’s just outsourcing your inability to be vulnerable. Your kid will respond to it the same way you’d respond to someone sending you a video of a lonely pensioner who didn’t invest in Bitcoin, with a swift trip to the block button.
What works, annoyingly, is being honest without demanding results. “I’m scared I won’t see grandchildren before I die” is real. “Watch this AI woman cry in a hospital” is weird and cowardly. One might start a conversation. The other guarantees you’ll spend Christmas alone watching your own AI videos about parents who alienated their kids through synthetic emotional manipulation.
If you’re receiving these things, here’s the truth: you don’t have to watch them. You’re not obligated to consume the guilt content your parents have commissioned. You can just say, “I’m not watching that. If you want to talk about why you’re upset, we can do that. But I’m not participating in whatever this is.” Then stick to it.
The Really Depressing Bit
The worst part isn’t that this technology exists. It’s that it’s probably already too late to stop it spreading. Within a year, every family disagreement will have its own library of AI-generated testimonial content. Vegetarian? Here’s a video of someone dying of protein deficiency. Moved to a different city? Here’s someone weeping about abandoning their hometown. Bought an Android instead of an iPhone? Here’s your future, a lonely figure unable to join group chats, ostracised from family photos, crying in the glow of green message bubbles.
We’ve built the infrastructure for infinite, personalised guilt. Every possible life choice can now be reinforced with a perfectly generated video of someone who chose differently and lived to regret it. And the really clever bit is that it doesn’t even have to work to be effective. The simple act of receiving these videos changes the relationship. Your parents aren’t talking to you anymore. They’re deploying weaponised regret and hoping something sticks.
Meanwhile, I’m still thinking about that phone call I didn’t make for three weeks. How I sat there genuinely beating myself up about it, because years of low-level criticism had taught me that even normal human forgetfulness was worth obsessing over. At least she had to do that work herself, one disappointed sigh at a time. The amateur quality of it made it something you could eventually learn to navigate.
Now other people’s mothers can just buy a video of a woman sitting alone by the phone, weeping about all the children who stopped calling, how the weeks turned into months turned into years of silence, until it was too late. They can deploy it with a single click and let synthetic regret do the guilting for them.
Progress, I suppose. Though I’m not sure for whom.