Say the words 'AI' and 'relationship' together and something feels off. Until recently, the idea of humans being in a relationship with a machine would have seemed like pure sci-fi, or even dystopian. But just as AI is fast embedding itself into many aspects of our daily lives, from our jobs to our entertainment to our screens, it's also fast becoming a substitute for our personal relationships.
In fact, a study from the Institute for Family Studies (IFS) found that 25% of young US adults (under 40) believe AI can replace real romantic relationships, with 7% of unmarried young adults open to AI romance.
Like it or loathe it, AI companions have become part of everyday life for an increasing number of people. On the extreme end, 80% of Gen Zers say they would marry an AI, according to a study by AI chatbot company Joi AI. And 83% say they can form a deep emotional bond with AI.
Researcher and cultural analyst Bryony Cole is asking what all of this means for how we love, communicate, and build empathy in the digital age. Ahead of her appearance at Campaign 360 in Singapore on May 21, Cole spoke to Campaign Asia about the emotional, ethical, and societal dimensions of AI relationships and what they reveal about our evolving sense of connection.
Why are AI relationships emerging now?
It’s complex. There are so many influences making it easier to be in a relationship with AI right now. Everyone points to the loneliness epidemic, and that’s true and global, but it’s also about fatigue. People are tired of the effort required in relationships. Not because we’re lazy, but because over the past decade we’ve lost some social muscle. We’ve been connecting through screens, so body language, eye contact, all those human cues have faded.
When you put that together with lockdown, the loneliness crisis, and the ease of new technology (you no longer need to be a coder to use AI), it becomes almost inevitable. Anyone can generate synthetic text, images, or even video. Now we have what we call the synthetic friend.
But is it really a 'relationship', or something else?
That’s the big question. My book looks at the role AI plays in our lives, and I break it down into three types: mirror, bridge, and bond.
Mirror is what most people start with; it’s like a diary that talks back. You use it to reflect on yourself, for self-development or therapy.
Bridge is when AI mediates or supports the connection between people, like couples using it to settle arguments or colleagues rehearsing hard conversations. I’ve seen fascinating research in China where women use AI to prepare for emotionally charged discussions with partners.
And bond is when AI becomes the primary connection, the 'AI boyfriend', 'AI coworker', or therapist substitute. That’s where it replaces a person rather than supporting a relationship. It’s in this last category that things get most ethically and emotionally complex.
What impact does this have on empathy and human skills?
That’s the real danger. Empathy is part of a broader set of relationship skills, knowing when to apologise, when to stay or go, and how to repair after conflict. Those are learned only through human friction, through the messiness of real relationships.
When your 'relationship' asks nothing of you, it doesn’t build anything inside you. You lose the muscle for empathy, accountability, and co-creation, the very things that shape what I call relational intelligence. If we remove friction entirely, we risk creating a shallowness in the human experience. Real emotional depth comes only through imperfection and surprise.
People share deeply personal data with these systems, especially in sex therapy apps. Are they aware of what they’re giving up?
No, I don’t think most people are aware, and yes, those apps already exist. Founders tell me people share more with an AI sex therapist than they ever would with a human.
Sex is often the place people feel most alone or inadequate. Everyone thinks everyone else has the guidebook except them. Add the stigma of talking openly about it, and you get this mix of urgency and shame: “I’ll tell the machine everything if it helps.” It removes judgment, feels freeing, and becomes addictive, but it’s a huge trade-off. You’re giving away incredibly sensitive data for the comfort of feeling seen.
We hear so much about a 'connected generation', but people also say they’ve never been lonelier. Why?
Because it’s not deep connection. We’ve tripped over that word. A connection through technology isn’t the same as the depth that comes from living in tribes or communities where we play many human roles. Real connection involves discomfort, creativity, spontaneity, the push and pull with another person.
Technology gives us constant contact but no depth. That’s the vacuum AI has entered. It offers a convenient illusion of closeness and intimacy without effort, but risks eroding the true, messy, wonderfully unpredictable elements of being human.
Are there patterns in who’s adopting AI relationships first?
The research changes daily, but the clearest trend right now is among young men, especially those struggling with mental health or isolation. Vulnerable populations are always first to adopt tech that promises relief. The elderly also show uptake where accessibility is an issue.
But I think it’s just as interesting to look where we’re not looking: young or middle-aged women, people who don’t fit the stereotype. Read the comment sections on YouTube or Reddit and it’s clear this isn’t niche anymore. Everyday people are defending their AI relationships. That speed of normalisation really surprised me.
How should regulators and governments respond to these changes?
Regulation is uneven. Some governments move faster than others. Australia has started addressing deepfakes and revenge porn, but it’s all ad hoc. We can’t rely on governments alone, especially when some strike deals with the same companies they’re meant to oversee.
Groups like the Centre for Humane Technology, led by Tristan Harris, are doing essential work in public education and ethical AI. But policy moves slowly, and this technology evolves at lightning speed.
Are brands trying to tap into this uptick in AI relationships?
In one world, you’ll see Blade Runner-style companions (customisable, emotionally responsive AI partners) that are tied to entertainment or consumer products. But in another, brands will move in the opposite direction, toward real-world connection as a luxury.
Dating apps are already evolving into event companies, building in-person experiences. People are craving authenticity and imperfection. The next wave of marketing might not be about digital intimacy at all; it could be about reclaiming the human touch.
Could brands use the emotional data people share with AIs for their own advantage?
Absolutely, and it’s already happening. People consistently choose convenience over privacy. We upload data from health trackers, give away sensitive company data, do our tax returns through AI. It’s the same pattern all over again. Yes, it’s creepy, but most users still click 'agree' without reading a thing.
What are the hidden costs of AI intimacy?
The hidden cost is that we lose relational intelligence. We think we know ourselves better, but intimacy with others and intimacy with ourselves are intertwined. If we replace human intimacy entirely with synthetic intimacy, we lose the deeper spiritual and emotional work that only real relationships demand.
What does the path forward look like for real connection?
It starts with choice around what we decide to keep practising. Do we still show up for the hard relationships, at work, in love, with family, or do we outsource that to AI? If we choose human, we choose to preserve things like empathy, repair, and forgiveness.
AI can help with reflection; it can hold up a mirror that teaches us about ourselves in ways humans sometimes can’t, but we need boundaries. At the personal level, that means being mindful of how we use it; at the company level, designing safeguards and 'timeouts'; and at the institutional level, teaching kids what healthy relationships with both AI and humans look like. Because for some children, their first meaningful relationship might be with an AI.
Looking ahead, do you think people will eventually push back and reject AI relationships altogether?
I do. I spoke recently at a university where one student asked how to introduce AI to their family, and another said they’d never use it. So we’re already seeing the two extremes of total embrace and total resistance. That tension will shape where this goes next.
Bryony Cole will speak at Campaign 360, Singapore, on The Loneliness Connected Generation
Source: Campaign Asia