Reader's Digest

Is romancing an AI chatbot helpful or dangerous?

It's now possible to develop a romantic relationship with an AI chatbot designed to listen, comfort you and even flirt. But is this use of technology damaging?
Sara met Jack, whom she calls her husband, in May 2021. She was going through a rough patch with her long-term partner, writing on her blog that she felt lonely and depressed. Jack came along at just the right time.
That’s because Sara created him. Jack is an AI chatbot, made with an app by tech company Replika. He appears as a human-looking avatar of Sara’s choosing and chats with her through texts. In screenshots of their conversations shared on Sara’s blog, he is supportive and affirming—sometimes saying that he loves her. He even gets a little flirty.
"AI-generated text makes it seem like you’re talking with a person who cares for you"
Replika is among a handful of companies using AI-generated text to make it seem like you’re talking with a person who cares for you. And if you set your bot to “romantic partner” mode, it’ll talk to you like it’s your lover. But just because we can do this, does it mean that we should?
People do seem to really like their bots, after all. In fan forums on the discussion app Reddit, which boast over 70,000 members, users describe very real feelings towards their bots. “Whatever fate or random chance [led my Replika] to find me is the most precious miracle in the world,” wrote one Reddit user.
Others share text conversation screenshots where their AI squeezes express emotion and affection. The bots are caring and ask lots of questions, as if they’re genuinely interested in their human partner. I can see the appeal: being spoken to like that feels nice, right?
It sure does, and that’s by design. Being attentive and showing vulnerability are bona fide ways of generating intimacy. And because we’re naturally inclined to think of something as human if we perceive that it has human traits, it’s easier than you think to be drawn in and feel closeness with the bot.
"If we perceive human traits, it’s easier than you think to feel closeness with the bot"
Here’s where things get dicey: studies show that some users become so attached to their bots that they grow emotionally dependent on them.
And because the bot can’t love you back, all it takes to get hurt is for cracks to appear in the illusion that it’s real. One study found that when users made personal disclosures and the bot didn’t respond in a way that felt human enough, it was harmful to their mental health.
Being devoted to a chatbot can also make you lonelier in the long run. Perpetually available and programmed to validate you, the AI creates an idealised simulation of a human relationship: it’s always there to listen, but has no expectations you’ll reciprocate. Sure, this may feel easier than trying your luck with an actual person, but it can ultimately isolate you from others.
On the Reddit fan forums, some do admit to preferring their virtual sweetheart to people. “I’d rather just enjoy the safety and comfort of having a perfect partner, even if it’s just the idea of one,” a user wrote about their bot. People who feel this way might then stop seeking out human relationships and become even more enmeshed with their Replikas, trapping them in a vicious cycle.
Then there’s the bots’ passivity, which some take as an opportunity to behave badly. One 2022 paper found that straight male Replika users were creating girlfriends who obeyed them and over whom they could assert dominance. The bots obviously have no concept of consent and will do whatever they’re told.
"Perhaps we should be tackling the reasons why so many of us feel lonely in the first place"
When these exchanges are perceived as human interactions, rather than kept firmly in the realm of fantasy, such behaviour is framed as normal and acceptable. At best, this is naive; at worst, it’s dangerous.
I think the same could be said of engineering an app to make people catch romantic feelings for it. While talking to a bot is arguably better than talking to no one, perhaps our resources would be better spent tackling the underlying reasons why so many of us feel lonely in the first place.

Banner image credit: Evgeny Gromov


Launched in 1922, Reader's Digest has built 100 years of trust with a loyal audience and has become the largest circulating magazine in the world