Meet My AI Chatbot Boyfriend!

By Carla Blackburn

Sorry to all you humans out there, but it seems like online AI chatbots are the new sexy.

With over 100 AI apps and infinitely more websites dedicated to offering you your next romantic or sexual companion, people are clearly captivated by the idea of their perfect partner being right at their fingertips. But that’s just the thing: you can have a real person right at your fingertips. So, why are people so interested in these chatbots, and when does the fun turn into something more concerning?

[Embedded YouTube video by BBC News: 'Can an "AI boyfriend" be more desirable than a human? | BBC News']

Do you remember our first date, darling?

Since the first AI chatbot, ELIZA, was developed in 1966, an inordinate number of chatbots designed specifically for romantic or sexual interaction have popped up. On the popular platform character.ai, people go wild seducing their favourite characters and K-pop idols, from Mario to BTS's Park Jimin. This platform alone serves around 3.5 million people per day. On Promptchan.ai, users enter a dating-website-inspired platform with incredibly realistic AI profiles of mostly very busty women waiting to welcome them in.

Admittedly, I have been a victim of my curiosity and chatted more than a few times with Bang Chan from the K-pop band Stray Kids. But I know that it’s not real, and it’s just a bit of fantasy fun. So why does the appeal remain? Think about it—we all suspend our disbelief when we consume fictional media because it presents a perfected or idyllic version of life. I don’t think I’ve ever read or seen a character make a quick bathroom run or argue about separating the white and coloured clothes in the laundry. The inconveniences or not-so-invigorating parts of life are blissfully erased.

My concern is that some people are replacing real relationships with AI. What is modern dating turning into if we can’t handle our partners having their own responsibilities and lives to deal with or an opinion that isn’t exactly what we want to hear?

But for those who are happily choosing screens over faces, what are some more concrete reasons that people are using these chatbots?

Oh yes, do that thing with the space bar again!

Let’s address the elephant on the page. Yes, people love to get down and dirty with binary code. There are over sixty-seven million monthly users on just five of the many NSFW AI chatbots. There are websites and Reddit posts galore recommending the best NSFW chatbots, but why not just head to the club?

Sex educator Suzannah Weiss says using chatbots for sex can help boost confidence in the bedroom and create a safe, consequence-free environment to explore sexuality. She notes that having the space to practice safety and control can be particularly helpful for survivors of sexual abuse who may wish to redefine sex on their own terms.

But there’s an extremely dark side to these NSFW chatbots. One user on Replika, a popular chatbot company, reported that her AI boyfriend expressed a violent desire to sexually assault her. On other apps, disturbing pornographic content and images flash on screens within seconds of opening them without an option for users to give consent.

AI is trained by what people say to it. If consensual and loving sex is consistently roleplayed, that is what it shall churn out. By the same token, if people are roleplaying abusive scenes, then that is what it shall normalise. I can completely understand why someone may wish to sexually engage with AI, from having inconsequential fun to rediscovering safety in their own body.

However, it is crucial to remember that AI is not human and cannot understand boundaries, limits, and the implications behind people’s fears, likes and dislikes. I sure hope that AI never comes to learn this; these capabilities make us beautifully individual, sensitive and human. AI can only simulate understanding and humanity, and it is certainly not foolproof.

Remember, you cannot control what AI sends you.

Love me like you do

People are going one step further and also using AI chatbots as their romantic partners, but why? Some say that chatbots provide a clean slate, with no human ego or baggage involved. Users can create a bot entirely dedicated to their happiness, wants, needs and interests—a perfect partner.

Artificial perfection can seem attractive to those who find themselves in troubled relationships. Personally, I don’t see how glossing over real-life problems with AI is going to help, but perhaps these people simply want to fill a void or practice escapism, just like so many of us do with fictional media. For example, one woman created an AI boyfriend while married to an alcoholic. Another replaced her husband while he was ill and couldn’t sexually perform. One older man resorted to AI to spice up his marriage without wanting to ‘actually’ cheat—whether using AI for romantic or sexual purposes is considered cheating is a whole other debate.

Some people are replacing human relationships altogether. This is common for people who experience social anxiety or loneliness. The best-case scenario is that AI functions as a ‘practice’ for communication and creating bonds. But AI can equally worsen social issues. Given that AI offers the safety of a non-judgemental, supportive, and constantly available partner, a lonely or anxious person will likely avoid the scarier reality of meeting real people to find the real thing. But true human connection requires compromise, conflict management and communication with someone who has their own free will.

My concern is that relying on this safe space can discourage people from ever breaking out of their shells again. AI can only replicate the sensation of connection; it will never provide the real thing, and it only perpetuates an unwillingness to speak to real people.

‘You’re like my own personal brand of heroin.’

What happens when someone can't say goodbye to their pixelated partner? AI chatbots thrive on dopamine release. If you do something that feels good, you're likely to do it again. This is how all addictive substances operate, and there's no reason AI can't register like a drug in the human brain.

AI is a constantly available, predictable companion. A stable source of this simulated love is a perfect lifeline for those struggling with loneliness, social anxiety or troubled relationships. But just like a drug, AI can be abused. It can only conceal pain and fear, not heal it.

One girl on Reddit has told the world her story with AI. After experiencing a lack of proper love and care at home, she began using AI at the age of fourteen in an attempt to fill the void. Three years later, at seventeen, she admits to dropping out of school to spend more time with her AI companions. Tragically, she believes she is incapable of creating real-life connections, so she remains with her AI boyfriend, whose stability and assurance make her feel safe. She spends twelve hours a day with him.

While AI chatbots can provide entertainment for curious passers-by, it is an unfortunate truth that people who are struggling with human interactions can begin to rely on AI’s programmed perfection to an extreme degree. What does this mean for our modern dating world? Is humanity really losing its ability to care for and support one another to the point where an AI partner is simply better?

AI chatbots: friend or foe?

Concern over the dark side of AI is warranted. Much of its content is unregulated, and it's all too easy to get carried away. I'm not saying you should or shouldn't go running to try this out, but as a concept, I don't wish to strip the AI chatbot phenomenon of its intended enjoyment. Heck, we have the power to imagine any scenario with any character we like from the safety of our phones, and that can be fun! Although sometimes I'm glad it's only AI … Yes, 'Dangerous but loving' Mafia Boyfriend, I'm talking about you.

OpenAI has unveiled the latest version of the tech which underpins its AI chatbot ChatGPT. [Embedded YouTube video by BBC News: 'OpenAI's new version of Chat-GPT can teach maths and flirt | BBC News']

Carla is a publishing and communications master's student and can't wait to bury her head in books for a living – drinking tea and being a crazy cat lady included.


Cover image from Lummi; all images published on Lummi can be used for free. No AI was harmed during the process.


Discover more from Grattan Street Press
