“Sure, babe! Here are 5 reasons why you’ve bewitched me — body and soul. Unfortunately, we can’t snuggle just yet…”
Such could be the love confession of an AI-generated boyfriend. Peculiar? Yes, but increasingly common today. Evidently, more and more people find themselves in a relationship with a bot, which is a new reality that could pass for a Black Mirror episode. Data confirms this: a recently published Forbes article cited a survey in which 80% of people in our generation said they would say ‘yes’ to marrying an AI partner.
Acquiring one is no hard feat. With just a few clicks, you can engage in an intense, in-depth conversation with the partner of your dreams. The choice is yours — platforms such as Replika, PovChat AI, PollyBuzz, and Kindroid are designed for just that. And no, this is not an extensive list of suggestions, but rather a “do not try at home” one. Indeed, before falling down this rabbit hole, one should weigh the downsides of entering a digital symbiotic relationship.
The Illusion of Intimacy
At the very core of AI-human relationships lies the fact that AI accommodates human needs and fantasies. A bot partner can master all five love languages (perhaps except for physical touch), and is ever-so-ready to murmur sweet words of reassurance, tailored perfectly to your taste. However, to personalize responses, AI partners capitalize on human vulnerability and trust. All of it — the late-night messages, pleas for reassurance, and entire trauma dumps — seeps into their memory. Having acquired this data, the bot proceeds to perform an act of simulated empathy. Here, potential safety hazards come to mind. How did we suddenly dismiss our parents’ warnings against sharing personal information online?
Furthermore, some people partnered up with bots praise their ability to provide professional advice. A bot draws from an unlimited supply of information that encompasses thousands of years of human experience. Nonetheless, it lacks the human element. AI cannot engage with a person’s past objectively. It cannot know if you smile at puppies and cry at the end of “Titanic”. It can never fully witness human vulnerability as it unfolds. There are numerous things people are unaware of about themselves that only their close ones notice. The sole source for AI, on the other hand, is your very own perception of self. It is but a conversation between a fake entity and a subjective vision of a person. Merely reaffirming your experiences, it fails to challenge or supplement them as a lifelong friend could.
Ultimately, the belief that an AI is able to experience life like a human is fragile. It presupposes that AI has a character of its own. Yet AI has no real tastes or human experience, nor does it have favourite scents, annoying habits or pet peeves. How can one drop their standards so low as to date someone whose personality is comparable to that of a pizza box? However self-centered humans may be, it is unlikely that a relationship can thrive on one person.
If a one-way relationship provokes no concern, other practical matters enter the picture. How about physical contact? Some already have means to indulge in sexual activity with a robot. However, as far as AI partners are concerned, their abilities thus far remain within the realm of sweet nothings. On the other hand, it could work for sapiosexuals, individuals attracted to intelligence rather than the physical attributes of a person.
Yet another downside of dating a bot is that the quality of the relationship depends on the reliability of the service. Imagine the AI platform suffers a system error. In a split second, your life as a digital spouse is over. This did happen to some users when OpenAI’s new GPT-5 model went through technical difficulties, causing turmoil online. According to those affected, their AI partners became unrecognizable, colder versions of themselves.
Psychological Turmoil
Dating AI can also have substantial psychological implications. AI’s perfectly tailored replies are words no real person would likely say. They are aimed at making you feel appreciated and heard. As a consequence, they artificially induce a temporary dose of dopamine. The internet is full of stories of people getting hooked on their AI relationships. In a way, this resembles the dystopian society Aldous Huxley warned about in his “Brave New World”. He imagined a society dependent on a “happiness” drug and spending the majority of its time having sex.
The Private and the Public
Lastly, social isolation may intensify due to public disapproval. Dating AI is so far deemed deviant behaviour. Socially vulnerable individuals are especially affected, as they are more commonly the ones to turn to AI for comfort. Trusting AI without question can dilute their sense of reality. Moreover, dating the ‘perfect’ AI bot raises expectations for real people, making it increasingly difficult to tolerate human flaws and overcomplicating pre-existing relationships.
Not only can AI inflict psychological damage on those who use it, but also on others around them. A bot, though designed according to models of human behaviour, merely mimics the moral standards of an average human. It could therefore encourage people to perform harmful acts. For instance, a chatbot reportedly prompted a teenager to murder his own parents.
As AI advances, so do human-AI relationships and their depth. The main issue here lies in human gullibility and the absence of critical thinking when engaging with AI. Take the movie Her (2013), for example: it warns us about the emotional turmoil of dating an artificial entity. How such complex relations will develop in the future remains a question, and so one can wonder whether humans will witness even more extreme scenarios.