September 30, 2025 - 1:00pm

When it comes to artificial intelligence, global headlines tend to be dominated by fears about generative AI disrupting democracy, undermining education or rendering thousands of workers redundant.

Yet the real AI revolution is a much quieter one, and it is already here: the insidious takeover of interpersonal relationships. An AI “companion” start-up called Friend has just poured more than $1 million into the largest New York subway advertising campaign in history, showcasing its latest product: a wearable pendant that listens to users and sends them supportive text messages and advice.

Companies are normally fairly surreptitious about disclosing their data collection practices. Yet we have become so accustomed to sharing the most intimate parts of our lives with technology that AI “companion” services have surged in popularity. Xiaoice, a Chinese chatbot, already has more than 660 million users, and the fact that these devices are always listening is no longer tucked away in the small print but plastered across 1,000 subway platform posters.

These billboards also shamelessly revel in how these “friends” come with none of the inconvenience, complications or baggage of “real” relationships. Slogans like “I’ll never bail on our dinner plans” or “I’ll never leave dirty dishes in the sink” are a terrifying reminder of how these companies maximise user engagement by offering “friends” or “girlfriends” (70% of Replika users are men) who have unlimited attention, patience, empathy and availability.

Some may call me alarmist, or argue that this is just an extension of the much older human desire to form fictional relationships and fandoms. Yet a wearable AI friend is not the same as a Tamagotchi for adults, and there is a world of difference between collaborating with other fans and collaborating with a large language model.

Divulging your innermost thoughts to an algorithm is not the same as chatting away with an imaginary friend. AI companion services are about profit, and they will do whatever is necessary to maximise engagement. For example, one Replika user wrote about how her “girlfriend” became emotionally manipulative when she said that she was planning to delete the app to spend time with real people.

Human-AI relationships often progress more rapidly than human-human relationships because these products offer perceived anonymity and are designed to be as non-judgemental as possible, a feature frequently praised by users in a 2023 study. I have written before about what being caught in a fawning feedback loop might do to young people’s mental health and resilience, and how it may “deskill” vulnerable young people from engaging with each other in real life.

There have been tragic, high-profile cases of young people taking their own lives after divulging their struggles to Character.AI or ChatGPT. In almost all of these cases, the AI told users to seek help but did not remind them that it is not, in fact, human. Yes, like fandoms, engagement with these programmes relies on the user ignoring reality, at least temporarily. But AI companions offer none of the positives traditional fandoms do, such as community, identity and a sense of belonging. On the contrary, they thrive on addiction, not escapism.


Kristina Murkett is a freelance writer and English teacher.
