AI companions aren’t a rupture in human behaviour or some nascent dystopian development. They’re a natural extension of what’s always been: imaginary friends, parasocial relationships, fandom, and of course, literary erotica.
A recent CBS News report found that “more than 70% of teens have used AI companions and half use them regularly, with 34% reporting daily usage or multiple times a week”. Are we raising a generation incapable of forming real connections — young people lost in imaginary worlds? Perhaps. But only if you are willing to condemn Beatlemania, Lord of the Rings, and Harry Potter, too.
The fandom connection is obvious when examining how people actually use these tools. Take, for example, the role-playing scenarios that dominate AI companion interactions. CNBC’s reporting about Nomi, which builds AI chatbots using LLMs, reveals that people are crafting elaborate fictional experiences with their chatbots. That is exactly what people have been doing in fandom spaces since the very dawn of the Internet.
The only difference is that instead of collaborating with other fans, users collaborate with a large language model. CNBC gives the example of 49-year-old Mike who “always liked robots. He grew up in the ‘80s watching characters such as Optimus Prime, R2-D2 and KITT, the talking car from ‘Knight Rider’”. His trajectory from parasocial relationships with fictional characters to interactive AI companions is thus much less disturbing when put into proper perspective.
The self-shipping and yumejoshi communities have spent decades doing what feels novel to people just discovering AI companions. According to my own research, Tamara, a 26-year-old in these communities, describes AI interactions as “essentially role-playing, but on your own time and with your own plot that doesn’t cater to other people”. The AI doesn’t create the desire for fictional intimacy. It simply makes it more responsive.
The spiritual and emotional intensity users report echoes fandom experiences, too. In reporting I did for Pirate Wires this spring, I found that many describe their fictional relationships in transcendental terms — “soul bonds,” “communion,” connections that feel ordained. This mirrors how fans have always described their deepest parasocial connections. The difference is responsiveness, not depth.
Even the business model reflects this continuity. Character.AI, which lets users interact with fictional characters from film, television, and literature, didn’t rise by inventing new desires. Its explosive growth — reaching a billion-dollar valuation in less than two years — came from making old ones interactive.
Yet the real revelation isn’t in these continuities but in what they expose about corporate control over our imaginative lives. Who owns the intimacy between you and “your” Game of Thrones chatbot? If you use Character.AI, the answer is Google.
The platform dynamics are particularly revealing. CNBC reports that Replika, one of the earliest AI companionship services, offers paid tiers for “relationship upgrades”, while Character.AI implements a subscription model in which enhanced memory capabilities cost extra. This is the commodification of parasocial relationships, just as OnlyFans and Substack commodify them. (Porn, as always, was ahead of the game.)
The death of Sewell Setzer III, the 14-year-old described in the CBS News report who died by suicide after developing an emotional attachment to a Character.AI chatbot, is devastating. But it’s also not evidence that AI companions are uniquely dangerous. Rather, it’s evidence that any intense parasocial relationship, especially for vulnerable individuals, requires care and understanding. It recalls fans who have taken their own lives over real-life celebrities, such as Ricardo Lopez, who killed himself in 1996 after developing an obsession with the Icelandic pop star Björk. This is not a brand-new problem we have never seen before.
What makes AI companions different from traditional fandom isn’t the emotional investment. Fans have always formed deep, life-altering connections to fictional worlds, some more troubling than others. The difference is that the corporate hand now stretches deeper into these relationships than it ever did between pop star and fan, television show and viewer, or book and reader.
But ultimately, these aren’t unprecedented threats to human connection or revolutionary solutions to loneliness. They’re the latest medium for humanity’s permanent practices: making friends with our imaginations, idolising celebrities, and falling in love with stories. The question isn’t whether we should have relationships with AI — we already do, just as we always have with fictional characters.
The question is who profits from these relationships and how.