October 24, 2024 - 7:00pm

In February 2024, 14-year-old Sewell Setzer III sent his final messages to the AI chatbot he had become infatuated with on Character.AI, a billion-dollar platform with over 20 million users. He then took his own life with a .45 calibre handgun. The boy’s mother has now filed a lawsuit against the tech company, accusing it of playing a part in her son’s suicide.

This isn’t a story about chatbots becoming “perfect lovers”, or another narrative about cheap dopamine hits and teenage screen addiction, or even a call for better AI safety guardrails. This isn’t about not being able to tell the difference between fantasy and reality. This is about a mass retreat into the imagination — the wholesale rejection of reality.

According to the New York Times, Setzer suffered from no delusions. In his journal, he wrote with stark clarity: “I like staying in my room so much because I start to detach from this ‘reality’, and I also feel more at peace, more connected with Dany [the chatbot, based on Daenerys Targaryen from Game of Thrones] and much more in love with her, and just happier.” Setzer understood what was real and what wasn’t. He chose the illusion.

This type of relationship isn’t new. Stories of humans falling in love with non-sentient objects, dead celebrities, and non-physical entities narrate both our history and our myths. Many people maintain rich inner worlds long past childhood, when imaginative play typically peaks. In Imaginary Companions and the Children Who Create Them, Marjorie Taylor writes of complex fantasy worlds called “paracosms”, in which teenagers create original languages, fully fleshed-out characters, and entire alternative histories.

Setzer’s case reveals an evolution in imaginative play, the kind seen in both childhood and adolescence. Traditional play — whether with teddy bears, imaginary friends, or other children — has natural limitations that make it developmentally healthy. When a child projects feelings onto a silent object, like a teddy, the interaction eventually gets boring precisely because the child remains in control of it. This boredom is why most children find other children infinitely more fun than dolls. But a chatbot is something in between: a curious mix of sentient and non-sentient. It is responsive and interesting like another human, but it will not push back and regulate untoward behaviour as humans do.

For Setzer, who had Asperger’s syndrome, anxiety, and disruptive mood dysregulation disorder, the AI might have offered a comforting alternative to the messiness of human interaction. While ordinary relationships demanded he navigate complex social cues he may have struggled to read, the AI was consistent. On Character.AI, and in his imagination, he wasn’t a teenager with autism and anxiety. He was “Daenero”, unbounded by the constraints of his body.

While tragic, Setzer’s story reflects a broader pattern. Reality shifters — people who try to mentally travel to other “realms” — perform elaborate rituals to “shift timelines”; “otherkin” identify as non-human; and fictosexuals, a community that has grown massively since I first reported on it last year, form relationships with fictional characters.

What unites many people in these communities is the “fictophile paradox”: acknowledging physical reality while embracing the emotional truth of their experiences. Setzer exemplified this in his journal, placing “reality” in quotation marks — a small but potentially significant sign that he understood the boundary he was crossing.

This conscious departure from reality is happening alongside a more systemic collapse. As our shared understanding of basic truths disintegrates, people seek refuge in deliberate fantasy. The internet is an accelerant: it is the fundamentally disembodied nature of digital interaction that is driving this shift.

Character.AI’s co-founder argued in defence of the platform that “there are billions of lonely people out there” who could benefit from AI relationships. It’s true that some people are quite happy in these worlds. But at what wider cost?

Setzer’s grief-stricken mother has argued the technology is “dangerous and untested”. She is right. But her son is just one of many young people who have fallen prey to limitless imagination as a response to the difficulties of real life. Content warnings and safety features like the ones Character.AI is now implementing can’t solve this problem. Better safeguards might have saved Setzer, but they have done nothing for “otherkin” and “reality shifters”. We need safeguards against our youngsters rejecting the real world in favour of illusory online worlds.


Katherine Dee is a writer. To read more of her work, visit defaultfriend.substack.com.
