So…how did AI girlfriends actually become a thing?
People are getting verrrrry attached to their chatbots – here's what you need to know.
“I think anybody who falls in love is a freak. It's a crazy thing to do. It's kind of like a form of socially acceptable insanity.” – Her (2013)
Can I ask you a personal question?
🤖 (A)I was wondering if you might want to go on a date?
Earlier this year, it was reported that AI companion apps had reached 225 million lifetime downloads in the Google Play Store alone. AI girlfriends, unsurprisingly, seemed to be roughly 7 times more popular than AI boyfriends. While OpenAI’s usage policies state that they “don’t allow GPTs dedicated to fostering romantic companionship or performing regulated activities,” a quick search for “girlfriend” on their GPT Store reveals otherwise. In February of 2024, global cloud communications platform Infobip commissioned a survey revealing that “nearly 20% of Americans have engaged in flirtatious conversations with chatbots” – and among those aged 35 to 44, roughly half answered yes.
Why? Curiosity, according to 47.2% of respondents, followed by loneliness (23.9%), confusion (aka thinking they were talking to a real person; 16.7%), and sex (12.2%). If you venture on over to Reddit, you’ll find users explaining the appeal of intimate chatbot relationships with reasons such as “rationality” – “the AI won't let its emotions constantly cloud its judgement [sic] or insist that I accept things that don't have evidence to support them because of its feelings. It will also listen while I talk and respond in a manner that reflects what I said, not reflects how what I said made it feel.” – and being “permanently attractive” – “the AI will be ‘good looking’ the whole time and will not change over time unless I request it. I can purchase a body or series of bodies that I find stimulating to look at and enjoyable to touch. It will also dress in ways that I like and won't suddenly make massive changes to its appearance that are ugly.” Or, as another user simply put it, “Why bother for real life girlfriend whose expectations are so unrealistic when AI can help you emotionally much better.” Stay with us. : )
In April, a former WeWork Head of Product Strategy tweeted, “The market cap for Match Group is $9B. Someone will build the AI-version of Match Group and make $1B+.” He shared this after meeting “some guy last night in Miami who admitted to me that he spends $10,000/month on ‘AI girlfriends.’” And for some developers, the money is already starting to roll in. When Snapchat star Caryn Marjorie, an influencer with nearly 2 million followers, first released the beta of CarynAI in May of 2023, it raked in over $70k in revenue in its first week. Marjorie says the AI datebot, which is based on “over 2,000 hours of [her] content, voice, and personality” from now-deleted YouTube videos, made her “the first creator to be turned into an AI” and would help to “cure loneliness.” Tens of thousands have already had conversations – like this one – with CarynAI, often steered quickly by the AI into highly sexual topics. “The reason why that happen[s] is because we used a trade database which takes the longest conversations that it uses as models,” said the head of marketing at deepfake company Forever Voices, which partnered with Marjorie to create the bot. “Those successful interactions were unfortunately a little bit provocative and that was being used as the base model to activate new conversations.” What a surprise.
How did we get here?
Okay, let’s try to take a quick step back…after one more quote from Marjorie. “Men are told to suppress their emotions, hide their masculinity, and to not talk about issues they are having,” she says. “I vow to fix this with CarynAI. I have worked with the world’s leading psychologists to seamlessly add [cognitive behavioral therapy] and [dialectical behavior therapy] within chats. This will help undo trauma, rebuild physical and emotional confidence, and rebuild what has been taken away by the pandemic.” Whatever we make of the real Caryn’s psychological expertise (or her genuine concern, or her financial motivations…), research has shown that as the mental health of boys and men has declined in recent decades, they’ve increasingly disengaged from the real world. In 2022’s Of Boys and Men: Why the Modern Male Is Struggling, Why It Matters, and What to Do about It, author Richard Reeves “describes many of the structural factors that caused [this disengagement], such as an economy shifting away from manufacturing (in which male strength is a huge asset) and toward the service sector (where women have some advantages).” Jonathan Haidt, author of 2024 bestseller The Anxious Generation, tells a similar story:
Boys are in trouble. Many have withdrawn from the real world, where they could develop the skills needed to become competent, successful, and loving men. Instead, many have been lured into an ever more appealing virtual world in which desires for adventure and for sex can be satisfied, at least superficially, without doing anything that would prepare them for later success in work, love, and marriage.
Across the United States, loneliness has been on the rise. As we saw in a recent NTP, the U.S. Surgeon General declared an “epidemic of loneliness and isolation” in 2023. Today, roughly half of Americans report having 3 or fewer close friends. Meanwhile, 63% of men aged 18 to 29 report being single, compared to 34% of women in the same age group. While loneliness is, of course, not isolated to a single gender, research has suggested that women’s stress response follows a tend-and-befriend pattern while men exhibit the well-known fight-or-flight response. “Tending,” one heavily cited study describes, “involves nurturant activities designed to protect the self and offspring that promote safety and reduce distress; befriending is the creation and maintenance of social networks that may aid in this process.” Biologically, then, women may be wired to seek out and build connections with those around them in ways that men are not. Perhaps it’s no surprise that men have disproportionately turned to AI chatbots in recent years while their female counterparts – able to more easily connect with the people around them – show markedly less interest. Well…that and the whole AI-chatbot-sex thing.
What will become of this?
This year, Stanford researchers published a study surveying over a thousand students using the “Intelligent Social Agent” (aka chatbot) Replika. “Join the millions who already have met their AI soulmates,” reads Replika’s homepage. Its subreddit, where users share their experiences conversing with their personalized chatbots, ranks among the largest 2% of all forums on the platform. The study found that Replika users were “more lonely than typical student populations but still perceived high social support” from using the AI, with 3% of respondents even reporting that Replika “halted their suicidal ideation.” Interestingly, a large portion of respondents “held overlapping and often conflicting beliefs about Replika—calling it a machine, an intelligence, and a human.”
More broadly, the use of AI in the mental healthcare space has brought a great deal of excitement in recent years. Research has shown that “8 in 10 Americans believe AI can enhance healthcare quality and accessibility,” while “1 in 4 Americans are more likely to talk to an AI chatbot instead of attending therapy.” Many AI chatbots also claim to help individuals hone the skills necessary to build better, stronger relationships with real humans. Replika’s parent company Luka, for example, released Blush last year: “an AI-powered dating simulator that helps you learn and practice relationship skills in a safe and fun environment.”
These claims, however, remain largely unverified. And the purported benefits do not come without risks. Earlier this year – on Valentine’s Day, ironically – the Mozilla Foundation, “a global nonprofit dedicated to keeping the Internet a public resource that is open and accessible to all,” released an analysis of 11 romance and companion chatbots. The reviews were not glowing. “All 11 romantic AI chatbots we reviewed earned our *Privacy Not Included warning label – putting them on par with the worst categories of products we have ever reviewed for privacy,” the study concluded. One researcher involved went so far as to say, “To be perfectly blunt, AI girlfriends are not your friends. Although they are marketed as something that will enhance your mental health and well-being, they specialize in delivering dependency, loneliness, and toxicity, all while prying as much data as possible from you.”
But experts’ concerns go deeper than your run-of-the-mill case of tech company exploitation and into the realm of behavioral psychology. MIT sociologist and psychologist Sherry Turkle has studied human-computer relationships for decades. In recent years, she’s turned her attention to the rising use of companion AI chatbots, or what she calls “artificial intimacy.” We’ll leave you with her perspective on the technology:
The trouble with [artificial intimacy] is that when we seek out relationships of no vulnerability, we forget that vulnerability is really where empathy is born. I call this pretend empathy, because the machine does not empathize with you. It does not care about you…What AI can offer is a space away from the friction of companionship and friendship. It offers the illusion of intimacy without the demands. And that is the particular challenge of this technology.
Let us know what you think by voting in our poll and leaving an anonymous comment.
💭 Our two cents
Even as a single person – as single as single gets – I find it hard to wrap my head around the appeal of AI relationships. We all crave connection, but if what’s on the other end isn’t a living, breathing thing, can we really call it “connection”? Sure, AI can string a series of seemingly heartfelt words together, but if there’s no true emotion behind them, how could they have the same meaning to us? Admittedly, I’m probably not the bots’ target audience. But still, the suggestion that interacting with language models could really make the masses better at interacting with the humans around them seems suspect, at best. I suppose if these AIs truly acted like real humans, then maybe it’d be possible. But that’s just the thing…they don’t. And that’s exactly what so many people who develop relationships with AIs say they like about them: the AI doesn’t judge them, the AI takes them literally, the AI doesn’t require anything at all in return. Relationships in the real world, however, fundamentally depend on our ability to navigate those very challenges and compromises, not to ignore them.
That’s not to say that there isn’t potential for immense benefits from AI in the mental health space. In many cases, bots help people overcome what can be a stifling fear of speaking out and help them access care they wouldn’t otherwise be able to reach or afford. But the goal of therapists – human or machine – shouldn’t be to personally fill the void of intimate relationships in their patients’ lives. On the contrary, doing so would create unhealthy (and unprofessional) dependence.
So despite the hopeful (and likely self-interested) claims of curing loneliness, it’s hard to see how the intimate attachments humans are increasingly forming with AIs will ultimately drive us towards becoming a more empathetic, connected society rather than a more guarded, isolated one. But…I guess we’ll just have to wait and see what happens.
💃 The girls have spoken
See last week’s poll results from “So…why do hot girls have IBS?” below. :)
💌 Up Next
That’s all for today! If you liked this edition of Not That Personal, we think one of your friends probably will too – refer one (or two or three) below. ;)
Have something to say? We’d love to hear it – reply to this email or leave an anonymous comment here :)
Up next: So…is Find My Friends ruining our social lives?
💖 S & J