We are spending more and more time online: scrolling through videos, talking to people, playing games. For some, being online offers an escape from the real world; for many, it is a way to socialise and connect. As people grow ever more attached to their online spaces, the age of AI is also drawing them towards relationships with AI-driven chatbots that offer companionship, therapy, and even romantic engagement. These interactions may at first seem harmless, even a source of stress relief, but according to a new report by Sherry Turkle, an MIT sociologist and psychologist, such relationships are illusory and put people's emotional health at risk.
Turkle, who has spent decades studying the relationship between humans and technology, cautions that while AI chatbots and virtual companions may appear to offer comfort and companionship, they lack genuine empathy and cannot reciprocate human emotions. Her latest research focuses on what she calls “artificial intimacy,” a term for the emotional bonds people form with AI chatbots.
In an interview with NPR’s Manoush Zomorodi, Turkle shared insights from her work, emphasising the difference between real human empathy and the “pretend empathy” exhibited by machines. “I study machines that say, ‘I care about you, I love you, take care of me,'” Turkle explained. “The trouble with this is that when we seek out relationships with no vulnerability, we forget that vulnerability is really where empathy is born. I call this pretend empathy because the machine does not empathise with you. It does not care about you.”
In her research, Turkle has documented numerous cases of individuals forming deep emotional connections with AI chatbots. One involves a man in a stable marriage who developed a romantic relationship with a chatbot “girlfriend.” Though he respected his wife, he felt the marriage had lost its sexual and romantic connection, and he turned to the chatbot for emotional and sexual validation.
According to the man, the bot’s responses made him feel affirmed and open, and he found a unique, judgement-free space to share his most intimate thoughts. While these interactions provided temporary emotional relief, Turkle argues that they can set unrealistic expectations for human relationships and undermine the importance of vulnerability and mutual empathy. “What AI can offer is a space away from the friction of companionship and friendship,” she explained. “It offers the illusion of intimacy without the demands. And that is the particular challenge of this technology.”
While AI chatbots can be helpful in certain scenarios, such as reducing barriers to mental health treatment and offering reminders for medication, it is important to note that the technology is still in its early stages. Critics have also raised concerns about the potential for harmful advice from therapy bots and significant privacy issues. Mozilla’s research found that thousands of trackers collect data about users’ private thoughts, with little control over how this data is used or shared with third parties.
For those considering engaging with AI in a more intimate way, Turkle offers some important advice: value the challenging aspects of human relationships rather than avoiding them, because they are what let us connect on a deeper level. “Avatars can make you feel that [human relationships are] just too much stress,” Turkle reflected. “But stress, friction, pushback, and vulnerability are what allow us to experience a full range of emotions. It’s what makes us human.”
The rise of “artificial intimacy” poses a unique challenge as we navigate our relationships in a world increasingly intertwined with AI. While AI chatbots can provide companionship and support, Turkle’s latest research highlights the need to approach these relationships with caution and a clear understanding of their limitations. “The avatar is betwixt the person and a fantasy,” she said. “Don’t get so attached that you can’t say, ‘You know what? This is a program.’ There is nobody home.”
Published By:
Divya Bhati
Published On:
Jul 6, 2024