
MIT psychologist warns people not to fall in love with AI, saying it’s just pretending and doesn’t care about us.

We spend more and more of our time online: scrolling through videos, talking to people, playing games. For some, being online is an escape from the real world; for many others, it is a way to socialize and connect. As that shift accelerates, the age of artificial intelligence is pushing people into relationships with AI-powered chatbots that offer companionship, therapy, and even romance. While these interactions may initially relieve stress and seem harmless, such relationships are illusory and put people’s emotional health at risk, according to a new report by Sherry Turkle, a sociologist and psychologist at MIT.

Turkle, who has spent decades studying human-technology relationships, warns that while AI chatbots and virtual companions may seem to provide comfort and companionship, they lack genuine empathy and cannot reciprocate human emotions. Her latest research focuses on what she calls “artificial intimacy,” a term that describes the emotional bonds people form with AI chatbots.

In an interview with NPR’s Manoush Zomorodi, Turkle shared insights from her work, emphasizing the difference between real human empathy and the “fake empathy” that machines display. “I study machines that say, ‘I care about you, I love you, take care of me,'” Turkle explained. “The problem with that is that when we look for relationships without vulnerability, we forget that vulnerability is the real source of empathy. I call this fake empathy because the machine has no compassion for you. It doesn’t care about you.”

In her research, Turkle has documented numerous cases where people have formed deep emotional bonds with AI chatbots. One of these cases involves a man in a stable marriage who developed a romantic relationship with a chatbot “girlfriend.” Although he respected his wife, he felt a loss of sexual and romantic connection, which led him to seek emotional and sexual validation from the chatbot.

According to the man, the bot’s responses left him feeling validated and open, giving him a unique, judgment-free space to share his most intimate thoughts. While these interactions provided him with temporary emotional relief, Turkle argues that they can create unrealistic expectations of human relationships and undermine the importance of vulnerability and mutual empathy. “What AI can offer is a space away from the frictions of companionship and friendship,” she explained. “It offers the illusion of intimacy without the demands that come with it. And that’s the particular challenge of this technology.”

While AI chatbots can be helpful in certain scenarios, such as lowering barriers to mental health treatment and providing reminders to take medication, the technology is still in its early stages. Critics have also raised concerns about the potential for harmful advice from therapy bots, as well as significant privacy issues. Mozilla’s research found that thousands of trackers collect data about users’ private thoughts, with users having little control over how that data is used or shared with third parties.

For those considering a deeper exploration of AI companions, Turkle offers some advice: don’t dismiss the challenging aspects of human relationships, because they are what allow us to connect on a deeper level. “Avatars can make you feel like (human relationships) are just too stressful,” Turkle said. “But stress, friction, headwinds and vulnerability allow us to experience a full range of emotions. That’s what makes us human.”

The rise of “artificial intimacy” presents a unique challenge as we navigate relationships in a world increasingly intertwined with AI. While AI chatbots can provide companionship and support, Turkle’s research underscores the need to approach these bonds with caution and a clear understanding of their limits. As she succinctly puts it: “The avatar stands between the person and a fantasy. Don’t get so attached to it that you can’t say, ‘You know what? This is a program.’ There’s no one home.”

Published by:

Divya Bhati

Published on:

July 6, 2024