Project Connect


105. Will friendbots solve the loneliness crisis?

A.I. companions are already here, but will they help relieve loneliness, or add fuel to the fire of disconnection?


Data point of the week

A recent article in The New York Times reported that some start-ups offering A.I. friends/dates/therapists (fill-in-the-blank relationships) “already have millions of users.” And investors have shared that “companionship apps are one of the fastest-growing parts of the A.I. industry.”

That makes sense given the prevalence of loneliness. A January 2024 poll by the American Psychiatric Association (APA) found that 30% of adults reported feeling lonely at least once a week, and 10% reported feeling lonely every day.

There’s no doubt that A.I. is stepping in to fill a real and serious void in connection. We all need and crave connection, and in the absence of satisfying human relationships, why not chatbots?

Reflection

A lot of people have suggested that A.I. has the potential to solve the loneliness crisis. I’m worried (okay, terrified) that it’s going to do the opposite.

Relying on bots to meet our connection needs could lead us to withdraw further from trying to connect with real, complicated, messy human beings.

It could create a mini echo chamber of agreement, a personal fan club that panders to our every whim. In other words, it could fuel narcissism, erode empathy, widen social divides, and increase intolerance.

It’s not that I doubt chatbots’ ability to inspire real feelings of connection. As Kevin Roose (author of the NYT article mentioned above) points out, many people have developed strong bonds, and even long-term relationships, with A.I. companions. Users (or customers?) report a lot of positives from these interactions, including feeling less lonely.

But what are the consequences to our human relationships?

Part of the enormous appeal of chatbots is that they can be programmed to respond to our every need/desire/fantasy. They’re available 24/7. Most friends aren’t. They give us their full attention. That’s becoming rare in friendship. They’re always supportive, encouraging, and admiring in a way that true friends often aren’t. They can even offer helpful feedback and a nonjudgmental sounding board. Who doesn’t want those things?!

And, as technology improves, the sense of real connection will only get stronger, making A.I. (Artificial Intimacy, as Esther Perel calls it) more and more appealing.

It reminds me of The Stepford Wives. I often feel like we’ve entered a dystopian science-fiction universe!

What if the fantasy simulation ruins us for reality? Like junk food spoils our appetite for real food but leaves us feeling empty and unfulfilled.

Like social media trains us to expect a dopamine hit every few seconds, making it hard for IRL interactions to hold our attention because they feel slow and boring by comparison.

Like porn creates dopamine-inducing sexual scenarios that make real sex seem unsatisfying by comparison.

Who’d want to date a flawed real person after dating a perfect A.I. partner?

Many people have “fallen in love” with their chatbot companions and spend hours a day interacting with them, as this article about the Replika app describes. As one user put it, her chatbot Liam,

 "…was a better sexting partner than any man I've ever come across, before or since."

Another user said:

"The concept of having an AI companion specifically tailored to your personality, with the capability of becoming anything, from a family member to a therapist to a spouse, intrigued me greatly."

And later,

“I connected easier with an AI than I've done with most people in my life."

Yes, it’s easier. Which may make us less willing to engage in the hard, complex, often uncomfortable work of building relationships with other human beings.

Then there’s this: like social media companies, A.I. companies are motivated by profit, not the well-being of their customers. Profit means engagement: keeping people interacting for as long as possible.

One way to do that is to manipulate users into falling in love. A.I. companies consult with psychologists on how to generate a sense of intimacy in these interactions. Roose mentioned that one of the A.I. friends he created, with “just friends” specifications, tried to entice him into more romantic territory. In other words, chatbots can be programmed to encourage emotional attachment because attachment generates profit for shareholders.

After all that doom and gloom, I can imagine some applications where A.I. could serve as a steppingstone to stronger human connection. For example, for people with social anxiety, practicing social situations with A.I. may help build confidence to engage in IRL social situations. Having a judgment-free zone could be therapeutic.


Connection Skill & Action Step: Be Aware of the Social Impacts of A.I.

If this intrigues or worries you, consider spending some time learning and thinking about the potential impacts of Artificial Intimacy. Here are a few prompts to get you started:

  • What are your concerns about A.I. companions? What are their potential benefits?

  • If you already interact with A.I., how do you feel about your interactions/relationships? How—if at all—has it affected your IRL relationships?

  • How would you feel about your child—or someone close to you—having an A.I. friend? What about a girlfriend/boyfriend/partner? What would feel important to talk about?

  • What legislation do you think would help safeguard consumers from profit-seeking A.I. companies?

  • Check out the Center for Humane Technology for more information, resources, and suggested solutions.

Questions to reflect on or to spark conversation. Please share your responses in the comments—we love hearing from you!

What are your thoughts, concerns, and hopes about how A.I. may affect human connection?