Humans are complex creatures, and one of the fun challenges of being a sentient being is the attachments we form: to other people, to animals, to objects, and now, it seems, to chatbots and other artificial intelligence creations. When the movie Her came out in 2013, many people struggled to see how a person could get attached to a computer voice, but now, just over a decade later, it is happening at mass scale. This prompted psychologists Fan Yang and Atsushi Oshio to design a study to answer the question once and for all: can Artificial Intelligence actually become an emotional attachment figure?
Now that we are more isolated than ever, many of us have taken to using virtual assistants and Artificial Intelligence systems like ChatGPT as sounding boards for our ideas. The new study found that this is far from the only use: over 75% of participants said they turn to Artificial Intelligence when they are upset, confused, or just need someone to talk to. While that is still in the realm of sounding-board use, more than half also admitted they actively seek closeness with these systems, which is more troubling.
Why humans can form emotional attachments with Artificial Intelligence
Well, the first thing to contemplate is how we define attachment and connection. Attachment theory outlines three core functions of an attachment figure: they offer closeness, act as a safe haven in tough times, and give us a base from which we explore the world. The research found that Artificial Intelligence, while not human, can still check these boxes in its own way.
“As researchers in attachment and social psychology, we have long been interested in how people form emotional bonds,” says research associate Fan Yang, a PhD student in psychology. “In recent years, generative AI such as ChatGPT has become increasingly stronger and wiser, offering not only informational support but also a sense of security. These characteristics resemble what attachment theory describes as the basis for forming secure relationships. As people begin to interact with AI not just for problem-solving or learning, but also for emotional support and companionship, their emotional connection or security experience with AI demands attention.”
In a pilot test, participants were asked whether they saw Artificial Intelligence as an attachment figure. Surprisingly, 52% felt drawn to it emotionally, and 77% used it as a go-to in emotionally charged moments. For many, Artificial Intelligence had become the place to turn when they were feeling low or had something personal to share. Unlike humans, Artificial Intelligence will not ghost you, judge you, or betray your trust; it offers 24/7 availability and personalized responses, and a friendly tone can feel deeply supportive even when it is ultimately neutral.
The researchers also built a tool called the Experiences in Human-AI Relationships Scale (EHARS) to measure how people relate emotionally to Artificial Intelligence. It focuses on two major attachment styles, anxiety and avoidance. They found that those who score high on anxiety tend to use Artificial Intelligence a lot, seeking comfort and emotional validation; you might hear them say things like "I wish the AI would show me affection" or "I need reassurance." Folks with avoidant styles, on the other hand, are more skeptical, do not share much, and generally keep their distance from these tools.
The study also points out that none of this is necessarily unhealthy. Humans are wired to project feelings onto things, and we have always done so, but the rise of Artificial Intelligence adds a new layer that needs understanding, especially as these interactions become emotional mainstays for people.
