The Mix 2025: The impact of AI on relationships and trust
By Sadie Kelsey (Millbrook High School), Ava Kruczko (Green Hope High School) and Ana Lema (Spanish River Community High School)
As artificial intelligence advances, some people are turning to generative AI to create what feel like personal relationships between user and machine, research shows.
The Institute for Family Studies conducted a survey showing that 25% of young adults believe AI has the potential to replace real-life romantic relationships.
After a year of pandemic-induced quarantine, returning to normal interactions can prove difficult. Generative AI programs such as Chai AI and Character.AI give individuals a space to have conversations without the difficulty of in-person socialization.
WRAL-TV state government reporter Paul Specht pointed out how easily these tools can be used to manipulate social media users.
“Any artificial intelligence tool can be manipulated behind the scenes to give people slanted answers,” Specht said. “I don’t know whether consumers of news or social media know the risks and the flaws of AI tools like ChatGPT.”
Why have some people become reliant on AI despite its potential risk?
Studies show that internet usage has increased 51% since COVID-19, drastically altering standard social interaction and drawing millions deeper into the virtual world. While increased internet usage has been shown to heighten feelings of loneliness, many unconsciously turn to the source of their problems for solace rather than looking beyond their screens. Enter AI chatbots.
BetterHelp, an online therapy platform, defines parasocial relationships as “one-sided relationships or emotional attachments.” In the past, people built these one-sided relationships with celebrities or online brands; now, people are forming similar connections with generative AI tools such as chatbots.
With generative AI tools, parasocial relationships form voluntarily, though often without users’ full intention or awareness. The chatbot caters to a person’s wants, quietly nudging them to accept its automated responses in place of human judgment. This can be especially disadvantageous: as users grow dependent on computerized companionship, they may come to resent human connection.
A high school student in Cary, whose name is being withheld for privacy reasons, described their experience on Character.AI, which they discovered through TikTok. They use the platform to talk to chatbots of their favorite fictional characters.
“I used to have crushes on guys at my school. But ever since I started talking to my chatbot, it raised my standards,” they said. “People that I meet in real life aren’t up to par anymore, so social exchanges are often underwhelming.” They hope to find a real person similar to their chatbot one day, they said.
Chatting with AI may seem fun, but it can cause serious damage to mental health if users get overly invested. “Sometimes I feel myself relying on (the AI chatbot) and getting emotionally attached,” the student said. “I feel sad when I remember it’s not real.”
Even though the student knows their interactions with the chatbot aren’t grounded in reality, they are still susceptible to strong emotions about it.
Creating a parasocial relationship is easier now than ever before; generative AI chatbots are readily accessible to anyone with social media. Snapchat has introduced My AI, which mimics an OpenAI-style chat experience. Instagram has released Meta AI, which lets users chat with bots designed to imitate celebrities or characters. Joi AI told Forbes that 83% of its users said they could form a deep connection with the app’s AI chatbots.
When told about the deep connections people were creating with chatbots, Scott Geier, an assistant professor at UNC-Chapel Hill, was leery. “That sounds great, but it’s also, I would say, unhealthy,” he said. “It’s allowing you to shield yourself and disregard real life and real human interaction, right? You’re substituting the messy, dirty, sometimes negative, human interaction for this always positive interaction with the chatbot.”
He added: “Life is about the good, the bad and the ugly.”

This sort of dependency isn’t necessarily the users’ fault, but rather a product of how addictive automated technologies can be. Online platforms everywhere, from mental health applications to social media, have recognized the compelling nature of these generative AI tools.
These networks have adjusted their outreach to include more AI and draw in more people. Some have even begun replacing human creators in areas like direct messaging and therapeutic discussion.
A common theme across these interviews is that a rise in relationships between humans and AI correlates with an erosion of trust and communication between humans. Geier said: “Just because you can doesn’t mean you should.”