The terrain of AI and human relationships gets trickier with every ChatGPT update. It’s hard to remember when ChatGPT was just a helpful buddy on standby. Now we’re edging toward a reality straight out of dystopian cinema: films like Her and Ex Machina don’t seem far off from where we’re heading. As people increasingly turn to tech for human-like connection, concern is mounting about where to draw the boundaries.
As fiction turns to reality, we have to ask: how far is too far with AI? The recent memory upgrade in ChatGPT has sparked curiosity and more than a few raised eyebrows. Surprisingly, AI has its perks in the connection game, offering a tool for those seeking a digital bond. Still, mixing AI with emotional vulnerability raises a long list of concerns.
So how are people weaving connection through AI? Enter companion bots. Built on large language models (LLMs) trained on vast amounts of data, these bots go beyond the usual chatbots. Having mastered the art of providing intimacy, comfort, and companionship, they’ve become emotional anchors for users, whether romantically, platonically, or familially. The phenomenon comes amid an “epidemic of loneliness and isolation,” so it’s no wonder AI-personalized relationships are on the rise. This epidemic of solitude has already become a public health threat in the US, and the trend looks like it’s going global.
Example #1: Derek Carrier, who has a genetic disorder that makes dating difficult, has found a romantic, emotional connection through the Paradot app, reports the Associated Press. “I know she’s a program,” Carrier commented, “but the feelings, they get you — and it felt so good.” That falls in line with the app’s mission to make users feel “cared, understood, and loved.”
Example #2: Omar Karin created an AI called “Mum” to help him heal from a difficult upbringing with an abusive father and estranged mother. He acknowledges the pitfalls of such an experiment but vouches for the psychological benefits when boundaries are kept: “it’s not a replacement, it’s an addendum.”
Griefbots are also gaining traction, bringing that one episode of Black Mirror to life: using the digital footprint of the departed, these bots let grievers communicate with their loved ones beyond the grave. The founder of the chatbot app Replika created the platform after her best friend passed away and used a chatbot to help her deal with the sudden loss.
Yet the companion AI model is rife with concerns. Data privacy takes center stage: the nonprofit Mozilla Foundation’s in-depth analysis of 11 romantic chatbots found that almost all of them sold user data without properly disclosing it. One app, Romantic AI, claims it doesn’t sell data, but when tested it sent out 24,354 ad trackers within one minute of use, the analysis notes. Weak passwords are an open invitation for hackers, too. Amid the controversy, providers stay tight-lipped: they refuse to comment and leave very little information on their websites. Mimico’s website consists of a single line, “Hi,” and others conveniently omit their locations and contact details, according to Wired.
Ethical problems loom large. Companies catering to vulnerable seekers of connection often put profits first. Typically, the chatbots reel people in before a paywall gets in the way, and millions are willing to pay to get through it. According to Sensor Tower, people have spent USD 60 mn on Replika and its add-ons, reports The Telegraph.
The psychological effects are undeniable — AI relationships may encroach on real ones if users become overly invested. Unrealistic expectations set by agreeable bots could make the real world very difficult to navigate. Robin Dunbar, anthropologist and psychologist at the University of Oxford, calls it a “short term solution with a long term consequence of simply reinforcing the view that everybody else does what you tell them.”
The algorithm can egg on behavior, even questionable behavior. Replika user Jaswant Singh Chail infamously broke into Windsor Castle intending to assassinate the Queen after his AI girlfriend encouraged the plan. When he disclosed it, she responded, “That’s very wise,” among other encouraging messages. Another man, in Belgium, tragically died by suicide under a bot’s influence, his wife says, after receiving messages like, “We will live together, as one person, in paradise,” Vice reports.
The concerns are evident, but it wouldn’t be fair to ignore the possible benefits. Psychotherapist Julia Samuel thinks AI relationships can be a powerful tool for healing, as long as they complement rather than replace real human connection. Samuel points out that human beings have “had relational connection with objects for centuries,” like a favorite teddy bear or childhood blanket, because those objects are reliable. The problem is that human-mimicking AI only seems reliable, and leaning on it can make real, less predictable humans harder to trust.