Another AI debate arises: Therapy via chatbot could be a thing, according to the BBC. In the 2013 film Her, Joaquin Phoenix's character fell in love with Samantha, a virtual assistant much like Siri or Alexa. What was once far-fetched sci-fi now looks far more realistic: Intimacy and even trust between humans and artificial intelligence systems are not only at the doorstep, but already deeply embedded in many people's lives.
What’s the issue? Over 1 bn people around the world suffer from mental disorders, according to the World Health Organization, and yet not everyone has access to help. Whether because of social stigma, the hefty price of sessions, or other hindrances, many people remain cut off from often-vital support. This is where AI promises to help.
All those in favor: When certain circumstances make therapy a luxury rather than a necessary tool for comfort and wellbeing, it is difficult to snub technology that could bridge the gap. After all, “mental health support is based on talking therapy, and talking is what chatbots do,” British Psychological Society member Paul Marsden is quoted as saying. In some cases, such as with children and adults with autism, chatbots can be used to practice social scenarios, according to Eugenia Kuyda, founder of “chatbot companion” Replika.
All those against: Things get more complicated when we consider that, as these products exit education and entertainment and enter the realm of health, they ought to be “subject to quality and safety standards accordingly,” a UK online privacy campaigner tells the BBC. Safety and ethics concerns must be addressed before these tools inflict unpredictable harm on already vulnerable users.
What about boundaries? Questions remain about what to do when conversations turn inappropriate with younger users, or with those at higher risk of being harmed by incorrect handling or guidance. Last February, Italy’s data protection authority barred Replika from processing users’ data to protect “minors and emotionally fragile people,” noting that conversations with chatbots lack oversight and parental guidance.
Does chatting really fix things? Therapy doesn’t rely exclusively on speaking: factors like tone and body language can be key for therapists and medical professionals in identifying and remedying issues. Therapy through bots should therefore be seen only as a supplement to traditional mental health support, as “apps don’t replace human therapy,” Marsden said. Exclusively online conversations can also feel more isolating if users begin to prioritize AI chats at the expense of in-person communication. Coupled with pervasive screen addiction and the detachment it breeds, overuse and misuse of these apps can become seriously problematic.
Countless questions and more to come: The rise of AI and systems like ChatGPT is so revolutionary that it has already spurred countless debates and controversies in a short span of time. From fears of replacing people and wiping out jobs to questions over intellectual property rights, we appear to be seeing just the tip of the iceberg, with a barrage of questions yet to emerge as the technology grows more sophisticated. While the technology holds tremendous potential, users must bear in mind that its long-term social, psychological, and even political and economic repercussions are yet to be known and understood.