Can ChatGPT replace human therapists?

ChatGPT can now speak with users, making conversations with the chatbot more personal. What happens when people begin to use the large language model as a therapist?

December 08, 2023 02:55 pm | Updated 03:42 pm IST

The Hindu tried out the free ChatGPT version with the voice mode, where users could choose from friendly male, female, and gender-neutral voices in order to get spoken answers [File] | Photo Credit: REUTERS

An OpenAI executive came under fire on X (formerly Twitter) in September for comparing her conversation with ChatGPT to therapy, despite admitting she had never experienced therapy before.

Lilian Weng’s X bio said she worked on AI safety at the startup behind the viral chatbot, while the company website noted she worked on Applied AI Research.

“Just had a quite emotional, personal conversation w/ ChatGPT in voice mode, talking about stress, work-life balance. Interestingly I felt heard & warm. Never tried therapy before but this is probably it? Try it especially if you usually just use it as a productivity tool,” she posted on X on September 26.

Just a day earlier, OpenAI said it had upgraded ChatGPT with voice and image capabilities, allowing users to chat with it, share photos, and listen to responses. The voice mode feature is now open to all users.

OpenAI suggested that people could use this upgrade to settle family dinner table debates or even have a bedtime story read to them.

Weng was criticised for seemingly promoting ChatGPT as a tool capable of providing therapeutic services, especially after admitting that she was not qualified to comment on the therapy experience. Others accused her of falling for the ELIZA effect.

What is the ELIZA effect?
This refers to the phenomenon in which people attribute human emotions or capabilities to computer programs or similar systems because of the way they respond to user input.
The ELIZA effect is named after a 1960s computer program, created by MIT professor Joseph Weizenbaum, which also responded to users, albeit with very basic sentences that echoed their own words.
A user who receives responses from ChatGPT may believe they are having a meaningful or mutually beneficial interaction somehow comparable to a human conversation, rather than seeing ChatGPT as a large language model simply generating text. This is an example of the ELIZA effect.
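For a sense of how simple the original program was, here is a minimal ELIZA-style responder sketched in Python. It is illustrative only, not Weizenbaum’s 1966 code (which was written in MAD-SLIP): it matches a few phrase patterns and turns the user’s own words back into a question.

```python
import re

# A minimal ELIZA-style responder (illustrative sketch, not Weizenbaum's
# original program). It understands nothing: it only pattern-matches the
# user's words and echoes them back as a question.
RULES = [
    (re.compile(r"i feel (.+)", re.I), "Why do you feel {0}?"),
    (re.compile(r"i am (.+)", re.I), "How long have you been {0}?"),
    (re.compile(r"my (.+)", re.I), "Tell me more about your {0}."),
]

def respond(text: str) -> str:
    for pattern, template in RULES:
        match = pattern.search(text)
        if match:
            return template.format(match.group(1).rstrip(".!?"))
    return "Please, go on."  # default reply when nothing matches

print(respond("I feel anxious about work"))
# -> Why do you feel anxious about work?
```

That such shallow echoing was enough to convince some early users that the program understood them is precisely the ELIZA effect.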

A day later, Weng posted that people’s interactions with AI models differed and that her statements were her personal opinion.

Even so, the OpenAI employee is far from the only person who has turned to ChatGPT, the world’s fastest-growing consumer app earlier this year, to find solutions for their mental health challenges or even just a “warm” listener.

ChatGPT: free of cost and available 24x7

Sanskriti*, 29, a journalist based in Mumbai, goes to (human-led) therapy but has also reached out to ChatGPT to get help with life challenges, even though she is aware this is not recommended and knows the chatbot generates incorrect answers at times.

“I only book therapy when I need clarity, or things are getting out of hand in terms of anxiety, or there is a new pattern I am noticing, because therapy is expensive, and usually when you are anxious it arises [in] the middle of the night or at some weird hour and you want to quickly calm yourself down,” she said.

She explained that her anxiety was usually driven by not knowing something or having misleading information. ChatGPT helped her find the certainty she needed in order to calm down.

“Therapists can’t be available at the break of dawn,” she pointed out.

Sanskriti noticed that when she entered questions better suited to a human therapist, ChatGPT did provide counsel, along with a disclaimer encouraging her to seek the services of a qualified professional.

“Is ChatGPT comparable to human therapy? Of course not,” Sanskriti said. “I would say ChatGPT really helps, but I think there’s a lot of communication with a human therapist. You don’t have to use certain formats to write [to a human therapist]. With ChatGPT, you need to know the commands, you need to know how ChatGPT will respond.”

The Hindu tried out the free ChatGPT version with the voice mode, where users could choose from friendly male, female, and gender-neutral voices to get spoken answers. When told via the app that the user was suffering from a cold, a voice named “Ember” responded sympathetically, suggesting home remedies such as warm soup before offering to help with more specific symptoms.

Knocking at the digital door

Many people worldwide turn to the internet to seek out health information privately, according to Jim Downs, historian and author of the book Maladies of Empire: How Colonialism, Slavery, and War Transformed Medicine.

“Web-MD and other online resources, like ChatG[P]T, allow for an anonymity that many patients desire to avoid the stigma of being labeled unhealthy or sick. Historically, patients have feared clinicians pathologizing their behavior, bodies, and, even, their identities,” Downs said.

However, AI chatbots may not be ready to bridge this doctor-patient divide. Researchers have claimed that AI chatbots largely generate results that favour India’s privileged castes and economic classes while possibly excluding marginalised communities, Reuters reported in September. In the healthcare sector, a study in the Digital Medicine journal showed that ChatGPT generated false claims of physiological differences in Black people’s bodies compared to those of other races.

Looking back, Downs explained that fields such as epidemiology (the study of diseases and how they affect groups of people) largely emerged from slavery or colonialism, with doctors studying the spread of diseases in subjugated populations—such as Indians under British rule in the 19th century.

“This same pattern applies to the birth of psychology as a field. It emerged from specific historical case studies that articulated Indians and other people of colour throughout the world as inferior to white Europeans,” Downs said.

He pointed to scholar Sunil Bhatia’s book Decolonizing Psychology, which explores how early psychology was based on unscientific ideas of white supremacy over colonised Indians. Downs stressed the importance of including multiple cultures and complex identities when recognising health conditions, even today.

“Therefore, we need to be suspicious of AI-generated understandings of health and illness because they may propagate Eurocentric understandings of medicine that fail to recognize the cultural specificity of people in India and in other cultures around the world,” he said.

A stop-gap arrangement?

Sanskriti noted that matching with the right (human) therapist is a process of trial and error, but insisted that finding one is still important.

“I feel I am more at ease when I’m talking to a therapist because it’s a more free-flowing conversation. Her insights are deeper. It also comes from the previous sessions,” she said, explaining how her therapist was able to identify triggers or link certain anecdotes to her childhood—which ChatGPT cannot do.

On the other hand, Sanskriti recalled how ChatGPT helped her when she was facing a medical emergency at home. The chatbot answered some of her healthcare doubts, which let her decide what kind of specialists to contact and what her next step should be. Her therapist would not have been qualified or even permitted to help in this area.

“It is not layered, it is not insightful. But I feel sometimes ChatGPT manages to ease my anxiety for that very moment,” Sanskriti admitted.

(*Name changed to protect privacy)
