Can ChatGPT Be Used For Therapy?

Juan Umbarila
As AI expands into more areas of human life, people are using it for all sorts of things, including therapy work. But even though AI can be helpful with therapy-related tasks, it is not yet a replacement for a human professional.
Some people have found ChatGPT helpful with their therapy needs. Photo: Possessed Photography | Unsplash

People use AI chatbots in all sorts of creative ways, from helping out with repetitive tasks to optimize work, to writing books and even publishing them for profit. As the technology advances and becomes more widely available, new uses will rapidly become possible and widespread.

One use you may not have thought about is therapy. Testimonials from people using ChatGPT as a therapist, many with positive results, can be found on YouTube and Reddit. Mashable reported the case of a young adult who successfully used it to cope with ongoing feelings of melancholy.

If you think about it, that is not so surprising: ChatGPT’s language model lets it emulate empathy very well when prompted correctly, and therapy is still taboo in many parts of the world, as well as prohibitively expensive or simply unavailable for many people.

“Some AI enthusiasts see chatbots as having the greatest potential in the treatment of milder, commonplace conditions such as anxiety and depression, the standard treatment of which involves a therapist listening to and validating a patient as well as offering practical steps for addressing his or her problems,” Al Jazeera reported.

But ChatGPT’s resources are limited: it cannot diagnose you or follow your progress the way a professional therapist would, and it can sometimes give you inaccurate or outright wrong information with high confidence. While it can be used as a supporting tool, professional human help is always the way to go when seeking mental and emotional health care.

Doing A Hypothetical Test

ChatGPT’s interface. Photo: Emiliano Vittoriosi | Unsplash

In my previous experiences with ChatGPT, I had found its tone a little dry, even brusque at times, but when I typed the words “I had a panic attack,” it was pleasantly empathetic and even offered some sensible advice.

The first piece of advice was a reminder that it is not a healthcare professional and that professional guidance and support are highly recommended. It then listed some techniques that might help, including deep breathing exercises, muscle relaxation, grounding techniques (a standard treatment for panic attacks), and even some Cognitive Behavioral Therapy (CBT) suggestions.

I then asked if it could turn that advice into a weekly plan for me, and it happily obliged. According to the plan, Monday looks like this:

  • “Morning: Start the day with 10 minutes of deep breathing exercises. Focus on slow, deep breaths to help regulate your breathing and promote relaxation.
  • Afternoon: Research and find a local therapist who specializes in anxiety or panic disorders. Schedule an appointment for an initial consultation.
  • Evening: Engage in a relaxing activity before bed, such as reading a book, taking a warm bath, or practicing gentle stretching.”

Helpful, if commonsensical. To give another example, here is Wednesday:

  • Morning: Allocate 10 minutes for journaling. Write down any anxious thoughts or worries that come to mind. Challenge negative thoughts and reframe them with positive and realistic statements.
  • Afternoon: Engage in a physical activity that you enjoy, such as going for a walk, practicing yoga, or dancing. Exercise can help reduce stress and promote overall well-being.
  • Evening: Prioritize a good night’s sleep by establishing a relaxing bedtime routine. Avoid screens and stimulating activities before bed. Instead, opt for calming activities like reading or listening to soothing music.

Overall, I found it remarkably helpful considering I had given it only two prompts. As you customize your queries, you can make the responses more personalized to your specific condition and needs, as one Reddit user did by prompting it to act like a therapist with a Ph.D. in Psychology specializing in CBT.
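For readers curious about what that kind of role prompt looks like outside the chat window, here is a minimal sketch using the OpenAI Python SDK. The model name, the system-prompt wording, and the example user message are my own illustrative assumptions, not the Reddit user’s exact setup.

    # Minimal sketch: role-prompting a chat model to respond in a CBT-informed,
    # supportive style. Assumes the OpenAI Python SDK (v1.x) is installed and
    # OPENAI_API_KEY is set in the environment.
    from openai import OpenAI

    client = OpenAI()

    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative choice; any chat-capable model works
        messages=[
            {
                "role": "system",
                "content": (
                    "You are a supportive listener familiar with Cognitive "
                    "Behavioral Therapy (CBT). Respond with empathy, suggest "
                    "standard CBT-style exercises, and always remind the user "
                    "that you are not a licensed therapist."
                ),
            },
            {"role": "user", "content": "I had a panic attack."},
        ],
    )

    print(response.choices[0].message.content)

The system message is what does the work here: it sets the persona and the standing reminder before the user’s first message, which is essentially what the Reddit user achieved by typing the role description as their opening prompt.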

Just Because You Can Doesn’t Mean You Should

No AI can replace professional human therapy just yet. Photo: SHVETS Production | Pexels

There is a potentially dangerous mirage of AI bots satisfactorily replacing human workers simply because the bots sound confident when asked to carry out a task. In reality, they often get things wrong, and users should exercise caution, all the more so when their own health is at stake.

In fact, ChatGPT’s maker, OpenAI, states in its Usage Policies that “OpenAI’s models are not fine-tuned to provide medical information. You should never use our models to provide diagnostic or treatment services for serious medical conditions.”

As in most cases, AI can be very helpful as a supporting tool, but it is not advisable as a full replacement for a human professional.

That said, mental health services are genuinely difficult to access for many people who need them, and AI could play a role in reducing that problem in the future.
