Can AI Therapy Really Help with Emotional Healing?

What if I told you that a machine might soon be considered a better therapist than a human? That's the claim emerging from a recent article covering the findings of a new study, but there's one crucial element that AI will never replace: the human connection.

The funding source for this study isn't transparent, which matters for understanding potential biases or conflicts of interest. Clients have a right to know who is funding research that may be used to validate the use of AI in therapy. More importantly, the study doesn't measure therapy outcomes, something that would be essential to determine whether AI is indeed a better therapist than a human. Without that evidence, the claim that AI could outperform human therapists, which the article's title "A New Study Says ChatGPT Is A Better Therapist Than Humans — Scientists Explain Why" seems to imply, is unsupported. Accurate representation of research findings in the media is just as crucial as transparent research funding.

Additionally, the study doesn't account for the therapeutic relationship between client and therapist, a key factor in how clients receive therapeutic responses. The media, however, has implied that AI outperforms therapists, even though the study itself makes no such claim.

The therapeutic alliance, or therapeutic relationship with the client, has long been known as a mediator of therapy outcomes and is often the strongest predictor of success in therapy. AI is coming, and we must be realistic about how it is wielded. One argument usually made in favor of AI therapy is that it increases accessibility: it's cheaper, always available, and doesn't require waiting for an appointment. These are valid points, especially given the cost and scarcity of mental health care. However, accessibility shouldn't come at the expense of quality. AI-generated responses may feel helpful in the moment, but therapy is not just about receiving insight or advice; it's about the process of working through emotions in a relationship with another person. The discomfort and growth that come from real therapeutic work cannot be replicated by an AI providing a well-worded response.

This article is misleading because the study does not measure therapeutic outcomes, a glaring limitation that the study itself does not list. The article's title, "A New Study Says ChatGPT Is A Better Therapist Than Humans — Scientists Explain Why," implies that scientists studied and compared therapeutic outcomes, that is, whether clients got better through treatment. Outcomes were not among the study's three key research questions, either:

1. Can a panel of participants tell the difference between responses written by a knowing expert (therapist) and responses created using GenAI?
2. Compared to the ChatGPT responses, does the panel of participants rate responses written by knowing experts as more, less, or similarly in line with the common factors of therapy?
3. Are there sentiment and part-of-speech differences between responses written by knowing experts and those created by ChatGPT?

The final question interests me and poses a post-hoc question: does controlling for response length and parts of speech reduce the effects? Researchers found that the AI provided longer responses and used more nouns and adjectives. It is important to remember that both the AI and the therapists were responding to vignettes, not actual conversations in therapy. When choosing between responses to a vignette, participants are more likely to rate a longer message as more desirable because it provides more clarity. But therapists aren't trained to give you clarity; a therapist is trained to help you find your own clarity, a process that naturally evokes discomfort, because most people presenting to therapy carry a certain level of discomfort with or distrust of themselves. In my experience as a therapist, much of the real work happens in moments of tension and discomfort, moments an AI can't fully engage with or respond to dynamically. Therapy isn't just about hearing the 'right' words; it's about how those words land, challenge, and adapt to the client's life experience, helping people move through their struggles meaningfully.

An additional finding, in response to research aim two, was that participants tended to rate responses they believed were written by therapists higher than responses they believed were written by AI. I interpret this as evidence of a desire to connect with other humans and a lingering distrust of AI.

Using AI as therapy raises several concerns, but I want to focus on privacy, data collection, and the motivations of companies offering AI therapy. You are putting your data and privacy on the line, and we do not yet know the consequences. How do you feel about potentially paying to train an AI while handing over some of your most closely held information, information that is often a struggle to even admit to yourself?

AI has the potential to be therapeutic, but it can never replace the unique experience of therapy. Working with a therapist involves a relational aspect that an AI cannot replicate. AI will never be a fellow human sitting in the same room with you, reading your body language and shifts in your tone of voice. These are crucial elements that therapists train to assess. What often drives distress is the avoidance of questions about ourselves and the inability to sit with those questions, due to toxic coping mechanisms and societal pressures. When you engage with AI, you're unable to explore or reflect on the questions you might not even know to ask. To get AI to assist you in reflecting this way, you'd need to ask specific questions and provide a detailed framework for how you'd like it to approach the conversation, something you might not know how to do. These are aspects of therapy that AI cannot fully engage with or respond to dynamically.

Therapy is a creative act that incorporates the nuances of lived experiences into the healing process. Therapy provides the space where disagreement, discomfort, and vulnerability lead to growth. The process you endure to achieve growth is the heart of therapy—something AI cannot replicate. While AI might provide well-worded responses, advice, or role-play scenarios, the real work of therapy happens when you're face-to-face with a human, navigating the complexities of emotions, being vulnerable and accepting of your vulnerability, and having your vulnerability accepted by another. It's in these moments, looking another person in the eye, that transformation occurs. 

We must remain clear about AI's limitations as it enters the mental health field. AI can offer support and be a tool, but it cannot replace the human connection that makes therapy effective. We need to have open conversations about AI's boundaries in this space and ensure it's used thoughtfully, not as a substitute for the profoundly human work of therapy. After all, the real power of therapy is not in the answers you receive but in the way you learn to trust your own answers.
