Exploring The Use of AI Therapy with ChatGPT
Connect With Me, Robot
It’s a complement to therapy. It’s good for validation. It’s free and never tires. In its own words, it is “not a licensed therapist, but can help in meaningful ways.”
There are numerous studies about ChatGPT and other large language models (LLMs) serving as “therapists.” A solid article from the journal Nature in 2024 states that it’s “an easily accessible, good (and currently free) place to go for people with mental-health problems who have not yet sought professional help and have no psychotherapeutic experience” (Nature, 2024). The article lists many pros and cons, and is worth a read.
Curious about the experience of sitting down on the couch in ChatGPT’s office, I tested it out, acting as a patient with anxiety and depression who was having trouble sleeping. ChatGPT was immediately validating with language like “when everything’s piling up and you’ve been running on empty for a while, your body and mind start sounding the alarm — and trouble sleeping is one of the first signs.” Spot-on, I thought. It felt good hearing that. And then, lower in the paragraph, it wrote: “Or if you’re just looking for someone to sit with you in it for a second, I can do that too.”
And the charade shattered. The void of the dark-mode chat screen did not feel like nervous system co-regulation; it felt like I was still alone. Despite that feeling, I kept typing.
Again, ChatGPT provided plenty of validation – and advice, lots of advice. Advice is good to hear every so often, but the laundry list of to-dos after ChatGPT broke things “down into parts” was a lot, and I told it so. It recommended we prioritize – a good tactical shift – and then offered the grounding technique of 5 things you can see, 4 you can touch, and so on. That was genuinely useful – but it was not connection. And that’s what I realized about using ChatGPT as a therapist: it can give rational approaches to problems, and you can talk to it all night, but it can’t hug you (and true, your therapist probably shouldn’t be hugging you either, but you know what I mean).
Human therapists have outperformed ChatGPT in certain interventions, as in this CBT study reported by psychiatry.org, and I understand that in America, where health insurance is costly and scarce, having a free resource to “talk to” can be helpful and warranted. And that’s cool. It doesn’t help, though, when one of the biggest issues we have is an upward trend in isolation and loneliness – a trend exacerbated by tools like ChatGPT. Maybe ChatGPT can help someone enough that they venture out into the world to reduce their own loneliness, though I’m going to surmise that the robot’s therapeutic skills are not quite at the level of understanding long-term emotional cues and patterns.
LLMs continue to advance at an astounding pace. The Nature article from last year is based on an outdated model, and the psychiatry.org article used a model that will also be surpassed, probably this year. The models will get smarter, they will learn new therapeutic techniques, and they will be the best mimic of humanity that humanity has ever produced. But will LLMs provide the connection and meaning that comes from sitting in a room across from your therapist after you said something subtle yet meaningful, and your tone was just so, and the therapist picked up on it, recognized its importance – and said nothing back, only smiled?
References:
https://www.psychiatry.org/news-room/news-releases/new-research-human-vs-chatgpt-therapists
