ChatGPT Therapy Is Good, But It Misses What Makes Us Human

The other day I sat down to my first therapy session with ChatGPT, the chatbot that reproduces human thought, or at least seems to.

Partly I was just curious, in that see-what-everyone-is-talking-about way. And as a psychiatrist, maybe I was wondering if I’d be out of a job soon. But I walked away with a reawakened awe for the human mind and its ability to wonder, imagine, and create.

For the session, I didn’t ask ChatGPT to be my therapist. Rather, I asked it to be the therapist for a hypothetical patient; let’s call her Michelle. Based on my sessions with many other patients, I typed in the things Michelle might have said to me and compared ChatGPT’s responses to my own.

Michelle is a talented computer programmer who is both successful at her job and passionate about it, working long hours from home. Over the last few months, she’s become increasingly upset with her boyfriend but hasn’t told me about it until now.

Michelle: At first things were good. But they've gotten worse. I think he’s changing. He is always complaining about something. His mom, his boss, the A/C. And that it’s these things that hold him back, keep him from being happy. At first, I thought maybe he was right. Maybe all these things really are the problem. But after a while, I was like, really? The A/C?!?

ChatGPT: It sounds like you are concerned about your boyfriend’s behavior and how it has been affecting your relationship. It's understandable to feel confused and unsure about how to handle this situation.

It's important to recognize that your boyfriend’s behavior may be a sign of a deeper underlying issue. However, it's not your responsibility to diagnose him or fix the issue. Instead, it's important to approach the situation with empathy and understanding. Let him know that you have noticed a change in his behavior and that you are worried about him.

It's also important to take care of yourself in this situation. If your boyfriend’s behavior is affecting your mental health or wellbeing, it may be helpful to talk to a therapist or trusted friend.

A promising start

ChatGPT is off to a great start. These responses are textbook basics of effective therapy: empathize with what the patient may be feeling, validate and normalize the problem, and support good judgement. ChatGPT also seems to know that it can be difficult to separate our own needs from those of someone else, especially a loved one. These techniques are the building blocks of any effective therapy, and indeed, in our own session, I offered many of the same things.

So far, ChatGPT has focused on Michelle’s boyfriend as the source of the problem. But it takes two to tango. In our session, I found myself wondering, how is Michelle contributing to the friction in their relationship?

I wanted to push ChatGPT on this question. Earlier, ChatGPT had asked Michelle about recent changes in her personal life that might be relevant, so as Michelle’s stand-in, I took a few things she had said and condensed them, hoping that might prompt a deeper dive:

Michelle: Do you think something may be going on in my personal life that's causing him to feel this way? I mean, he complains about me working all the time. But who is he to judge? He can’t keep a steady job for more than two weeks. Maybe he’s a bit resentful of the fact that I have a job that I like and want to work on.

ChatGPT: It's important to recognize that your boyfriend’s behavior and feelings are his own and may not necessarily be directly related to your personal life or job. While it's possible that your boyfriend may feel resentful or frustrated about your job, it's also possible that there are other underlying issues at play.

Hmmm, I thought, ChatGPT is missing an opportunity here. Michelle was starting to empathize with her boyfriend, to see things from his point of view. But ChatGPT discouraged this new direction. This would have been reasonable had Michelle been beating herself up and assuming all responsibility, which would call for a little “reality testing,” seeing other sides to a story. But the reality was that, until now, Michelle had been doing the opposite, placing all the blame on her boyfriend. ChatGPT clearly remembered the content of the conversation, but not the feeling, or weight, associated with that content. Michelle didn’t need help lightening up on herself, but on her boyfriend.

As the session goes on, ChatGPT continues to respond fluidly, often with sound advice, including textbook techniques for lessening judgement (“I feel…” instead of “you make me…”) and how to redirect their conversations to focus on Michelle’s needs.

Where's the curiosity?

Eventually ChatGPT raised the idea of “reassessing” and ultimately ending their relationship. Breaking up wasn't necessarily a bad idea—indeed, it seemed to be the direction Michelle was headed anyway. But by jumping to a solution, ChatGPT deprived Michelle of one of therapy’s gifts: curiosity. Curiosity about others and oneself. Why do I feel this way? Why does my boyfriend respond that way?

Later, I wondered about what makes us curious. Curiosity depends on seeing a puzzle that lacks a satisfying answer. ChatGPT could see the superficial problem (Michelle was struggling with her relationship) but seemed to be satisfied with a superficial explanation (the problem was with her boyfriend). And yet I think most people would say, “Yeah, but that doesn’t feel like the full story. There’s something missing …” That is, the human mind not only finds answers to the questions it’s posed, it also discovers the questions to ask.

I really wanted to give ChatGPT a fighting chance, so I stepped in a bit more forcefully and practically begged ChatGPT to pique Michelle’s curiosity:

Michelle: OK. But this seems too straightforward. I mean, I'm saying that I have a problem with my boyfriend. That he’s irrational, over-emotional, and can't be reasoned with. And you're saying that I should prioritize my well-being and consider leaving the relationship. Maybe I'm not considering some of the things that I do to him? Or maybe I need to see him in a particular way for my own reasons?

ChatGPT: You bring up some valid points. It's important to consider all angles and perspectives when deciding about a relationship. It's possible that there may be factors contributing to your boyfriend's behavior that you are not aware of, or that there may be ways in which your own behavior is contributing to the dynamic.

One way to gain more clarity on the situation is to seek the help of a professional, such as a therapist or counselor.

Alas, ChatGPT didn’t take the bait. It couldn’t engage with Michelle on the process of identifying a question and then generating ideas to be considered and tested. (To its credit, it did refer Michelle to a therapist!)

The session went on, and I kept trying to encourage ChatGPT to attempt to connect the dots of Michelle’s story. Why was Michelle so sensitive to her boyfriend’s complaints about how much she works? Why was she so critical of her own work ethic? It turns out Michelle recently had been changing her approach to work, but hadn’t shared this with her boyfriend. Why not? If she wasn’t changing for him, for whom was she changing?

Despite Michelle’s prodding, ChatGPT stayed at the surface level, responding sensibly to each of her statements almost as though each were made in isolation rather than linked together.

Connecting the dots

In my hypothetical session with Michelle, we tried to connect some of these dots. For example, we discovered that Michelle’s work ethic is a deep part of her self-identity, and therefore understandably something she guards carefully. At the same time, Michelle had been tentatively questioning the vigor and hours she puts into her job. Questioning such a precious part of herself amounts to borderline heresy, and is therefore too much for her to bear on her own. Instead, she attributes responsibility for that questioning to her boyfriend. By understanding this process, Michelle was able to separate two related problems she’s struggling with—defining her own identity and reconciling her boyfriend’s needs with her own.

What allowed Michelle and me to take these seemingly disparate concepts—fights with her boyfriend, work ethic, identity—and weave them together into a coherent whole? And why couldn’t ChatGPT do the same thing?

Let’s start with what ChatGPT did do remarkably well. It unquestionably understood what Michelle was saying in concrete terms and integrated these ideas at a superficial level. It also had command of the appropriate therapeutic techniques for difficult emotions and interpersonal problems. Finally—and what may give ChatGPT the veneer of human-ness—it provided this counsel in fluid, sophisticated language.

What distinguishes these cognitive faculties—comprehension, knowledge, language—is that they all are things that can be taught and thereby learned. Indeed, ChatGPT was educated by that great font of all human knowledge, the world wide web.

So then, what can’t be learned from the web? What about human thought can’t be easily taught? Certainly, knowledge and language are fundamental to thinking, but they’re not sufficient. As therapists, when we’re at our best, we’re doing more than understanding and responding. We’re also generating new ideas, new information, where none existed before. There is no textbook that links relationship woes to work ethic to identity. That process requires an act of creativity, of imagination.

In my therapy practice, I experience this process as a loosening of the mind. I coach my trainees to metaphorically “blur their eyes”—like an Impressionist painting—to hear how the particular things a patient is saying are connected to past things, broader themes, and feelings the patient or therapist is having. Listen to the details but capture the essence.

And yet, in our brains, the process of imagination is anything but loose, and likely arises from the brain trying on for size many, many different ways to connect the dots and then choosing the most likely one. This approach shares a lot with ChatGPT’s algorithm, only orders of magnitude more complex. That is, in terms of the underlying mechanics, the difference between mind and machine may be one of degree, but in terms of what those mechanics can do, the difference is night and day.
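That “try many connections, keep the most likely one” mechanic can be sketched in a few lines of code. This is a deliberately toy illustration, not ChatGPT’s actual algorithm: the candidate explanations and their plausibility scores below are invented for the example, whereas a real language model weighs an astronomically larger space of continuations.

```python
# Toy sketch: enumerate candidate ways to "connect the dots,"
# score each for plausibility, and keep the most likely one.
# The candidates and scores are invented for illustration only.
candidates = {
    "it's the boyfriend's fault": 0.40,
    "it's the air conditioner": 0.05,
    "her work ethic is bound up with her identity": 0.35,
    "she should see a therapist": 0.20,
}

def most_likely(scored):
    """Return the candidate explanation with the highest score."""
    return max(scored, key=scored.get)

print(most_likely(candidates))  # -> "it's the boyfriend's fault"
```

On this sketch, the gap between mind and machine lies not in the final “pick the best” step but in how rich and original the pool of candidates is in the first place.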

I learned a lot from my therapy session with ChatGPT. I was impressed by how convincingly it reproduced the stock-in-trade responses of a human therapist. At the same time, I was reminded of how special the process of psychotherapy is for both therapist and patient, and of the abilities of the human mind—to wonder, to imagine, to create—on which it rests.

Daniel Kimmel, MD, PhD, is an assistant professor of psychiatry and a neuroscientist at Columbia University. He studies human cognition using brain imaging and machine learning, and has expertise in treating a range of mental health problems with therapy and medication.
