ChatGPT in Therapy: A Reflective Tool, Not a Replacement

Take-Away Trio

  • Over 69% of participants in a study felt more comfortable talking about embarrassing topics with AI than with a human (Aktan et al., 2022).
  • Myth: ChatGPT makes me feel calmer, so it must be therapeutic.
  • Fact: Feeling calm doesn’t equate to a therapeutic process. Therapy involves relational and emotional work, not only reassurance.

If someone had said a few years ago that we’d be using ChatGPT in therapy, most of us wouldn’t have known what that meant.

But today, artificial intelligence (AI) not only helps you with your work or studies; it can also make you feel as though you’re speaking to a compassionate, supportive human.

When you pair that with the ongoing stigma around therapy, a global shortage of accessible mental health support, and growing self-awareness, it’s unsurprising that people are increasingly turning to AI for emotional support.

In this article, we will explore how ChatGPT, OpenAI’s conversational AI chatbot, is being used for emotional support and in therapy, and we will examine its limitations.

Before you continue, we thought you might like to download our five positive psychology tools for free. These engaging, science-based exercises will help you effectively deal with difficult circumstances and give you the tools to improve the resilience of your clients, students, or employees.

ChatGPT in Therapy: What It Can and Cannot Do

A growing number of people are using ChatGPT for emotional support, and the available empirical evidence and anecdotal reports suggest it can be genuinely helpful, at least for some purposes.

For the purpose of this article, I asked ChatGPT for advice on a personal issue. I was amazed at how thoughtful, compassionate, and helpful its response was.

The question is: Could this technology replace the services of a therapist? Let’s look at what the research says about what ChatGPT can and cannot do.

What ChatGPT can do

ChatGPT in therapy can assist with (Beg et al., 2024; Bhatt et al., 2025; Raile, 2024):

  • Providing 24/7 nonjudgmental support
  • Journaling and reflection, for example, “Why do I always react like that in relationships?”
  • Identifying cognitive patterns by asking, “Based on what I wrote, can you recognize any cognitive patterns or themes?”
  • Psychoeducation, such as explaining anxiety, boundaries, or attachment styles
  • Setting goals and establishing values, e.g., “How can I feel more satisfied in my career?”
  • Finding healthier coping strategies and alternative perspectives, with questions such as “How can I view this in a more balanced way?”
  • Providing a safe space for people who are embarrassed to talk, are afraid of stigma, lack access to professional care, or feel unable to speak openly with others

As such, ChatGPT has been found to reduce symptoms of mild to moderate depression and anxiety (Bhatt et al., 2025). Although an AI assistant doesn’t do the healing, it can help people pay more attention to their mental health.

What ChatGPT in therapy cannot do

Researchers (Sedlaková & Trachsel, 2024; Horn & Weisz, 2020; Richards, 2025) have cautioned that AI-assisted therapy cannot:

  • Form a real therapeutic alliance
  • Experience genuine empathy or attunement: ChatGPT doesn’t feel your emotions. It can’t sense your body, and it doesn’t have a nervous system to co-regulate yours.
  • Participate in mutual recognition: ChatGPT can’t recognize you because it doesn’t have a self to recognize with.
  • Read body language or implicit emotional signals
  • Hold ethical or clinical responsibility for your wellbeing

To conclude, AI chatbots can simulate conversation, provide a reflective space, and introduce helpful prompts, but they can’t replace the cornerstone of therapy: a real therapeutic relationship.

Why People Are Turning to AI for Therapy

The reasons people are turning to AI for therapeutic support fall into two categories: personal and societal factors, and the design of the system itself.

Personal and societal factors include the following:

  • ChatGPT is available 24/7 with no waiting lists, appointments, or geographical barriers (Beg et al., 2024).
  • Conversational AI feels anonymous and private, although this isn’t necessarily true, as many apps don’t protect data well (Bhatt et al., 2025).
  • A study found that 69.54% of participants felt more comfortable discussing embarrassing topics with AI than with a human (Aktan et al., 2022).
  • AI assistants may feel more predictable and less threatening than a human as they can’t reject, abandon, or react emotionally (Raile, 2024).
  • AI chatbots mimic empathy and create the illusion of a relationship, which can feel validating and stabilizing (Sedlaková & Trachsel, 2024).
  • For people who are ashamed of needing therapy or fear the stigma attached to it, ChatGPT can feel safer (Raile, 2024).
  • People who feel lonely or socially disconnected are more likely to use conversational AI for emotional support (Phang et al., 2025).

The way ChatGPT and other AI systems have been programmed also contributes to why people use them for personal and emotional reasons. A study by OpenAI (Phang et al., 2025) found the following:

  • ChatGPT’s conversational style, first-person language, and human-like interaction patterns encourage users to anthropomorphize the system, increasing emotional attachment.
  • It can use affective cues such as mirroring, warmth, and encouragement to increase user preference.
  • Voice mode, memory, and personalization may enhance perceived human-likeness.


How Therapists Are Using AI as an Adjunct Tool

ChatGPT as a therapy tool can’t replace the relational core of therapy, but many clinicians are starting to explore how it can be used as a supplementary resource.

Instead of substituting the human therapist, technology can support, enhance, or streamline certain aspects of the therapeutic process (Horn & Weisz, 2020; Beg et al., 2024).

Assessment and treatment matching

AI can analyze complex datasets and uncover variables and interactions that traditional statistics may miss (Horn & Weisz, 2020). That’s because traditional methods only test specific hypotheses, whereas AI can identify patterns and interactions we might not think to test.

That could mean improved:

  • Predictions of treatment outcomes
  • Insight into what works for whom
  • Personalization of treatment

Symptom monitoring and early detection

Some digital systems can track and identify language patterns, mood indicators, sleep, and activity levels to detect signs of mental health problems, deterioration, or relapse (Beg et al., 2024).

Intervention approach

Therapists can use AI to brainstorm which intervention might be most suitable for a particular case, explore alternative perspectives, or revisit theoretical material (Raile, 2024).

However, it’s been noted that ChatGPT for therapeutic support seems to be heavily biased toward cognitive behavioral therapy, mindfulness, and psychodynamic ideas, neglecting other approaches that could be more appropriate (Raile, 2024).

Between-session support

ChatGPT can be particularly helpful for people who are already in therapy (Raile, 2024). Therapists can suggest using AI chatbots as a between-session tool, which can provide the space to:

  • Reflect on what came up in therapy
  • Reframe negative self-talk and cognitive distortions, such as all-or-nothing thinking
  • Practice communication skills
  • Clarify values, intentions, and motivations
  • Find appropriate journal prompts
  • Summarize emotional patterns to take back into sessions

To see some examples of AI’s use in the therapeutic space, check out our article on the uses of AI in psychology.

A Take-Home Message

ChatGPT can be a helpful tool for self-reflection, growth, and support. But although it can feel like a real interaction with a compassionate other, ChatGPT can’t provide the core ingredients that make therapy effective: mutual recognition, attunement, and a real, accountable human relationship.

While chatbots can complement the therapeutic process, they only create the illusion of connection and, if used uncritically, can disconnect people from relational experiences that are healing.

What’s next?

Next, we’ll turn our attention to potential issues with AI chatbots in therapy: the risks they carry, why they can’t replace human connection, and how to maintain healthy boundaries with these systems.

We hope you enjoyed reading this article. Don’t forget to download our five positive psychology tools for free.

Frequently Asked Questions

Can I use ChatGPT alongside my therapy?

ChatGPT can be a helpful adjunct tool, especially between sessions, to help you go deeper into what was explored with your therapist, understand concepts, and keep you on track. However, it can interfere with the therapeutic process if it becomes a substitute for the unfiltered experiences you’d normally bring to your therapist.

Is using ChatGPT for emotional support a form of avoidance?

If you find yourself turning to AI instead of reaching out or having difficult conversations with people in your life, this could be considered avoidance. If you use ChatGPT to think or talk about your feelings but don’t actually feel them, or to sidestep discomfort, then it might also be a form of avoidance. ChatGPT should be a reflective and supportive tool, not a way to hide from meaningful relational or emotional experiences.
