AI therapy offers accessible mental health support through technology, providing users with therapeutic conversations and exercises via apps and platforms.
It complements traditional therapy by offering continuous support, tracking progress, and providing immediate responses, though it doesn’t replace human therapists.
Ensuring ethical standards and privacy is essential for the successful integration of AI therapy in mental health care.
Artificial intelligence (AI) is ushering in a new era in mental health care, transforming everything from diagnostic accuracy to the delivery of therapeutic interventions (D’Alfonso, 2020).
With the increasing demand for mental health care, AI offers cost-effective support to practitioners and clients, and, in some cases, a replacement for human-led treatments (Minerva & Giubilini, 2023).
In this article, we highlight existing uses of AI in therapy and mental health treatment, along with their potential benefits, challenges, and risks. We also imagine a possible future for AI therapy in promoting mental wellbeing.
While AI has only recently come to the attention of many of us, Alan Turing wrote his seminal paper “Computing Machinery and Intelligence” as far back as 1950. It opens with the sentence, “I propose to consider the question, ‘Can machines think?’” (Turing, 1950, p. 433).
By 1966, we saw the beginning of artificial intelligence in psychology, with the early chatbot ELIZA convincing some users they were conversing with a real therapist (Mullins, 2005; Weizenbaum, 1976).
In recent years, with algorithms able to draw powerful statistical inferences from large amounts of data, AI has revolutionized the potential of mental health care to meet patient needs (D’Alfonso, 2020; Holohan & Fiske, 2021).
When OpenAI unleashed the conversational AI software ChatGPT upon the world in 2022, the wider population began to see the potential of AI for mental health, even ChatGPT therapy, and the creation of AI tools for therapists (Minerva & Giubilini, 2023; Nelson, 2024).
The impact of AI is proving impressive. A 2023 paper published in Information Systems Frontiers describes an AI assessment tool that is 89% accurate at identifying and classifying patients’ mental health disorders from only 28 questions—without human input (Tutun et al., 2023).
And that accuracy is only getting better. A 2024 study reported that some AI models identified mental health disorders with up to 100% accuracy on the study’s test data, depending on the method used (Alkahtani et al., 2024). As these technologies continue to evolve, their potential to support early diagnosis and treatment decisions is becoming even more promising.
Further, in a recent meta-review of research into mental health conversational agents, researchers noted that chatbots have the potential to “effectively alleviate psychological distress” and even result in therapeutic relationships being formed with the AI (Li et al., 2023, p. 9).
How is AI being used in mental health care?
Evolving digital technology and AI are transforming the field of mental health in multiple areas, including (D’Alfonso, 2020; Koutsouleris et al., 2022; Thakkar et al., 2024):
Prediction and detection
AI, especially machine learning (algorithms that learn patterns from data to make predictions), is increasingly being used to detect conditions such as Parkinson’s disease and to predict risks such as suicide attempts (Dooley, 2025; Govern, 2021).
Digital intervention
Web- and smartphone-based digital interventions (apps) enhance and personalize mental health care user experiences.
Digital phenotyping
Using sensor data from smartphones and other digital devices offers behavioral and mental health insights and supports the prediction of mental health conditions.
Natural language processing
Analyzing clinical texts and social media content provides a means to spot mental health states and supports the development of conversational agents for therapeutic intervention.
Emotion regulation support
AI-powered games, apps, and music-selection software are helping users regulate emotions for conditions like anxiety and schizophrenia.
Chatbots and virtual agents
These offer accessible therapy options for various mental health conditions, with approaches such as Cognitive-Behavioral Therapy and other therapeutic techniques.
Ecological momentary interventions
Mobile devices can support real-time psychological interventions and behavioral prompts, often drawing on user feedback and behavior to inform highly personalized therapy recommendations (see the sketch following this section).
Precision medicine in mental health
Challenges like delayed, inaccurate, or inefficient care can be eased with more precise diagnoses, prognoses, and treatment decisions. For example, researchers at UC Davis are using precision treatment approaches where AI helps recommend personalized therapies for teens with schizophrenia (Pflueger-Peters, 2020).
Ultimately, “AI is quickly becoming effective at performing several tasks in healthcare settings that we used to consider a human prerogative” (Minerva & Giubilini, 2023, p. 809).
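To make digital phenotyping and ecological momentary interventions more concrete, here is a minimal sketch of a trigger rule that combines smartphone-style sensor signals with a self-reported check-in. Every field name, threshold, and prompt below is invented for illustration; real systems are clinically validated and far more sophisticated.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class MomentarySnapshot:
    """A hypothetical point-in-time picture of the user (all fields invented)."""
    self_reported_stress: int  # 1-10 check-in score entered in the app
    hours_awake: float         # inferred from phone usage (digital phenotyping)
    steps_last_hour: int       # from the phone's step counter

def choose_prompt(snapshot: MomentarySnapshot) -> Optional[str]:
    """Return an in-the-moment prompt, or None if no intervention is warranted."""
    if snapshot.self_reported_stress >= 8:
        return "High stress detected. Try a 2-minute guided breathing exercise?"
    if snapshot.hours_awake > 18:
        return "You've been awake a long time. A short rest may help your mood."
    if snapshot.steps_last_hour == 0 and snapshot.self_reported_stress >= 5:
        return "A brief walk can ease tension. Up for 5 minutes outside?"
    return None  # No prompt; avoid over-notifying the user

print(choose_prompt(MomentarySnapshot(self_reported_stress=9,
                                      hours_awake=6.0,
                                      steps_last_hour=120)))
```

In practice, such rules are typically learned from data and reviewed by clinicians rather than hard-coded, but the underlying pattern of sensing, deciding, and prompting is the same.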
What Are the Tools of AI Therapy?
Mental health treatment depends on the patient’s ability to report their cognitive and emotional states, the course of their symptoms, and input from friends, relatives, and peers (Koutsouleris et al., 2022).
AI, machine learning, and other advanced technologies offer tools that support therapists in identifying and treating mental health conditions and in performing tasks that are otherwise time-consuming (Koutsouleris et al., 2022; Li et al., 2023).
While there are many AI tools for therapy, the following are particularly valuable.
Chatbots and virtual agents
AI therapy chatbots, such as Tess, Wysa, and Woebot, offer “virtual psychotherapeutic services and have demonstrated promising results in reducing symptoms of depression and anxiety” and helping address mental health issues in various populations, including the elderly (Holohan & Fiske, 2021, p. 1).
Such tools are increasingly integrated into practice, enabling more personalized and adaptive responses through multiple modes of interaction, such as text and voice (Li et al., 2023).
Mobile and instant messaging integration
Conversation agents (chatbots) can be integrated with mobile or instant messaging apps to “assist with diagnosis, facilitate consultations, provide psychoeducation, and deliver treatment” (Li et al., 2023, p. 1).
Such AI use has proven effective in reducing mental health issues, including depression and distress, with outcomes shaped by the quality of the human–AI therapeutic relationship (Li et al., 2023).
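As a rough illustration of how a conversational agent of this kind can be assembled today, the sketch below wraps a general-purpose large language model in a CBT-flavored system prompt. It assumes the OpenAI Python client and an OPENAI_API_KEY environment variable; the prompt wording and model name are placeholders, and purpose-built tools like Woebot or Wysa rely on clinically reviewed dialogue systems rather than a raw loop like this.

```python
# Minimal sketch of a CBT-flavored chat loop (illustrative only).
# Assumes: pip install openai, and OPENAI_API_KEY set in the environment.
from openai import OpenAI

client = OpenAI()

SYSTEM_PROMPT = (
    "You are a supportive assistant that uses cognitive-behavioral therapy "
    "techniques: reflect the user's feelings, help them identify automatic "
    "thoughts, and gently suggest reframes. You are not a licensed therapist; "
    "advise users in crisis to contact emergency services or a crisis line."
)

history = [{"role": "system", "content": SYSTEM_PROMPT}]

while True:
    user_input = input("You: ")
    if user_input.lower() in {"quit", "exit"}:
        break
    history.append({"role": "user", "content": user_input})
    # Send the whole conversation so the model keeps context across turns
    response = client.chat.completions.create(model="gpt-4o-mini", messages=history)
    reply = response.choices[0].message.content
    history.append({"role": "assistant", "content": reply})
    print(f"Bot: {reply}")
```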
Natural language processing
Natural language processing helps analyze patient language in conversations, chats, emails, and social media posts. It can detect patterns that correlate with mental health issues, such as depression or anxiety, and is a vital element of chatbots (Holohan & Fiske, 2021; Li et al., 2023).
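A toy sketch of this kind of analysis, using a standard scikit-learn text-classification pipeline: the handful of labeled posts below are invented for illustration, whereas real screening models are trained on large, clinically annotated corpora.

```python
# Toy NLP screening sketch: flag short texts showing possible distress.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = [
    "I can't sleep and everything feels hopeless",
    "I dread every morning and can't focus at work",
    "Had a great walk with friends today",
    "Looking forward to the weekend trip",
]
labels = [1, 1, 0, 0]  # 1 = possible distress language, 0 = neutral/positive

# TF-IDF features over unigrams and bigrams, fed to a linear classifier
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(texts, labels)

new_post = "I feel so alone lately and nothing helps"
prob = model.predict_proba([new_post])[0][1]
print(f"Estimated probability of distress language: {prob:.2f}")
```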
Machine learning models for diagnosis
Machine learning models can be used in research and practice to predict the existence and type of mental disorders. Models are trained using participants’ answers to assessment questions and other historical data (Tutun et al., 2023).
These decision support systems (DSS) can assist mental health professionals by analyzing data, recommending mental health diagnoses and treatments, and supporting evidence-based treatment decisions.
In one study, researchers concluded that “accurate diagnosis for mental disorders through this proposed DSS can reduce the overall healthcare cost due to misdiagnosis, overdiagnosis, and unnecessary treatment” (Tutun et al., 2023, p. 1271).
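The general pattern behind such a DSS can be sketched in a few lines: train a classifier on answers to assessment questions, then evaluate it on held-out patients. The synthetic data below merely stands in for real assessments; this is not the pipeline Tutun et al. (2023) actually used.

```python
# Generic diagnostic-classifier sketch on synthetic questionnaire data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_patients, n_questions = 500, 28          # e.g., 28 yes/no screening items
X = rng.integers(0, 2, size=(n_patients, n_questions))
# Synthetic label: a hidden rule over a few items, with 10% label noise
y = ((X[:, 0] + X[:, 3] + X[:, 7] >= 2) ^ (rng.random(n_patients) < 0.1)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print(f"Held-out accuracy: {accuracy_score(y_test, clf.predict(X_test)):.2f}")
```

A real system would add careful validation, calibration, and clinician review before any recommendation reaches a patient.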
AI offers mental health practitioners significant organizational and time-saving benefits, including (Bibhudatta, 2023):
Automatically taking notes during video meetings
Reviewing and summarizing client notes
Creating tailored exercises, activities, and interventions
Streamlining billing
Arranging meetings and managing calendars
Yet the potential of AI goes much deeper. Researchers are currently assessing its capacity to transform each of the following areas and ultimately support more positive client outcomes.
The benefits of using AI in therapy include (Holohan & Fiske, 2021; Koutsouleris et al., 2022; Li et al., 2023; Prescott & Hanley, 2023; Tutun et al., 2023):
Being more accessible and convenient
24/7 availability, providing immediate support when and where needed; removing geographical, financial, and time constraints
Analyzing vast datasets
Identifying behavioral patterns in what specific groups and populations do and say across many aspects of life, far beyond human capacities
Being more cost-effective
Reducing costs, such as the need for physical spaces (offices) and equipment, and limiting expenditure to software and licensing
Reducing stigma
Clients may prefer talking to an AI therapist rather than a human one; they may feel more psychologically safe and less judged, especially in the initial phases of therapy.
Being more efficient in diagnosing and monitoring
With the capacity to handle large amounts of data, AI can assist in diagnosing mental health conditions and monitoring treatment.
Perhaps above all else, the most substantial advantage AI has in therapy and supporting mental health is its ability to tailor communications, activities, feedback, and counseling to the specific needs of the client at that particular point in time (Holohan & Fiske, 2021; Tutun et al., 2023).
Can the Challenges and Criticisms Be Overcome?
While there are considerable benefits to embracing AI in therapy, there are still challenges and risks to using tools such as ChatGPT for therapy and AI tools for therapists (Minerva & Giubilini, 2023).
Concerns and questions have been raised regarding how much we should trust the advice and guidance of such systems and their potential for improving mental health outcomes (Koutsouleris et al., 2022).
Challenges and criticisms include the following (Minerva & Giubilini, 2023):
Lack of empathy
AI does not have the capacity to empathize and form genuine connections with clients, which are vital in therapy.
“It seems unlikely that AI will ever be able to empathize with a patient, relate to their emotional state, or provide the patient with the kind of connection that a human doctor can provide” (Minerva & Giubilini, 2023, p. 809).
Complexity of human psychology
Algorithms and data patterns cannot address the nuanced needs of each individual because human psychology is too complex.
Loss of patient autonomy
Overreliance on AI for mental health care could lead to clients becoming overly dependent on such tools for emotional support and decision-making, potentially reducing their ability to manage their mental health independently.
Unknown long-term effects
It is unclear how prolonged reliance on AI for mental health support could impact clients or the nature of human relationships.
Ethical and privacy concerns
There are significant ethical and privacy concerns related to the use of AI in therapy, including (Minerva & Giubilini, 2023):
Data privacy and security: AI models collect vast amounts of personal data that could be exposed or misused.
Bias and fairness: As with humans, AI can learn bias, potentially impacting the treatment provided or leading to misdiagnosis or inappropriate therapy.
Loss of personal touch
“One of the obvious costs associated with replacing a significant number of human doctors with AI is the dehumanization of healthcare” (Minerva & Giubilini, 2023, p. 809).
Overcoming the limitations of AI
So, how do we get around some of the limitations of AI identified above?
Good training
AI models must be trained with high-quality, unbiased data. “The development of robust predictive models starts with high-quality, reliable, and sufficiently representative data that capture both the variability, complexity, and specificity of the targeted phenomena” (Koutsouleris et al., 2022, p. 831).
Adequate data privacy and security measures
Privacy is paramount. Data must be secure against breaches and unauthorized access.
Consent and transparency
Users must be made aware of how and by whom their data will be stored, used, and accessed, and they must be helped to fully understand the implications of AI data processing.
Ensuring AI is fair and unbiased
Procedures and controls must be put in place to ensure that AI tools do not learn bias by adopting discriminatory practices or providing unequal treatment.
Accountability and liability
Clear guidelines are required for who is held accountable for negative outcomes. Is it the AI developer, the therapist using the tool, or the practice owner?
Informed decision-making
AI recommendations are just that. We should use them to support rather than replace human judgment and decision-making.
Maintaining oversight
If therapists and mental health practitioners use AI tools, they must monitor how those tools are performing and review the advice and guidance they offer.
Based on the benefits and challenges we’ve seen for AI in mental health care, artificially intelligent therapists seem unlikely to replace human therapists—at least for now (Minerva & Giubilini, 2023).
And yet, AI therapist tools and software can be valuable for supporting and augmenting care provided to clients, potentially improving the quality of and access to mental health services (Minerva & Giubilini, 2023).
For the moment, the role of AI in therapy is best seen as complementary, providing additional resources and support while leaving the core therapeutic relationship and decision-making to human professionals (Koutsouleris et al., 2022). But this may be starting to shift.
In 2024, Cedars-Sinai launched Xaia, a conversational AI therapy app designed for Apple Vision Pro. It guides users through self-directed therapy sessions in calming virtual environments, led by a digital avatar trained to simulate a human therapist (Cedars-Sinai, 2024).
While still in its early stages, tools like Xaia hint at a future where AI doesn’t just support therapists but increasingly takes on therapeutic roles itself.
What Does the Future Hold for AI Therapy?
In their article “Is AI the Future of Mental Healthcare?” researchers Francesca Minerva and Alberto Giubilini (2023) identify several key points about the future of AI in mental health care.
Hybrid approach
The future of mental health care will likely involve a hybrid approach, combining the strengths of AI and human therapists.
Cost-effectiveness and accessibility
As a result of AI, mental health care will become more cost-effective and less constrained by skilled staff shortages, making it more accessible globally and potentially improving care for a greater number of people.
Dehumanization concerns
Empathy and trust, core aspects of health care provision, might be at risk of being lost with the use of AI.
Potential benefits in psychiatry
Advances in AI in psychiatry might yield positive results, potentially challenging the traditional view of psychiatry as grounded in human connection.
AI in diagnosing mental illness
We will likely see further improvements in diagnosing mental illnesses by analyzing large amounts of diverse data from various sources like medical records, social media, and wearable devices.
AI for specific patient groups
AI will continue to be particularly useful for individuals who find human interaction challenging or are concerned about stigma, such as those with depression or autism.
Beyond that, the direction and capacity of AI to support mental health remain challenging to predict and easy to underestimate.
Youper, an AI-powered app, offers quick, evidence-based conversations grounded in Cognitive-Behavioral Therapy (CBT) to support mental health in real time. It helps users screen for common mental health conditions, track emotional patterns, and gain personalized insights, all in one easy-to-use platform.
Developed by mental health professionals, Yuna helps reduce stress, boost mood, and build confidence through personalized conversations grounded in evidence-based techniques.
With voice-to-voice chat, emotional check-ins, and adaptive sessions tailored to your day, Yuna offers a private, secure space to grow at your own pace.
We are all seeing AI’s potential to transform our experience of the world, including digital art, journalism, online shopping, driving, and how we interact with our technology.
Therefore, it is no surprise that AI is increasingly used in health care, supporting mental and physical health practitioners in diagnosing patients and finding the best treatments (Minerva & Giubilini, 2023).
Advanced technologies like large language models, made popular by ChatGPT’s release in 2022, are being explored for their potential in mental health care to generate sophisticated responses and interactions, supporting the mental health of those in need.
AI provides cost-effective support for clients in an overwhelmed mental health system, bridging the gap where traditional services struggle to meet the rising demand for treatment.
Although current advances in tools and technology aren’t yet ready to replace human mental health professionals, they can play a crucial role in enhancing and supplementing the care given to clients, potentially improving the standard and accessibility of mental health services (Minerva & Giubilini, 2023).
As we witness AI’s growing role in mental health care, it’s an ideal time for counselors and coaches to explore AI tools. Such innovations can enrich their practice, offering fresh approaches to support clients more effectively in today’s digital world.
What are the potential risks of relying too heavily on AI for mental health support?
Relying heavily on AI for mental health support poses risks such as loss of empathy and personal connection, reduced patient autonomy, and ethical concerns related to data privacy (Minerva & Giubilini, 2023). AI lacks the ability to form genuine therapeutic relationships, which are vital for effective mental health care.
Is AI good for mental health?
AI can be beneficial for mental health by providing accessible, cost-effective support and assisting with diagnosis and treatment monitoring (D’Alfonso, 2020). However, it should complement human therapists rather than replace them to ensure comprehensive care.
Can AI help with anxiety?
AI can help with anxiety through chatbots and apps that offer Cognitive-Behavioral Therapy techniques and personalized interventions (Li et al., 2023). These tools provide immediate support and can be effective for managing symptoms.
References
Alkahtani, H., Aldhyani, T. H., & Alqarni, A. A. (2024). Artificial intelligence models to predict disability for mental health disorders. Journal of Disability Research, 3(3).
Bibhudatta, D. (2023, November 29). Generative AI will transform virtual meetings. Harvard Business Review. https://hbr.org/2023/11/generative-ai-will-transform-virtual-meetings
Cedars-Sinai. (2024, February 2). Cedars-Sinai behavioral health app launches on Apple Vision Pro. Cedars-Sinai. https://www.cedars-sinai.org/newsroom/cedars-sinai-behavioral-health-app-launches-on-apple-vision-pro/
D’Alfonso, S. (2020). AI in mental health. Current Opinion in Psychology, 36, 112–117.
Dooley, K. (2025, April 14). UF professor develops AI tool to better assess Parkinson’s disease, other movement disorders. University of Florida News. https://news.ufl.edu/2025/04/visionmd-/
Govern, P. (2021, March 15). Artificial intelligence calculates suicide attempt risk at VUMC. VUMC News. https://news.vumc.org/2021/03/15/artificial-intelligence-calculates-suicide-attempt-risk-at-vumc/
Holohan, M., & Fiske, A. (2021). “Like I’m talking to a real person”: Exploring the meaning of transference for the use and design of AI-based applications in psychotherapy. Frontiers in Psychology, 12.
Koutsouleris, N., Hauser, T. U., Skvortsova, V., & De Choudhury, M. (2022). From promise to practice: Towards the realisation of AI-informed mental health care. The Lancet Digital Health, 4(11), e829–e840.
Li, H., Zhang, R., Lee, Y.-C., Kraut, R. E., & Mohr, D. C. (2023). Systematic review and meta-analysis of AI-based conversational agents for promoting mental health and wellbeing. npj Digital Medicine, 6(1), 236.
Minerva, F., & Giubilini, A. (2023). Is AI the future of mental healthcare? Topoi, 42(3), 809–817.
Mullins, J. (2005, April 20). Whatever happened to machines that think? New Scientist. https://www.newscientist.com/article/mg18624961-700-whatever-happened-to-machines-that-think/
Nelson, J. (2024, January 20). Should you create content now that ChatGPT can? Forbes. https://www.forbes.com/sites/jamesnelson/2024/01/14/should-you-create-content-now-that-chatgpt-can/
Pflueger-Peters, N. (2020, September 11). Using AI to treat teenagers with schizophrenia. UC Davis Computer Science. https://cs.ucdavis.edu/news/using-ai-treat-teenagers-schizophrenia
Prescott, J., & Hanley, T. (2023). Therapists’ attitudes towards the use of AI in therapeutic practice: Considering the therapeutic alliance. Mental Health and Social Inclusion, 27(2), 177–185.
Thakkar, A., Gupta, A., & De Sousa, A. (2024). Artificial intelligence in positive mental health: A narrative review. Frontiers in Digital Health, 6.
Turing, A. M. (1950). Computing machinery and intelligence. Mind, 59(236), 433–460.
Tutun, S., Johnson, M. E., Ahmed, A., Albizri, A., Irgil, S., Yesilkaya, I., Ucar, E. N., Sengun, T., & Harfouche, A. (2023). An AI-based decision support system for predicting mental health disorders. Information Systems Frontiers, 25(3), 1261–1276.
Weizenbaum, J. (1976). Computer power and human reason: From judgment to calculation. Freeman.
About the author
Jeremy Sutton, Ph.D., is an experienced psychologist, coach, consultant, and psychology lecturer. He works with individuals and groups to promote resilience, mental toughness, strength-based coaching, emotional intelligence, wellbeing, and flourishing. Alongside teaching psychology at the University of Liverpool, he is an amateur endurance athlete who has completed numerous ultra-marathons and is an Ironman.