AI-assisted therapy increases access & efficiency in mental health care.
AI can support — but not replace — human therapists.
Privacy, bias & hallucinations present real risks and require careful oversight.
The expansion of artificial intelligence (AI) can be seen in every industry across the world.
Recent industry reports estimate that the global AI market reached approximately $757 billion in 2025 (Precedence Research, 2025).
Adoption of AI tools by businesses increased to 60% in 2025, and health care is one of the leading industries adopting these tools (Zhang et al., 2025).
As AI is integrated into society, it is offering the mental health care field new ways to support clients, practitioners, and researchers (Zhang et al., 2025).
As mental health professionals face growing demand and administrative burden, many are asking: Where does AI-assisted therapy fit within ethical, evidence-based care?
To answer that, this article will explore various AI-assisted therapy tools as well as the challenges and opportunities that come with them.
Artificial intelligence (AI) models are computational frameworks capable of learning. These models extract patterns from data and are trained to perform specific tasks. The most commonly used models in health care include:
Deep-learning models
Deep-learning models use neural networks to process large quantities of unstructured data. They are important in handling complicated medical data sets and help practitioners understand mental health diagnoses and treatment options (Zhou et al., 2022).
Machine-learning models
Machine-learning (ML) models learn from data to improve performance on tasks such as classification, clustering, and regression. They can predict outcomes, such as how a client is likely to respond to a specific treatment (Zhou et al., 2022).
Natural language processing models
Natural language processing (NLP) models and large language models (LLMs) specialize in understanding human language. These models enable chatbots to analyze and respond to clients in a way that seems empathetic (Weber-Guskar, 2021).
Reinforcement-learning models
Reinforcement-learning (RL) models make chatbots and other apps more accurate over time: the system interacts with its environment (or the client), receives feedback, and adjusts its responses accordingly (Weber-Guskar, 2021).
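To make the "learning from data" idea concrete, here is a minimal, purely illustrative sketch of a toy text classifier in Python. The phrases, labels, and the simple word-overlap scoring below are hypothetical teaching devices, not clinical data, a screening tool, or any vendor's actual model.

```python
# Hypothetical sketch of "learning from data": a toy bag-of-words
# classifier that labels short texts by which class's training
# vocabulary they overlap with most. Illustrative only -- not a
# clinical tool.
from collections import Counter

def tokenize(text):
    return text.lower().split()

def train(examples):
    """Build one word-frequency profile per label from (text, label) pairs."""
    profiles = {}
    for text, label in examples:
        profiles.setdefault(label, Counter()).update(tokenize(text))
    return profiles

def predict(profiles, text):
    """Score a new text by word overlap with each learned profile."""
    words = tokenize(text)
    scores = {label: sum(counts[w] for w in words)
              for label, counts in profiles.items()}
    return max(scores, key=scores.get)

# Tiny, made-up training set (hypothetical labels, not a diagnosis).
examples = [
    ("i feel calm and rested today", "low_distress"),
    ("i slept well and enjoyed time with friends", "low_distress"),
    ("i cannot stop worrying about everything", "high_distress"),
    ("i feel anxious and tense every night", "high_distress"),
]

profiles = train(examples)
print(predict(profiles, "worrying and anxious again"))  # prints "high_distress"
```

Real systems replace the word-overlap scoring with statistical models trained on far larger data sets, but the core loop is the same: learn patterns from labeled examples, then apply them to new input.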
These AI models have led to the creation of practical tools clinicians can use to improve the speed and efficiency of work and provide more accurate and consistent support for clients.
Types of AI-Assisted Therapy Tools
There are various AI tools built upon the AI models described above that can assist professionals with daily tasks, administrative issues, and direct client care.
AI note-taking and documentation
AI tools can help practitioners with clinical documentation and progress notes. HIPAA-compliant note-taking systems can record audio of sessions and create SOAP notes as well as treatment plans. AI platforms also provide customizable templates for groups and individuals that can save time and mental energy.
Mental health apps
Mental health apps offer anonymous support between sessions to provide guidance, advice, and understanding. These apps often use evidence-based techniques such as cognitive behavioral therapy and mindfulness to assist clients with mental health issues.
Combining AI apps and chatbots with professional therapy can benefit patients and clinicians. AI-assisted therapy tools help professionals track data recorded between sessions and collaborate on care (Haque & Rubya, 2023).
Diagnosis and treatment
AI-assisted therapy tools have been designed to analyze speech, text, and even facial expressions. These tools can support early detection of mental health disorders, predict treatment responses, and flag potential risks (Hadar-Shoval et al., 2024). Clinicians can use them to create precise and proactive care strategies for clients.
Administrative assistance
Virtual assistants can help clinicians with scheduling, billing, intake sessions, and client follow-ups.
For example, RescueTime is an AI tool that tracks how time is spent and digitally assesses patterns to help optimize productivity. Other administrative tools can create reports on expenses and sources of income.
AI tools can help with efficiency, improve documentation and client engagement, and provide data to track everything from client progress to business expenses and income.
Benefits and Risks of AI-Assisted Therapy
AI-assisted therapy opens a world of possibility and uncertainty. Many of these tools are chatbots: mobile applications built on conversational AI (CAI).
This therapy often promises relief from depression, anxiety, and other mental health conditions (Hadar-Shoval et al., 2024). But as with any new technology, concerns about the unknown exist.
Benefits
AI-assisted therapy can increase accessibility, personalize support, and offer convenience that traditional therapy does not.
– More options for care
The potential for AI-assisted therapy to help individuals is essentially limitless. CAI can provide care 24-7, offer affordable options to vulnerable populations, and work with individuals who are resistant to therapy due to fear of stigma and judgment (Hadar-Shoval et al., 2024).
The convenience and accessibility that AI-assisted therapy offers are among its most appealing qualities.
– Improved quality of care
Advances in AI technology can also improve the quality of care by enhancing diagnostic accuracy, empowering clients, supporting the implementation of techniques in day-to-day life, and personalizing the treatment of mental health issues (Sedlakova & Trachsel, 2023).
CAI can offer effective prevention and relapse-prevention support for individuals with mild depression (Sedlakova & Trachsel, 2023). AI-assisted therapy is most effective when used as a supplemental tool alongside a human therapist.
– Increased client engagement
One review of literature on AI-assisted therapy found that the additional support of CAI improved therapeutic outcomes and kept clients engaged between sessions (Humayun et al., 2025). Clients who used CAI also demonstrated increased self-knowledge of their conditions and treatment options.
Risks
With benefits, there are also pitfalls. AI-assisted therapy is criticized for lacking human connection, carrying technological bias, and creating privacy risks.
– Lack of human interaction
As amazing as AI chatbots can be, conversational AI cannot replicate true empathy or human interaction (Weber-Guskar, 2021). The therapeutic alliance is arguably one of the most critical components of effective treatment, and many of the positive effects of therapy are lost without this important human relationship.
– Unintentional bias
As AI technology is developed by humans, it can inadvertently perpetuate biases that are programmed into apps, chatbots, and other conversational devices that provide crucial feedback to clients (Weber-Guskar, 2021).
Clients seeking help for mental health issues are already in a vulnerable state, and AI technology does not have the ability to immediately self-correct or repair damage it may cause.
Consulting ChatGPT for psychological purposes can have detrimental effects, as it may guide clients toward poor decisions (Booth, 2025).
– Privacy and confidentiality concerns
One of the biggest concerns about any new technology is ensuring confidentiality and maintaining privacy and anonymity.
Digital platforms can store immense amounts of data, and creating HIPAA-compliant options that have barriers to prevent breaches continues to be critical in the expanding field of mental health (Zhang et al., 2025).
– Hallucinations
In conversational AI, hallucinations are confidently stated outputs that are fabricated, false, or unsupported by data (Booth, 2025). Outside of therapeutic settings, where a therapist cannot intervene, hallucinations can lead to harmful clinical guidance, misdiagnosis, and the reinforcement of delusions.
AI-Powered Mental Health Chatbots
Chatbots are AI-powered tools that can be downloaded as an app and act as a digital therapist. Many have friendly, upbeat personalities and address various emotional and behavioral challenges in mental health (Haque & Rubya, 2023).
Characteristics such as a soft tone and the ability to engage in casual conversation make chatbots feel more personal and less like a medical tool (Haque & Rubya, 2023).
Chatbots such as Ash and Youper provide emotional support through text or voice chat.
The apps listen to what clients say, provide insight on the experience through validation, and may respond with encouragement, questions, or a plan for next steps.
Most chatbots use evidence-based techniques such as cognitive behavioral therapy in their interactions (Haque & Rubya, 2023).
Chatbots generally use one of three conversational flows (guided, semi-guided, or open-ended) to address the main mental health concerns of anxiety, depression, and self-care (Haque & Rubya, 2023).
Guided conversations are the most common type used by chatbots; users can reply to prompts only with preset responses.
Semi-guided conversations allow users to either select from predefined options or type in text. The Woebot Health chatbot app uses this form of conversation, allowing clients to share what they are feeling in the present moment.
Chatbots like Elomia offer an open-ended conversation style and create dialogue based on the user’s input. Along with an open-ended conversation, Elomia provides affirmations, articles, books, and movie recommendations based on the conversation history.
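The three conversational flows can be sketched in a few lines of code. The following is a hypothetical, minimal illustration of the distinction (the function names, options, and reply strings are invented for this example, not taken from any real chatbot): a guided flow restricts the user to preset replies, a semi-guided flow accepts either a preset choice or free text, and an open-ended flow accepts anything.

```python
# Hypothetical sketch of the three chatbot conversation flows described
# above; not based on any real chatbot's implementation.

def guided_reply(user_input, preset_options):
    """Guided flow: only preset responses are accepted."""
    if user_input in preset_options:
        return f"You chose: {user_input}"
    return f"Please pick one of: {', '.join(preset_options)}"

def semi_guided_reply(user_input, preset_options):
    """Semi-guided flow: preset options or free text are both accepted."""
    if user_input in preset_options:
        return f"You chose: {user_input}"
    return f"Thanks for sharing: {user_input!r}"

def open_ended_reply(user_input):
    """Open-ended flow: any input continues the dialogue."""
    return f"Tell me more about {user_input!r}."

options = ["anxious", "sad", "okay"]
print(guided_reply("tired", options))       # asks the user to pick a preset
print(semi_guided_reply("tired", options))  # accepts the free text
print(open_ended_reply("my week"))
```

The trade-off the flows represent is safety versus flexibility: preset responses keep the conversation on vetted paths, while free text allows more personal disclosure but gives the system more room to go wrong.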
The draw of 24-7 access in a judgment-free environment is one of the most appealing aspects of chatbots. Crisis support is also available in some chatbots, such as Wysa and Yuna AI. The apps monitor heightened emotional states and guide clients to emergency helplines, support systems, self-help techniques, and breathing exercises in case of a panic attack.
While chatbots are one type of digital app for AI therapy, there are a variety of other apps designed for mental health support and guidance.
1. Headspace
Headspace is a well-known app that focuses on mindfulness and meditation. It offers information on mental health issues and sleep hygiene, guided meditation practices, and suggestions for clients and practitioners alike.
2. Neurofit
Neurofit was created to help clients reset their nervous system. Through daily check-ins, heart rate variability measurements, balance exercises, and coaching sessions, the app aims to help decrease stress, burnout, and overwhelm.
3. Rosebud
Rosebud is an AI-powered journaling app designed to decrease symptoms of depression and anxiety. It promotes consistent self-reflection practices, structured journaling, and AI-assisted dialogue.
4. Mindspa
Mindspa, available on Google Play, is an app designed for self-therapy. It offers courses and articles on psychology topics, mind–body practices, and an AI-assisted chatbot for discussions.
5. Earkick
Some apps offer a comprehensive look at mental, emotional, and physical wellbeing. Earkick, for example, can track reported moods, sleep, exercise, menstrual cycles, heart rate, tasks completed, and environmental factors such as hours worked and time spent with friends.
In addition to recording habits and information, it offers an AI chatbot to provide support and guidance.
6. Calm
Calm offers sleep meditation, breathing exercises, and narrated stories to decrease stress and improve relaxation. The app advertises its ability to decrease levels of burnout and improve sleep, leading to increased happiness and general wellbeing.
Technology in Psychology: Regulations, Privacy, Security & Ethics
Using electronic mental health technology creates concern for personal health information (PHI), privacy, and security.
Research has shown that 40% of paid apps lacked privacy policies, and 40% of AI apps collected traceable data, including names, health information, and financial information (Zhang et al., 2025).
The US Department of Health and Human Services recorded approximately 295 breaches in the health care sector in the first six months of 2023, affecting 39 million individuals (Zhang et al., 2025).
Based on the heightened risks and rapid development of AI technology, it is recommended that a dedicated national governing body be created to establish and update privacy regulations (Zhang et al., 2025).
Maintaining open communication with clients about data processing and their personal rights, and giving them choices (for example, consenting to the audio recording of sessions and the uploading of information), can help build trust in new technology.
Zhang et al. (2025) also suggest ongoing training for data processors and mental health professionals about data privacy, security, protecting PHI, and identifying security threats.
Some of these apps are not developed by trained therapists or clinicians, and their outputs are not necessarily vetted as suitable for mental health use.
AI Companions and the Future
Artificial intelligence is rapidly evolving, and its applications hold tremendous potential in the fields of health care and psychology.
AI technologies could offer more objective definitions of psychiatric illnesses, potentially surpassing the traditional framework of the DSM-5 (Cruz-Gonzalez et al., 2025).
AI technology could advance treatment through multimodal emotion recognition and machine learning, helping to diagnose conditions and provide personalized intervention strategies that adapt to individual patient needs (Cruz-Gonzalez et al., 2025).
Research on the effectiveness and user experience of AI technologies suggests that creating a more adaptable and tailored version of chatbots and apps could improve engagement and clinical efficacy (Haque & Rubya, 2023).
A uniform approach to mobile mental health interventions creates barriers and limitations. Incorporating personality tests or assessing values, beliefs, and daily habits could help individualize AI experiences in mental health treatment.
Future direction for AI might include more interaction between chatbots and other technologies to create seamless interventions between sessions. As AI continues to expand into the daily tasks of clinicians, regulatory bodies will need to work hard to ensure ethical integrity, patient privacy, data security, and informed consent (Haque & Rubya, 2023).
AI-Therapy Tools for Mental Health
The AI-powered journal Mindsera offers prompts, summaries, analysis, and even AI-powered artwork based on how clients express their thoughts and feelings. The journal also tracks habits and provides feedback on patterns it recognizes.
Manifest was created with Gen Z in mind and caters to personal growth and spirituality in a “modern era.” Its push-to-talk feature, advertised as “a friend that is always there to guide” with empathy, helps users uncover emotional patterns and set goals for the future.
Life Planner helps clients track daily habits, finances, schedules/calendars, thoughts, and mood. It offers feedback to measure progress and help clients build healthy routines to decrease stress and improve productivity.
PositivePsychology.com offers a variety of tools and resources that can help professionals and clients bridge the gap between AI and human forms of therapy or coaching.
First, for concrete examples of how AI is being integrated into psychological care, check out our article on the uses of AI in psychology.
Keeping track of which apps or forms of technology clients use and how effective they are can be helpful. This worksheet, Software Usage for Cognitive Remediation Therapy, asks clients (or the therapist) to track software use, but this can be applied to any digital platforms or AI technology. This feedback can be used to adjust treatment plans and monitor progress.
This Session Feedback Form allows clients to provide feedback on how virtual and in-person sessions are going. This form can be adapted to assess the effectiveness of specific AI tools and AI-assisted therapy and reviewed with a therapist or coach to make choices about treatment.
As empathy is often lost in AI-assisted therapy, this worksheet, Creating an Empathy Picture, can help fill the gaps. As clients learn psychological techniques and use AI tools to improve wellbeing, empathy can help them integrate these skills into real-life relationships.
With a growing demand for mental health services, AI can help bridge the gap through psychoeducation, CAI chatbots, and assistance with diagnosis, treatment, and administration. The rising prevalence of anxiety, depression, stress, and burnout, as well as the growing number of vulnerable people with limited (geographic or economic) access to care, make AI technology a promising option.
While AI can tailor interventions to individual needs, the integration into clinical practice must be approached with caution.
AI-assisted therapy should never replace trained clinicians, and best practices suggest these tools should always be monitored by a health care professional.
Ethical safeguards, awareness of bias, and a commitment to preserving authentic human connection are essential to ensure that technology enhances — rather than diminishes — the quality of care. In this way, AI can serve as a valuable complement to clinicians, bridging gaps while keeping humanity at the center of mental health support.
AI is shifting the landscape of mental health care and is being implemented in treatment programs. It has the potential to streamline administrative work and assist with diagnosis, homework exercises, and support between sessions, but it does not offer the empathy and human connection that therapists provide.
Can ChatGPT be a therapist?
It is dangerous to assume that ChatGPT can act as a therapist. ChatGPT is a programmed model with bias and faults. Media reports have highlighted cases where individuals relied solely on AI tools during mental health crises, with tragic outcomes (Booth, 2025).
Is AI therapy FDA approved?
The majority of AI mental health tools, including AI therapy, are not FDA approved and are considered “wellness aids” rather than clinical treatments. The FDA’s stance is that therapists and clinicians continue to play a critical role in treating mental health disorders.
References
Cruz-Gonzalez, P., Wan-Jai, A., Lam, E., Ching, I. M., Wingman, M., Hou, R., & Ngai-Man, J. (2025). Artificial intelligence in mental health care: A systematic review of diagnosis, monitoring, and intervention applications. Psychological Medicine, 55, Article e18. https://doi.org/10.1017/S0033291724003295
Hadar-Shoval, D., Asraf, K., & Mizrachi, Y. (2024). Assessing the alignment of large language models with human values for mental health integration. Journal of Medical Internet Research, 10, 196–205.
Haque, R., & Rubya, S. (2023). An overview of chatbot-based mobile mental health apps: Insights from app description and user reviews. JMIR mHealth and uHealth, 11, Article e44838. https://doi.org/10.2196/44838
Humayun, A., Madwana, A., Jassan, A., & Mahmud, A. (2025). Artificial intelligence as a predictive tool for mental health status: Insights from a systematic review and meta-analysis. PLoS ONE, 20(9), 1–14. https://doi.org/10.1371/journal.pone.0332207
Sedlakova, J., & Trachsel, M. (2023). Conversational artificial intelligence in psychotherapy: A new therapeutic tool or agent? The American Journal of Bioethics, 23(5), 4–13. https://doi.org/10.1080/15265161.2022.2048739
Weber-Guskar, E. (2021). How to feel about emotionalized artificial intelligence? When robot pets, holograms, and chatbots become affective partners. Ethics and Information Technology, 23, 601–610. https://doi.org/10.1007/s10676-021-09598-8
Zhang, H., Mao, Y., Lin, Y., & Zhang, D. (2025). E-mental health in the age of AI: Data safety, privacy regulations and recommendations. Alpha Psychiatry, 26(3), 44279. https://doi.org/10.31083/AP44279
Zhou, S., Zhao, J., & Zhang, L. (2022). Application of artificial intelligence on psychological interventions and diagnosis. Frontiers in Psychiatry, 13, Article 811665. https://doi.org/10.3389/fpsyt.2022.811665
About the author
Dr. Melissa Madeson, Ph.D., believes in a holistic approach to mental health and wellness and uses a person-centered approach when working with clients.
Currently in full-time private practice, she uses her experience with performance psychology, teaching, and designing collegiate wellness courses and yoga therapy to address a range of specific client needs.