AI and Therapy: What’s Helpful and What’s Concerning
Like it or not, artificial intelligence (AI) is everywhere these days. It's rapidly becoming integrated into our daily lives, and mental health care is no exception. From people using AI-generated journal prompts to confiding in therapy chatbots, we're in the early stages of exploring how digital tools can support emotional well-being.
Let's be clear: some of these tools can be helpful. However, it's equally important to be mindful of the very real risks.
At Boreal Therapy Collective, we pride ourselves on offering evidence-based, dynamic support to a range of populations, including children, teens, adults, couples, and families. Across these demographics, we’re noticing an increasing number of people bringing up AI in sessions. While we’re not anti-technology, we believe it’s essential to separate clinical reality from digital hype. All mental health support is nuanced, and AI, despite its undeniable speed and accessibility, can’t fully replace the complexity of therapeutic relationships.
Where AI Can Help: Coping Skills, Structure, and Psychoeducation
AI is likely here to stay, and it will continue to shape mental health treatment. You may already be experimenting with AI to learn more about various symptoms, or, if you're in therapy, using it to supplement your treatment. We believe AI can offer the following benefits:
Coping Skills
If you're seeking straightforward techniques, like breathing exercises, emotion regulation strategies, or guidance on setting boundaries, AI can often provide helpful suggestions. It can offer clear lists of coping tools and examples of cognitive reframing, drawing from established therapeutic approaches such as cognitive behavioural therapy (CBT) and dialectical behaviour therapy (DBT). Because these methods are well-documented, AI can efficiently deliver practical resources. This makes it a useful starting point for individuals looking to build basic skills or manage everyday stressors.
Organizing Thoughts or Journaling
Many people use AI as a kind of journaling companion. You can prompt it to help identify cognitive distortions, write about emotional patterns, debrief shameful struggles, or reflect on what’s going well. You can even ask it to generate gratitude prompts or encouragement quotes based on your mood. For some, this can enhance self-awareness or keep them feeling emotionally engaged between sessions.
Understanding the Basics of Mental Health
Do you want to know more about the window of tolerance? Are you looking for a simplified explanation of trauma bonding? Do you wish to understand more about the subtle symptoms of ADHD?
AI can summarize this information in seconds, drawing on vast amounts of published material. It can also adjust tone, rewrite in simpler language, or translate into different languages. That kind of educational support can be genuinely useful, especially if you're trying to better understand certain mental health conditions.
AI and Therapy Risks: The Illusion of Empathy and the Limits of Language
No matter how human-like technology becomes, it's important to remember that AI has no lived experience or formal education. It doesn't have a nervous system or intuition, and it certainly hasn't sat with another human being and felt their pain. So even when it mimics therapeutic language, it isn't actually doing what therapists do.
Here are some of the main problems that can arise when people use AI for therapy:
It Can Validate in a Misattuned Way
If you want to consistently feel good about yourself or your choices, AI is an easy source of continuous praise. If you say that you feel hurt, for example, AI will likely affirm your feelings and validate why you feel hurt.
While this may seem helpful (we can all use some compassion!), it can quickly become problematic. For example, it might agree that you should "cut off" a friend for one misstep or "never trust anyone again" after a slight disappointment. While we all need validation, we also need nuance, insight, and accountability. Otherwise, we're simply being enabled, and that doesn't build the resilience needed for change. Good therapy aims to balance all of these delicate needs.
It Can Reinforce Black-and-White Thinking
Because AI tries to be agreeable and responsive, it sometimes reinforces all-or-nothing thoughts. It may offer extreme solutions or fail to challenge cognitive distortions. And, if you're dysregulated, this kind of interaction can amplify reactivity instead of calming it.
Unlike trained therapists, AI can't read emotional context or pick up on subtle relational cues. For example, if you're venting about a challenging conflict, AI might offer soothing words without helping you reflect on your own role or consider healthier communication strategies. This creates the illusion of support, but it doesn't always promote self-awareness or long-term growth.
It Doesn’t Know When You’re in Crisis
AI therapy is not trauma-informed, no matter how convincing it sounds. It can't track your body language, pick up on your tone, or notice when your eyes glaze over. It doesn't hear your voice tremble or recognize when you're starting to spiral or shut down. It won't necessarily pick up on the risk associated with self-harm or the intensity of a visceral hallucination. In fact, AI tools have repeatedly come under fire for failing to respond appropriately to hallucinations, delusions, and other crisis symptoms.
Those subtle crisis cues, which are so essential in therapy, aren't part of the AI equation, and they can't be. While AI might suggest calling a crisis line or taking a deep breath, it lacks the human intuition needed to respond to mental health symptoms as they unfold.
It Can't Offer a True Connection
Technology gives us more opportunities for connection than ever before. And yet so many people feel profoundly disconnected from the world around them.
While therapy sometimes involves offering skills and solutions, the bulk of it isn't just talk. It's about processing, witnessing, attunement, safety, new perspectives, and relational care.
While it’s true that machine learning can simulate tone, it can't share space with you in the same way a person can. It will never have that ability. It can never authentically pause to ask, “Are you really okay?” in a way that feels attuned or safe. This absence of emotional presence makes a real difference, especially for people navigating deep pain or instability.
Therapy for Depression, Anxiety, and Other Mental Health Concerns
The rise of AI is pushing the field of therapy to evolve. At Boreal Therapy Collective, we don’t believe in fearmongering or pretending AI doesn’t exist. But we also believe that mental health work is sacred, relational, and deeply embodied. No chatbot can replace that.
If you’re struggling with your mental health, we would be honoured to bring our expertise and our humanness to support you. We specialize in therapy for depression, anxiety disorders, trauma, and more. Contact us today to book your initial assessment.
And, if private therapy is not an option, please check out our blog post detailing low-cost and no-cost therapy options in Fort McMurray.
