The Dangers of Turning to AI for Mental Health Support


Chatbots May Be Convenient, But They're No Substitute for Friends or Mental Health Professionals

It’s late at night. You should be sleeping, but the crushing weight of anxiety is keeping you awake. You open ChatGPT for a safe space to vent. It validates your feelings. And you feel surprisingly better.

If you recognize yourself in this scenario, you’re not alone. Twenty-two percent of Americans said they have used a mental health chatbot, according to a survey conducted by Woebot Health — and 47% would be interested in using an AI therapist.

The top reason for using AI for mental health support? Receiving empathetic, non-judgmental support. “Many people aren’t comfortable being vulnerable with another human. Talking to AI feels safer,” says Citlali Herrera, founder and psychotherapist at Inner Worlds Psychotherapy. The cost of therapy is another factor, adds Herrera, as is the fact that AI is available instantly.


But while AI can offer an immediate — and soothing — response, relying on it for emotional support is a slippery slope. From stunting your growth to exposing your data, here are some of the risks of treating AI like a therapist.


It Can Reinforce Isolation


Healing happens in connection and relationships, not in isolation. The more you confide in a chatbot, the more you risk isolating yourself. This habit can reinforce the belief that you have to handle everything on your own, which is particularly harmful for men, says Emma Giordano, LMHC, at Empower Your Mind Therapy:

“Men are frequently judged for expressing their emotions and being vulnerable with others. AI will never be able to give you the real-life experiences necessary to truly help you learn that there are people in the world who will understand you and support you unconditionally.”


It Can Become an Echo Chamber


AI can become an echo chamber, adds Herrera — one that validates your pain on demand but doesn’t challenge your beliefs or help you unpack deeper patterns. It can if you prompt it, of course, but part of the point of therapy is shining a light on blind spots you don’t know to ask about.

“AI can only help you based on what you share, and part of therapy is about helping you understand things about yourself and your life that you are not yet aware of,” adds Giordano.

Plus, as Veronica Shelton, AI expert, co-founder and head of creative and technology at Oak Theory, notes, if your emotional literacy is low to begin with, you might not even know what to ask.


It Can Delay Necessary Professional Help


Turning to AI for mental health support can even put your mental health at risk. According to Herrera, people often use AI to research a diagnosis, something to name their pain. Without a trained clinician, this can easily lead to misdiagnosis and delay real help.

Therapists are trained to assess risk and safety. They can “catch the nuances of trauma, suicidality and abuse,” says Giordano. A delay in obtaining professional help could have devastating consequences in some cases.


It Can’t Observe Non-Verbal Cues


Your body language, facial expressions and tone reveal a lot. A skilled therapist picks up on them and uses those insights to support you better. “These forms of human behavior are essential in therapeutic treatment when a person is unwilling to share what they are thinking or is unaware of what is happening in their mind and body,” adds Giordano. As Shelton puts it, AI “can walk with you through your thoughts, but it can’t sit in your silence.”


It Can Encourage Emotional Bypassing


Shelton, who uses AI to reflect and self-regulate, also wants men to know that it’s easy to fall into the trap of intellectualizing your emotions instead of actually processing them: “You become a master of talking about your feelings without ever feeling them.”

It seems productive, but you’re not actually working through your feelings, which means they stay lodged in your body, unresolved. Over time, that emotional avoidance can lead to other problems, from chronic stress and sleep issues to feeling disconnected from yourself and others.


It Can Stunt Your Growth


Using AI for emotional support can stunt your growth. According to Giordano, it can hurt your ability to communicate and interact with others, which has a negative impact on your relationships. “Humans do not respond like AI. AI ‘companions’ will never initiate a conflict, and will apologize and change course the moment you express your discontent,” says Giordano.

The kicker? Resolving conflict is a necessary life skill. When you learn to do it productively, it can strengthen your relationships. On the other hand, when you become accustomed to talking to a chatbot that never disagrees with you, you lack the tools to cope with interpersonal challenges.


It Can Make Your Data Vulnerable


It’s easy to feel like you’re in a private bubble when it’s just you and the screen, but that sense of confidentiality is an illusion. Do you really want ChatGPT to know your deepest, darkest secrets? Ultimately, everything you share is data — and the moment you hit that enter button, you lose control over where it goes or how it’s used.

AI can be a helpful tool for self-reflection. But it’s important to treat it as such. It might feel like a safe space, but using it as a replacement for therapy comes with dangers, including keeping you emotionally stuck.
