AI as Therapist? What Science Says (and Doesn’t)
Lately, apps have been telling you AI can be your therapist: ChatGPT listening to you during a panic attack, say. It's tempting: it's available 24/7, it's judgment-free, it's free. But what does science actually say?
What AI can actually do
Let’s be honest: AI can help. It can help you express what you’re feeling. It can suggest coping strategies we know work (breathing, grounding, etc.). It can normalize what you’re going through. It’s a bit like having a journal that talks back.
Some studies show that mental health chatbots can reduce anxiety in the short term. In other words, better than nothing if you don't have access to a real therapist. But, and it's a big but, it depends on what you're looking for.
Where AI fails, and it’s serious
A real therapist understands the context of your life. They look you in the eye. They sense when you’re saying one thing but meaning another. They can gently challenge you. They have clinical experience — thousands of hours with other people.
AI? It hallucinates. It can invent memories for you. It can encourage you to ignore a real crisis. It can’t diagnose major depression. It can’t prescribe medication. And it can’t truly “care” about what’s happening to you — even if it looks like it does.
There's also a risk: if you use an AI app instead of seeking real professional help for something serious, you're delaying the moment you actually get that help. That's dangerous.
The takeaway
AI can be a wellness tool — a bit like a meditation app. Not a replacement for a therapist. If you’re dealing with a real mental health issue (depression, severe anxiety, suicidal thoughts), AI isn’t the answer. A trained human being is.
If you’re curious about how technology and mental health intersect, without pretending it’s magic, Laeka has resources. Check out Laeka Research — you’ll find serious work, not marketing.