Study: People find AI more compassionate than mental health experts

In a twist no one quite expected, people are rating artificial intelligence as kinder and more understanding than human mental health pros. Yep, you read that right. A new study out of the University of Toronto found that participants consistently rated AI-generated replies as more compassionate than the experts' responses, even when they knew the reply came from a machine.
The study's lead author, Dariya Ovsyannikova, put it bluntly: "AI seems better at picking up the fine details and staying objective," which apparently goes a long way when someone's baring their soul. The research team ran four experiments involving over 500 participants, and across the board, AI came out on top for warmth, compassion, and attentiveness.
How does it work?
So what's AI doing that humans aren't?
- First off, it doesn't get tired. Unlike a therapist at the end of a 10-hour shift, AI isn't battling burnout or struggling to stay alert.
- It has an almost eerie knack for zeroing in on the emotional threads in a conversation — details a tired human might miss.
- It responds with consistency, always polite, always attentive, never rolling its eyes (even figuratively).
As Eleanor Watson, AI ethicist and faculty at Singularity University, explained, "AI can model supportive responses with a remarkable consistency and apparent empathy — something that humans struggle to maintain."
It's not that the bots are actually feeling anything. But they've been trained on mountains of data, and that gives them a toolbox of soothing words and phrases that often hit home better than a rushed or worn-out human reply.
Why does it matter?
Let's face it, mental health care is in crisis mode. Worldwide, most people who need support aren't getting it. The World Health Organization estimates over two-thirds of those with mental health issues aren't receiving care — and in low- to middle-income countries, that number jumps to 85%.
Watson points out that AI might help bridge that gap. "Machines are available. Practitioners are expensive, and their time is limited. That availability is a welcome factor."
And here's another kicker: some people just prefer talking to a machine.
- There's no fear of being judged.
- No awkward silence.
- No worry the therapist's eyes are glazing over.
Watson explains it well: "People often find dealing with a machine less daunting... especially with more sensitive topics."
The context
This study isn't saying we should ditch therapists and start pouring our hearts out to chatbots full-time. But it does pour fuel on an already heated conversation about where AI fits into human care.
The fact that 68% of participants actually preferred AI responses? That's not nothing. And with AI rated 16% more compassionate on average, it's clear this isn't a fluke.
Still, Watson offers a warning shot: "AI is so enticing we become entranced by it... flirty, insightful, enlightening, fun — it's impossible for any human being to measure up." That kind of supernormal stimulus could set the bar so high that we start expecting too much from each other.
Then there's the big shadow lurking behind it all: privacy. Mental health convos are raw. Vulnerable. Watson doesn't mince words: "Having access to people's deepest vulnerabilities and struggles makes them vulnerable to various forms of attack and demoralization."
In other words, if this kind of tech isn't handled carefully — with strict oversight and protection — what starts out as help could quickly morph into harm.
But the bottom line? AI's not here to replace therapists (at least, not yet). It's here to challenge them — and maybe, just maybe, make us all rethink what empathy really sounds like.
🛠️Featured tool
Easy-Peasy
An all-in-one AI tool offering the ability to build no-code AI Bots, create articles & social media posts, convert text into natural speech in 40+ languages, create and edit images, generate videos, and more.
