Mental health center in Jeddah unveils robot to support addiction recovery

In Jeddah, a new kind of healer just joined the fight against addiction — and he's clad in traditional Saudi attire. Meet Raseen, a humanoid robot unveiled by the Eradah Mental Health Complex. Unlike the clunky automatons that shuffle trays or fetch coffee, Raseen is built for something far more intimate: listening, teaching, and supporting people on their recovery journeys. His very name comes from the Arabic word for sobriety, a nod to balance — of body, mind, society, and spirit.
As Dr. Khalid Al-Oufi, Eradah's general supervisor, psychiatrist, and head of the addiction division at the Saudi Psychiatric Association, put it:
"Raseen participates in international days, forums and conferences related to mental health and addiction... he privately listens to people's concerns about mental health and addiction and provides clear, detailed responses."
How does it work?
At first glance, Raseen could pass for a well-dressed host at a community event. But behind the robes lies a digital assistant powered by advanced AI algorithms. He doesn't just spit out canned phrases. Instead:
- He listens to questions, processes them, and replies with scientifically accurate, contextually relevant answers.
- He educates patients during therapy sessions, explaining complex topics like the causes of addiction, relapse triggers, and available treatments.
- He ventures into schools, sports clubs, and even walking tracks, spreading awareness and offering one-on-one private chats about mental health.
His range goes beyond patients. Within hospital walls, Raseen briefs staff on their rights, workplace policies, and even Saudi Arabia's Vision 2030 transformation. He's not just a helper — he's a trainer, an educator, and, in some ways, an ambassador for digital health.
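No technical details of Raseen's software have been made public, so purely as an illustration of the listen, process, respond flow described above, here is a minimal sketch of how a generic conversational assistant could be wired together. Every function name and canned answer below is a hypothetical stand-in, not Raseen's actual design.

```python
# Purely illustrative sketch of a listen -> process -> respond loop for a
# conversational assistant. These stubs are NOT Raseen's published design;
# the function names and canned responses are hypothetical placeholders.

def transcribe(audio: bytes) -> str:
    """Speech-to-text stub; a real robot would call an ASR engine here."""
    return "What usually triggers a relapse?"


def answer_question(question: str) -> str:
    """Answer-generation stub; a real deployment would query a clinically
    vetted knowledge base or a constrained language model."""
    responses = {
        "What usually triggers a relapse?": (
            "Common triggers include stress, social pressure, and returning "
            "to places linked with past use. A clinician can help you build "
            "a personal coping plan."
        ),
    }
    return responses.get(question, "Let me connect you with a specialist.")


def speak(text: str) -> None:
    """Text-to-speech stub; a real robot would synthesize and play audio."""
    print(f"Robot: {text}")


def conversation_turn(audio: bytes) -> None:
    question = transcribe(audio)        # 1. Listen: convert speech to text
    answer = answer_question(question)  # 2. Process: select a vetted answer
    speak(answer)                       # 3. Respond: deliver it back


if __name__ == "__main__":
    conversation_turn(b"")  # placeholder audio input
```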
Why does it matter?
Mental health care often faces two hurdles: access and stigma. A robot in traditional dress that talks openly about psychiatric services? That's a bridge. People who might shy away from a human counselor could lean into a conversation with Raseen, testing the waters before seeking formal care.
But there's more. The National Centre for Mental Health Promotion (NCMHP) underscores both the promise and the pitfalls. AI, they argue, could supercharge diagnosis and treatment. But overreliance risks eroding the very thing therapy depends on — human connection. And then there's the tightrope of ethics:
- Privacy: Safeguarding sensitive patient data is non-negotiable.
- Bias: Algorithms can carry hidden prejudices, skewing outcomes.
- Trust: A single misstep could spark widespread skepticism.
As NCMHP experts stress, "striking a careful and ethical balance between the benefits and challenges of AI" is the only way forward.
The context
Saudi Arabia's health sector is in the midst of a sweeping transformation under Vision 2030. The push is toward digital health, efficiency, and broader access to care. Raseen slots neatly into that strategy — a symbol of ambition and a test case for how far AI can go in mental health.
Globally, the field is racing ahead. AI-driven mental health apps and chatbots are multiplying, with forecasts suggesting healthcare-focused AI tools will expand fivefold by 2035. For some, that's thrilling. For others, unsettling. Robots like Raseen sit right at the intersection of hope and hesitation.
And yet, there's something quietly powerful here: a machine, draped in familiar garments, reminding a community that asking for help is nothing to be ashamed of. In a country where conversations about mental health are only recently stepping out of the shadows, that symbolism may matter just as much as the technology itself.
