Research: 1 in 8 U.S. adolescents and young adults use AI chatbots for mental health

A quiet shift is underway in how young people seek comfort when their minds feel heavy. A new national study in JAMA Network Open finds that about 1 in 8 adolescents and young adults in the United States already use AI chatbots for mental health guidance. Many lean on tools like ChatGPT when sadness, anger, or worry creep in. As one researcher put it, "I find those rates remarkably high."

This is not a story about the distant future. It is happening right now, in bedrooms, on school buses, and through glowing screens late at night.

How does it work?

Researchers surveyed 1,058 people aged 12 to 21 during February and March of 2025. Their questions were simple. Do young people use generative AI for help when they feel overwhelmed? How often? Do they find the advice useful?

The answers paint a clear picture:

  • About one in eight respondents used AI chatbots for mental health advice.
  • Among those users, two out of three engaged with the tools at least once a month.
  • More than 93 percent said the advice felt helpful.
  • Usage climbed among young adults, where roughly one in five respondents aged 18 to 21 reported using large language models for support.

These tools draw people in because they are free, immediate, and private. They feel like a whisper behind a closed door. No appointments. No waiting rooms. No judgment. As lead researcher Jonathan Cantor noted, "There are few standardized benchmarks for evaluating mental health advice offered by AI chatbots," yet young people still flock to them.

Why does it matter?

The United States remains in the thick of a youth mental health crisis. Nearly one in five adolescents has experienced a major depressive episode in the past year, and almost 40 percent received no mental health care at all. Against that backdrop, it makes sense that young people might reach for whatever feels available.

But availability is not the same as safety. Researchers warn that while many young people say chatbot advice helps, we still do not know much about the quality of that guidance. "Obviously the key question is how can LLMs be most helpful but at the same time limit their harm," said one of the coauthors. Their worry is not abstract. OpenAI currently faces several lawsuits claiming that ChatGPT contributed to delusions and even suicide.

The study also uncovered racial gaps. Black respondents were less likely to say chatbot advice was helpful, suggesting possible shortcomings in these systems' cultural understanding.

The context

For years, experts suspected that young people were quietly using AI tools to talk through their emotions. The proof was thin. Then came this study, led by Jonathan Cantor at RAND, with collaborators from the Brown University School of Public Health and Harvard Medical School. As Ateev Mehrotra explained, "There has been a lot of discussion that adolescents were using ChatGPT for mental health advice, but to our knowledge, no one had ever quantified how common this was."

Now we know. And the numbers are not modest. They suggest a world where teenagers and young adults slip between human support and machine guidance as easily as they move between apps. The survey did not track whether the advice was tied to diagnosed mental illness, which leaves big questions about how vulnerable groups are affected.

Still, the message is loud and clear. Youth are not waiting for guardrails before they use generative AI for emotional support. As Mehrotra put it with some urgency, this research "changes my thinking from adolescents might use AI in the future and emphasizes this is already extremely common."
