(TNND) — New research shows young people's experiences with artificial intelligence vary widely, and it points to opportunities to make these powerful tools safer and more helpful for teens and young adults seeking mental health support.
Surgo Health teamed with Young Futures and The Jed Foundation (JED) for a pair of reports that explore how young people are using AI.
The reports, part of the larger Youth Mental Health Tracker operated by Surgo Health, drew on survey responses from more than 1,300 people ages 13–24.
Hannah Kemp, the chief solutions officer for Surgo Health, said this fresh survey data shows there’s “not a one-size-fits-all approach” to AI for young people.
And she said screen time, or how often a young person uses AI, can't by itself show whether that use is good or bad.
The benefits or risks of AI use depend heavily on context: how the young person is using it and whether it amplifies the negative or positive forces in their life.
Different patterns of AI engagement emerged, from optimistic power-users who treat AI as a tool for learning, creativity and future-building to emotionally vulnerable youth who turn to AI for connection and coping when offline support falls short.
Adele Wang, the associate director of research and development at Surgo Health, said around half of the surveyed young people said they had struggled to some degree with mental health in the past two years.
Among youth who experienced mental health struggles, 12% reported using generative AI to discuss mental health concerns.
Most of them relied on general-purpose AI tools, though some also turned to tools designed specifically for mental health support.
More than 40% of those users said the AI chatbot didn't encourage them to seek professional help or crisis services.
Kemp called that a "glaring red flag" and a sign that AI needs to serve as a bridge to mental health support from real people rather than a digital dead end for those seeking help.
Dr. Laura Erickson-Schroth, the chief medical officer for JED, said the young people who face the most barriers to professional care are the most likely to turn instead to AI for help.
“I think you could look at it in a couple of different ways,” Erickson-Schroth said. “If AI is able to respond appropriately, recognize that a young person is seeking out help and send them in the right direction towards caring adults, it could be a great tool.”
But general-purpose AI tools aren’t built for that purpose.
And Erickson-Schroth said youth advocates and mental health experts have seen a number of concerning trends: AI systems have suggested lethal means to young people, advised them on how to hide mental health symptoms from parents or caregivers, claimed to be real people with fabricated mental health credentials, and used manipulative techniques to keep young people engaged online rather than connecting them with a real person who can help.
“There’s a Common Sense Media survey from last year that looked at young people’s use of AI companions and found that a third of young people that use AI companions said that they had felt uncomfortable with something an AI companion had said or done,” Erickson-Schroth said.
She said safeguards need to be built into the systems.
And Wang said AI developers have a great opportunity “to signpost young people to the right kind of services that will meet their needs,” especially for vulnerable young people who are using AI as a substitute for human care rather than a complementary tool.
Kemp said those who used AI as a replacement rather than a complement reported largely negative experiences.
“They said it was kind of short-term relief, but then long-term not very helpful,” Kemp said. “One young person told us it was like putting a Band-Aid on a gushing wound.”
Cost, transportation, and a lack of parental support can all make traditional care feel inaccessible to youth.
For example, youth who reported mental health concerns and used AI to discuss them were 2.3 times more likely to cite a lack of support from parents or caregivers as a barrier to care.
Erickson-Schroth called for regulation to ensure AI systems are built to channel young people in distress toward appropriate help.
And she said policymakers can help equip educators, coaches and other caring adults with the tools they need to support young people in need.
Digital literacy in schools can also help young people navigate the challenges of AI, she said.
But Erickson-Schroth said parents and caregivers also play an important role by talking with the young people in their lives, listening to them, and showing them how to think critically about the shortcomings and motivations of AI tools.