Schools Implement AI Counselors to Monitor Students’ Mental Health
As hundreds of schools adopt automated monitoring tools, educators report that students sometimes find it easier to communicate with chatbots than with human counselors.
Produced in partnership with EdSurge
Brittani Phillips, a middle school counselor in Putnam County, Florida, regularly checks her phone for alerts from an artificial intelligence-enabled therapy platform used by students outside school hours. This platform flags when a student may be at risk of harming themselves or others based on their chat inputs.
Phillips recently received a “severe” alert concerning an eighth-grade student. She spent her evening on the phone with the student’s mother, assessing the student’s vulnerability, and also contacted the police. Phillips informs students that their chats are confidential unless safety concerns arise.
This incident occurred last spring. “He’s alive and well,” Phillips reports. “He’s in ninth grade this year.” She believes the interaction fostered trust between her and the family, noting that the student now greets her in the hall.
Facing budget constraints and limited mental health staff, Interlachen Jr-Sr High School, where Phillips works, utilizes an AI platform to evaluate students’ mental health needs.
Phillips’s district has employed Alongside, an automated student monitoring system, for three years. This platform exemplifies a growing category of tools marketed to K-12 schools for mental health monitoring, with at least nine companies securing funding since 2022.
Alongside states that over 200 schools across the US use its tool. The company claims its platform provides superior services compared to typical telehealth options by incorporating a social and emotional skill-building chat tool. Students interact with a llama character named Kiwi, who encourages resilience through conversations about life challenges. The AI-generated content is monitored by clinicians. Company representatives emphasize that the system offers resource-limited schools access to critical mental health support.
AI has been a significant component of the Trump administration’s agenda. However, some parents, educators, and lawmakers express caution regarding increased AI use among teens. Several states have begun legislative efforts addressing these concerns.
Experts and families worry about students forming strong attachments to AI. A recent national survey found that 20% of high schoolers have formed emotional connections with AI. There is considerable interest in preventing students from developing emotional dependencies on bots, including proposed federal legislation requiring AI companies to limit such interactions.
Phillips views the tool used in her school as effective in addressing minor issues. Supporting approximately 360 middle school students, she finds that the AI helps manage everyday problems like breakups, allowing her to focus on students nearing crisis. She also notes that some students find it easier to discuss emotional issues with AI.
On the Digital Couch
Apprehension about opening up to adults is part of why students feel comfortable confiding in AI tools, according to school counselors.
Speaking with a mental health professional can be intimidating, especially for adolescents, explains Sarah Caliboso-Soto, a licensed clinical social worker and assistant director of clinical programs at the University of Southern California Suzanne Dworak-Peck School of Social Work. She also directs the school’s trauma recovery center and telebehavioral health online clinic.
There is a generational aspect as well. Students accustomed to chat interfaces on social media and websites may find AI interfaces familiar. Linda Charmaraman, director of the Youth, Media & Wellbeing Research Lab at Wellesley Centers for Women, notes that texting is often easier for kids than calling.
Using AI to process emotions allows students to avoid observing facial expressions, which they may fear carry judgment. Chatbots are also available outside typical appointment hours, providing immediate access without scheduling hassles, Charmaraman adds.
“It’s almost more natural than interacting with another human being,” says Caliboso-Soto.
Caliboso-Soto has observed an increase in telehealth and chat line usage. Although her clinic does not use AI, it is frequently approached by companies seeking to integrate AI as notetakers in therapy sessions.
She considers AI potentially beneficial as a “first line of defense” in resource-limited schools, regularly checking in with students and directing them to further help when necessary.
Alongside’s services start at approximately $10 per student annually, with volume discounts for larger districts.
However, Caliboso-Soto cautions against using AI as a substitute for human counselors. AI lacks the nuanced discernment clinicians provide, such as interpreting vocal inflections and body language, and may miss subtle observations.
“You can’t replace human connection, human judgment,” she emphasizes.
Charmaraman agrees that while AI can expedite diagnostics and free counselor time, overreliance is risky. AI may overlook nuances and provide unrealistic positive reinforcement. She advocates for a holistic approach involving families and caregivers.
Caliboso-Soto also warns that increased AI filtering of serious cases might reduce students’ contact with trained human professionals.
Alongside representatives clarify that their platform is not intended to replace human therapy. Ava Shropshire, a junior at Washington University and youth adviser for Alongside, states the app serves as a gateway to adult help and normalizes mental health and social-emotional learning for students.
Nonetheless, some students view AI counseling as a temporary fix.
Social Accountability
Sam Hiner, executive director of Young People’s Alliance, a North Carolina organization advocating for youth political participation, questions the current social climate:
“Can you think of another time in history when people have been so lonely, when our communities have been so weak?”
Economic upheaval, technology, and social media have contributed to student isolation, creating a strong desire for community and belonging, Hiner explains.
Students seek connection wherever possible, including through AI platforms like ChatGPT.
Young People’s Alliance has released guidelines permitting some therapeutic uses of AI technology.
However, the organization aims to rebuild human community and opposes AI use that threatens human companionship. Hiner states,
“That’s a critical aspect of therapy and of living a fulfilled life and having social connection and having mental wellbeing.”
Hiner’s primary concern is the development of “parasocial relationships,” where students form one-sided emotional attachments to AI, particularly in therapeutic contexts. He suggests AI should avoid implying it has emotions, such as saying “I’m proud of you,” to prevent fostering attachment.
While platforms claim to reduce loneliness, they often do not measure whether users become more connected or lead fulfilled lives. Hiner notes,
“All [tech platforms are] measuring is whether this bot is serving as an effective crutch for the immediate feelings of loneliness that they’re experiencing.”
Advocates seek to prevent bots from diminishing social skills by diverting people from human relationships, Hiner adds.
Pushing Boundaries
Privacy experts highlight that conversations with AI chatbots generally lack the privacy protections afforded to licensed therapist interactions. The use of these tools raises complex privacy concerns, especially when student safety and police involvement are factors, even under clinical supervision, according to a privacy law expert.
Both the company and Phillips emphasize the necessity of human oversight for these systems to be effective. Phillips considers this tool an improvement over previous district monitoring systems that directed students toward disciplinary actions rather than mental health support.
As of February this school year, Phillips noted 19 “severe” alerts from the AI tool among 393 active users. The company does not specify whether multiple alerts come from the same students; Phillips observes some students generate multiple alerts.
Phillips has learned that interpreting teenage humor requires human judgment. Some middle school boys test the system by inputting false reports, such as “my uncle touches me” or “my mom beat me with a pole,” to see if adults respond.
These students often seek attention or are amused by the reaction. Phillips assesses their body language to determine if the comments are genuine. If a student appears remorseful, the matter is addressed privately; if not, she contacts parents. Even in false cases, Phillips feels she has more options than other systems that might impose suspension.
By monitoring interactions, Phillips believes students learn she is attentive, which reduces such testing behavior over the course of each year.
This article was produced in partnership with EdSurge, a nonprofit newsroom covering education through original journalism and research. Readers can sign up for its newsletter.