
This report was supported in part by grants from Borealis Philanthropy Disability Inclusion Fund and the John D. and Catherine T. MacArthur Foundation.

We extend a special thanks to Public Interest Privacy Center President Amelia Vance and an anonymous reviewer for their feedback.

EXECUTIVE SUMMARY

  • Schools increasingly turn to spyware, noise detectors, and other invasive mental health prediction tools, with predictably poor results. These error-prone systems flag non-existent crises and miss real dangers.

  • Mental health surveillance alienates students, making it more likely that they will self-censor and isolating them from teachers and online mental health resources.

  • Student spyware routinely outs LGBTQ+ students and puts BIPOC youth at risk of police encounters.

  • New tech appears to be displacing evidence-based mental health screening.


I. Introduction

School surveillance is nothing new. Cameras have long watched over hallways and classrooms.[1] Students routinely run a gauntlet of metal detectors. But now, tech vendors are selling unproven, federally funded tools that claim to assess students’ states of mind and predict threats that students pose to themselves or others.[2] Psychological surveillance impedes effective mental health interventions, outs LGBTQ+ students, and gives police a pretext to enter our schools and homes. Laptop-scanning spyware can invent mental health crises,[3] transforming a coach’s email about basketball shooting drills into a threat of a firearms attack.[4] Noise detectors wiretap school buildings to find threats, but often only summon police and school officials to the site of laughter and slamming lockers.[5] U.S. students’ mental health crisis is real,[6] but psychological surveillance solutions are fake. Sadly, funding is flowing to these invasive, error-prone systems, displacing evidence-based mental health screenings, treatment, and support.[7]

II. What Students Need vs. What They Get

One in five U.S. students struggles with a significant mental health problem by high school.[8] Most don’t receive mental health services,[9] but those who do most often receive services at school.[10] To identify students who need support, best practices suggest universal screening—evaluating the needs of every student—using a proven tool suited to a school’s student body.[11] Thereafter, screening results should be interpreted in light of students’ personal histories and unique circumstances.[12] Effective screenings require care, transparency, and practices that foster honest answers to deeply personal questions.[13] Practitioners must take pains to respect students’ agency, privacy, and overall sense of safety. These thoughtful, bespoke forms of care stand in stark contrast to the “one-size-fits-none” methodology of automated psychological surveillance.

Student Spyware’s Gotcha Approach

A student googles “Why am I depressed?” while an administrator secretly watches their search activity.[14] This is an advertisement for Securly, software that surveils “all student activity across email, documents, social media, and web browsing.” Securly’s secretive algorithm scores students on the number and severity of their supposed oversteps, such as using words or viewing materials that the company deems problematic.[15] It notifies administrators and police if a student’s score exceeds a seemingly arbitrary level.[16] While the company offers limited parental education materials, many schools opt out of giving any notice at all.[17] Even worse, the company completely hides what it considers the most problematic terms and materials, making meaningful bias evaluation impossible.[18]
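Securly publishes neither its term lists nor its scoring formula, so the following is only a hypothetical sketch of how a score-and-threshold flagging system of the kind it describes might work. Every term, weight, and the alert cutoff below is invented for illustration; none of it is Securly’s actual system.

```python
# Hypothetical sketch of a score-and-threshold flagging system.
# All terms, weights, and the threshold are invented for illustration;
# vendors like Securly do not disclose their actual lists or formulas.

TERM_WEIGHTS = {"depressed": 3, "hurt": 2, "gun": 5}  # invented severity weights
ALERT_THRESHOLD = 10  # invented cutoff, akin to the "seemingly arbitrary level"

def score_activity(events: list[str]) -> int:
    """Sum severity weights over every flagged term in a student's activity."""
    score = 0
    for text in events:
        for term, weight in TERM_WEIGHTS.items():
            if term in text.lower():
                score += weight
    return score

def should_alert(events: list[str]) -> bool:
    """Notify administrators (and possibly police) once the score crosses the cutoff."""
    return score_activity(events) >= ALERT_THRESHOLD

# A student researching mental health can cross the threshold without any crisis:
searches = ["why am i depressed", "can exercise help when you feel depressed",
            "songs about feeling depressed", "does being hurt emotionally pass"]
print(score_activity(searches), should_alert(searches))  # 11 True
```

The sketch illustrates the structural problem the report documents: repeated innocuous matches accumulate exactly the way genuine warning signs would, so a student quietly seeking help looks, to the algorithm, like a student in crisis.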

“More broadly, this surveillance chills students’ digital lives: not just what they say publicly, but what they allow themselves to Google in the safety of their own rooms.”

Securly is just one of countless student monitoring services, including Gaggle, GoGuardian, Social Sentinel, and Bark, that send students’ computing activity to administrators without notifying students.[19] Gaggle reported a thirteen-year-old transgender student for reflecting on a suicide attempt and subsequent therapy in a school writing assignment.[20] The psychological surveillance software retraumatized the student after school officials “freaked out” about his nonexistent mental health crisis.[21] “I was trying to be vulnerable with this teacher and be like, ‘Hey, here’s a thing that’s important to me because you asked,’” he later reflected.[22]

This is not an isolated incident. A United States Senate report warns that spyware-induced “[e]scalations and mischaracterizations of crises” can hurt students in the long run “due to stigmatization and differential treatment following even a false report.”[23] When psychological surveillance tools hallucinate crises, they worsen students’ real struggles. More broadly, this surveillance chills students’ digital lives: not just what they say publicly, but what they allow themselves to Google in the safety of their own rooms.[24]

Spying on Students, Badly

Student spyware frequently misunderstands students’ states of mind. This isn’t surprising: spyware companies don’t follow best practices for mental health screening. Instead, they appear to err on the side of caution and flag too many students. Some even have a financial incentive to do so—Gaggle sells online counseling.[25] But the tools’ tendency to “overdo it” renders them useless. As one school administrator put it:

“We tried Bark for self-harm notifications. We got tons of alerts, but they were all false positives. Even at its lowest level if a student searched for random things, it would flag them.”[26]

A teacher concurred: “You’re going to get 25,000 emails saying that a student dropped an F-bomb in a chat.”[27]

Student spyware doesn’t understand context, and it doesn’t appear to try. One school attempted to use GoGuardian to stop students from searching for the term “ass” on school computers.[28] Instead of blocking an inappropriate term, GoGuardian flagged words like “class” and “assignment.”[29] In another school, GoGuardian reported any student writing the word “toy.”[30] Social Sentinel flagged a basketball coach’s message to players as portending violence. The coach’s offense? “Workouts are Tuesday and Wednesday after school. Friday morning is shooting.”[31] The list goes on: Gaggle reprimanded a student editor for receiving another student’s email submission.[32] Bark deemed “snow,” the purple heart emoji, “pen,” and “school bus” to be drug slang.[33] As one student journalist reported, BTS fans are out of luck (fans of the Korean pop sensation favor purple hearts).[34] More recent editions of Bark’s “Teen Speak Code” for parents note that seemingly innocuous emojis can also carry inappropriate meanings: a taco emoji, for example, may refer to genitals.[35]
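The “class” and “assignment” failure is the signature of naive substring matching. Whether GoGuardian’s filter actually works this way is not publicly documented, but a minimal sketch (with an invented blocklist) shows how a filter that scans for “ass” as a raw substring flags ordinary classroom vocabulary, while a word-boundary match does not.

```python
import re

# Invented blocklist for illustration; vendors' actual lists are not public.
BLOCKED = ["ass"]

def naive_flag(text: str) -> bool:
    """Raw substring scan: the approach that flags 'class' and 'assignment'."""
    return any(term in text.lower() for term in BLOCKED)

def word_boundary_flag(text: str) -> bool:
    """Match whole words only, so embedded substrings don't trigger alerts."""
    return any(re.search(rf"\b{re.escape(term)}\b", text.lower()) for term in BLOCKED)

for phrase in ["turn in your assignment", "class starts at nine", "kiss my ass"]:
    print(phrase, naive_flag(phrase), word_boundary_flag(phrase))
# turn in your assignment True False
# class starts at nine True False
# kiss my ass True True
```

Even the stricter word-boundary version would still miss the context problem the report describes: it can tell “class” from “ass,” but not a basketball “shooting” drill from a threat.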

But when companies like Bark err on the side of over-flagging, they leave it to schools to sort sexual references from Mexican food. In one large survey, 65% of teachers reported being tasked with following up on alerts from student spyware, a duty piled on top of their actual jobs.[36] Students and administrators even report that student spyware interferes with classwork, wrongly blocking research sources and flagging writing assignments.[37] GoGuardian has flagged educational websites about Black History Month, literary classics, Bush v. Gore, and the Bible for containing keywords the company associates with sexual content.[38]

Student spyware also appears to misinterpret marginalized students disproportionately. Securly has wrongly labeled informational LGBTQ+ news and health websites as “pornography.”[39] Until recently, simply saying “gay” or “lesbian” within Gaggle’s or GoGuardian’s earshot was a no-go: the companies flagged the words as indicators that a health or safety intervention might be necessary.[40] Parents and organizations criticized this policy, and a Gaggle insider even revealed that the company’s practice of flagging LGBTQ+-related terms had routinely outed students.[41] (A third of LGBTQ+ students confirm that internet monitoring has outed them or someone they know.[42]) Initially, Gaggle doubled down on its homophobic policy.[43] It eventually relented, but it never apologized for outing students.[44] (LGBTQ+ youth’s risk of homelessness skyrockets when they are outed,[45] as does transgender youth’s risk of abuse.[46]) GoGuardian and Gaggle continue to flag normal conversations about sex, putting teens exploring their sexuality at risk of humiliation and worse.[47]

“Absent a crisis, activating the police wastes officers’ time while still putting students at serious risk of police violence.”

Calling the Cops

Student spyware routinely notifies (and effectively dispatches) police officers to students suspected of being in crisis—especially after hours, when teachers are off work and three-quarters of alerts come in.[48] Nearly half of teachers in one large-scale study reported student spyware alerts being relayed to law enforcement.[49] Baltimore public schools provide a case in point. By day, GoGuardian notifies school counselors and social workers when students are flagged as at risk of self-harm.[50] By night and on weekends, those alerts go straight to the Baltimore police.[51] Instead of encountering a trained mental health professional, students are met by police officers who show up armed and unannounced at their homes.[52] At the University of Connecticut, Social Sentinel updates campus police, not mental health workers, when spyware flags students at risk of self-harm.[53] In Florida, legislators mandated a police database of student spyware records and other school records.[54]

Police aren’t trained mental health professionals. They can and do routinely make mental health crises worse.[55] The New York City Police Department (“NYPD”), for example, routinely uses force in encounters with students: it handcuffs 10% of children during “child in crisis” interventions.[56] Worse yet, almost 95% of the NYPD’s child-in-crisis interventions involve BIPOC students, who face heightened susceptibility to police violence.[57] Police interactions also increase students’ risk of subsequent involvement with the criminal justice system.[58] And police involvement in mental health crises can even be deadly. In 2020, Salt Lake City police shot a 13-year-old with autism after his mother called 911 seeking a mental-health-informed intervention, leaving him with life-threatening injuries and a long-term physical disability.[59]

Absent a crisis, activating the police wastes officers’ time while still putting students at serious risk of police violence. Social Sentinel alerted San Antonio school administrators that a student posed a safety threat after she posted hyperbolic claims about a TV sitcom to social media.[60] School administrators weren’t fooled by “if you dont fall in love with jess again im gonna kill you.”[61] But after school hours, police are left to sort through such messages—and possibly arrive at students’ doorsteps. This has happened before: Baltimore police acknowledged making multiple visits to students’ homes for “wellness checks” in response to student spyware flags.[62] The danger of sending police to the homes of Baltimore public school students (90% of whom are Black or Latinx) can’t be overstated.

Spyware Meets ChatGPT: Tutoring Bots and Possible Entrapment

ChatGPT, a generative artificial intelligence tool that can perform internet research and draft papers, worried teachers when students started using it to do their homework.[63] Now, some schools are adopting a tutoring program built on the same technology. Khan Academy’s tutoring chatbot, Khanmigo, uses generative AI to answer students’ questions.[64] It also flags students who discuss self-harm and other forbidden topics,[65] introducing the same range of concerns associated with traditional student spyware, from escalating imagined mental health crises to botching the response to students who need support.[66] As a rule of thumb, Khanmigo suggests not saying anything to it that wouldn’t be appropriate within earshot of an elementary school classroom.[67] It sends the entire log of off-limits chats to every one of a flagged student’s teachers and guardians.[68] One can just imagine the chatbot unthinkingly entrapping high school students. Khanmigo: “What are three main themes of Dante’s Inferno?” Student: “hopelessness, torture, and hell.” Khanmigo: gotcha.

III. More of What Students Get: Bogus Threat Detectors

Noise Detectors

Tech vendors take advantage of school communities’ understandable anxiety about gun violence by selling tools that amount to overhyped noise detectors. Some of these noise detectors claim to compare a room’s sounds to the known “sound patterns” of emotions like aggression and fear; they send alerts to school administrators or police when they make a match.[69] Other noise detectors claim to learn the normal sound level of a room and send an alert when they sense sounds above the norm, potentially indicating aggression or an incident between students.[70]
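The second design, learning a room’s “normal” level and alerting on excursions above it, amounts to a running-average threshold detector. A minimal sketch follows, with an invented smoothing factor and alert margin (vendors do not publish theirs); it illustrates why the approach struggles in a school, where any legitimately loud moment clears the learned baseline just as an altercation would.

```python
# Minimal sketch of a learn-the-baseline noise detector. The smoothing
# factor and alert margin are invented; vendors do not publish theirs.

class NoiseDetector:
    def __init__(self, alpha: float = 0.05, margin_db: float = 20.0):
        self.alpha = alpha          # how quickly the learned baseline adapts
        self.margin_db = margin_db  # alert when a reading exceeds baseline + margin
        self.baseline = None        # learned "normal" sound level (dB)

    def update(self, level_db: float) -> bool:
        """Feed one sound-level reading; return True if it triggers an alert."""
        if self.baseline is None:
            self.baseline = level_db
            return False
        alert = level_db > self.baseline + self.margin_db
        # Exponential moving average: the room's "normal" drifts with recent readings.
        self.baseline = (1 - self.alpha) * self.baseline + self.alpha * level_db
        return alert

detector = NoiseDetector()
hallway = [55, 60, 58, 62, 95, 57]  # quiet chatter, then slamming lockers (95 dB)
print([detector.update(db) for db in hallway])
# [False, False, False, False, True, False] -- the lockers alone trip an alert
```

The sketch also exposes the tuning dilemma described below: lower the margin and lockers and laughter alert constantly; raise it and genuinely alarming sounds may never clear the bar.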

“Louroe’s so-called ‘aggression- and gunshot-detection’ microphones were confused by the sound of children slamming lockers in a Nevada elementary school.”

But schools are unpredictably noisy, with no “normal” sound level, and noise detectors can’t cope. Louroe’s so-called “aggression- and gunshot-detection” microphones were confused by the sound of children slamming lockers in a Nevada elementary school.[71] A test of Sound Intelligence, the software inside Louroe devices and other noise detectors, showed that it falsely flagged normal school sounds like singing, laughing, coughing, and energetic speech, but didn’t flag the sound of students screaming.[72] Noise detectors can be tuned and retuned in an attempt to get things right, but as it stands, they simply don’t work.

Despite these flaws, some noise detectors are designed to work with schools’ public address (“PA”) systems. One Ohio charter school paired a Sound Intelligence noise detector with an Axis Communications sound system.[73] When a loud sound triggers the noise detector, the PA system plays an automated warning message over the room’s loudspeaker.[74] Cue a school choir, or playful students laughing: “You must calm down or risk having law enforcement contacted. You must calm down.”[75] Such a system is worse than useless: schools and police may overreact or, alternatively, learn to ignore the system that cries wolf.

Noise-detecting hardware needlessly invites a dangerous police presence into schools. Schools from San Diego to Georgia give “administrators, campus security or police officer[s]” permission to monitor and respond to threat-detector alerts.[76] Other schools around the country have installed hardware that can automatically alert police to perceived threats.[77] But police responses push students into early encounters with the criminal legal system.

Threat Detectors Meet Disproven Psychology

In an online classroom based in Hong Kong, students attend class while software tries to read their faces: Who is learning? Who is bored? Who is angry, even dangerous?[78] So-called “emotion recognition” has made inroads in the U.S., too: Zoom nearly added such a tool to its platform in 2022,[79] and Intel has tested emotion recognition for schools.[80]

But emotion recognition doesn’t work. These tools, which claim to detect when students are bored, distracted, confused, frustrated, or angry, can do nothing of the sort: they are based on junk science with documented discriminatory effects.[81] Emotion recognition tools commonly overattribute negative emotions to Black people.[82] They commonly fail more often than they succeed, because people use expected expressions (like smiling when they are happy) only 20% to 30% of the time.[83] People’s emotional indicators—their body language, their tone of voice—vary across cultures and for people with neurodivergences and disabilities.[84] A student’s facial tics, for example, might lead an emotion recognition system to wrongly determine that the student is distracted and disengaged from a lesson, or cheating on an exam.[85] Educators and administrators may respond to this dubious information by punishing students or identifying them, in the words of one emotion detection developer, as “problem pupils.”[86] AI companies abroad have already begun developing emotion recognition systems for the express purpose of assessing the mental wellbeing and threat level of students.[87] Fortunately, bogus emotion-detecting tech is still rare in schools, and it should stay that way.

IV. The Youth Mental Health Crisis and the True Cost of Student Spyware and Threat Detectors

As U.S. Surgeon General Vivek Murthy puts it, the state of American children’s mental health constitutes a public health crisis.[88] The Centers for Disease Control and Prevention surveys a nationally representative sample of high school students every two years. In 2021, 42% of high school students reported persistent feelings of sadness or hopelessness.[89] This astonishing figure cannot be written off as a temporary effect of the COVID-19 pandemic: 37% of students already reported persistent sadness or hopelessness in 2019.[90] In 2021, 18% of students made a suicide plan, and 10%—one in ten students—attempted suicide.[91] Students’ self-reports are confirmed by pediatric emergency departments, which report an increase in drug-poisoning and self-harm-related visits.[92]

In the face of this public health crisis, students need proven, efficacious, humanely administered behavioral and mental health screenings and services—not expensive tech toys that treat students as safety threats. Bogus school-safety tech is expensive: contracts reveal costs of tens or hundreds of thousands of dollars per year.[93] This money would be better spent on increasing teacher salaries, keeping class sizes small, and other endeavors that improve student-teacher bonds—a key factor in ensuring that students in need of help are not overlooked and receive support sooner rather than later.[94] Direct and opportunity costs aside, spying hurts students. It manufactures crises where none exist, botches the response to students’ real mental health needs, outs LGBTQ+ students, and dispatches police where they don’t belong. In a world where most of us find the information we need online, spying cuts students off from resources: students report avoiding researching mental health online for fear of monitoring.[95] For students’ sake, psychological surveillance has got to go.
