How Tech Treats Students With Disabilities Like Criminals

By Evan Enzer and Sarah Roth

When the Americans with Disabilities Act (ADA) went into effect 32 years ago, there was optimism that technology could close the education gap for students with disabilities and other special needs. The ADA went far beyond visible disabilities, promising life-changing protections to the neurodivergent.

We, as neurodivergent people, know how educational technology can change lives—and how word processors, spellcheck, and self-paced learning can let our brains thrive in ways traditional schooling never could. But we also see how emerging technology threatens to do the reverse, making school a harsher, less accessible environment.

Today, schools across the country increasingly turn to techno-solutionist tools that harm students with invisible disabilities. Crude risk assessment tools mistake neurodivergence for a danger to ourselves and others. Social media monitors flag posts about mental health and penalize students who need psychological evaluations as part of their individualized learning assessments.

Remote and computer proctoring programs with biometric monitoring capabilities became a mainstay during the COVID-19 pandemic. These programs flag students for cheating when they look away from their screens or make other “suspicious” movements. This poses real dangers for people with disabilities. The vocal and facial expressions of a student with a disability may differ from the “normal” baseline a software program compares them to, mislabeling their affect and singling them out for discipline.

In many cases, remote proctoring programs do not even try to accommodate disabilities, denying test-takers bathroom breaks, time away from their computer screen, scratch paper, and dictation software. This exacerbates disabilities, causes stress, and forces test-takers to rush through the most important tests of their lives.

This monitoring drives neurodivergent students into the shadows, deterring them from sharing their feelings, degrading their mental health, and reducing their willingness to seek help.

Seeking cognitive evaluations and speaking openly about mental health should be encouraged as healthy behaviors, not punished. Like many with learning disabilities, we remember driving from therapist to therapist, assessment to assessment, desperately trying to uncover the correct diagnosis. We remember the sting and stigma when teachers singled us out for our spelling, our reading, or our inability to sit still.

And we’re far from alone.

Over 20 percent of Americans have a mental illness, and around 10 percent have a learning disability. For nearly all of us, neurodivergence is nothing to be concerned about, yet school surveillance technology treats our differences as a threat. Much like the shame we felt when teachers singled us out, surveillance tech that targets neurodivergence hurts students.

Rather than being some magic crystal ball, the algorithms schools use represent little more than bias in a box. These algorithms crudely decide who is and is not “normal,” punishing students simply because their brains work differently. But the injustice doesn’t end there.

Making things worse, there’s been an explosion of biometric policing technology in the last 30 years, and the same tools that police use in public are working their way into classrooms.

For example, emotion recognition and attention-detecting software monitor students’ biometric signals (e.g., motor and vocal tics) and compare them to a baseline of behavior deemed “normal” or favorable, in a problematic attempt to track students’ emotions and focus.
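To see how crude that logic can be, consider a toy sketch of deviation-from-baseline scoring. This is a hypothetical illustration of the general technique, not any vendor’s actual code; the data, function name, and threshold are all invented for the example. The tool has no concept of disability: it only measures distance from “normal” and flags whoever is far from it.

```python
# Hypothetical sketch of baseline "attention" scoring. An illustration
# of the general deviation-from-normal technique, not any real product.
from statistics import mean, stdev

# Assumed example data: per-student counts of flagged movements (fidgets,
# gaze shifts) during an exam, drawn from a mostly neurotypical class.
baseline = [3, 4, 2, 5, 3, 4, 3, 2, 4, 3]

def flag_student(movements: float, threshold: float = 2.0) -> bool:
    """Flag any student whose movement count deviates from the class
    baseline by more than `threshold` standard deviations."""
    z = (movements - mean(baseline)) / stdev(baseline)
    return abs(z) > threshold

# A student with motor tics might register 12 movements. The score says
# nothing about cheating or distress, only distance from the majority.
print(flag_student(12))  # True: singled out for discipline
print(flag_student(4))   # False: within the "normal" band
```

Under this kind of scheme, a student whose tics, stims, or eye movements differ from the majority baseline is flagged by construction, which is precisely the bias-in-a-box problem described above.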

Some EdTech software already includes this technology. In 2017, a French school introduced the Nestor EdTech platform, which comes equipped with attention-monitoring capabilities, into its classes. And in April, Zoom considered adding emotion-detecting artificial intelligence (AI) to its platform, which educators rely on heavily for remote learning.

We are no strangers to the harmful effects of remote and computer proctoring.

We have hurried through exams because our proctoring software did not allow bathroom breaks. We’ve experienced increased anxiety that biometric monitoring software would flag our uncoordinated eye movements, auditory processing habits, fidgeting, and uncontrollable twitches as “cheating.” We have been told to sit for exams for inordinate amounts of time, up to 10 hours a day for two days. And we’ve chosen not to seek accommodations for important tests, or not to sit for those exams at all, because the disability accommodations process was overly burdensome. In some cases, this affected our educational choices and cost us job opportunities.

Thirty-two years later, the full promise of the ADA remains unfulfilled. Worse, civil rights protections appear to be falling only further behind.

As we look to the decades ahead, lawmakers and regulators cannot simply rest on their laurels. Those in power have no excuse for ignoring the threat, and those designing technology have no excuse for ignoring how their tools can negatively affect people with disabilities. We need protections for the algorithmic age: a new set of ADA safeguards that protect students from the ever-evolving barriers to public life.
