Gotham Gazette - Surveillance and the City: Online Bar Exam, Remote Learning Fuel In-Home Surveillance

Standardized testing has never been fun for students, but in the age of COVID-19, it’s positively nightmarish. As the pandemic forces students and teachers around the globe to stay home, every part of education has suffered, but test administration has completely collapsed. Recently, despite public outcry, more than 10,000 law school graduates sat for the bar exam, putting their careers and livelihoods on the line. The bar exam used software that was, and remains, shoddy, invasive, and likely biased.

The use of artificial intelligence (A.I.) in testing was growing quickly in the United States even before COVID-19, and so were the errors. As of 2019, at least 21 states used natural language processing to grade student essays. The resulting marks certainly were artificial, but they were far from “intelligent.” Some students found themselves inexplicably moved from the honor roll to academic probation. Meanwhile, when other students realized that they were being judged by simplistic algorithms, they submitted strings of buzzword-laden nonsense and received perfect grades.

In contrast, colleges and universities mainly relied on A.I. to hunt down plagiarism prior to the pandemic, hiring third-party student surveillance platforms to look for copied language. But in the first weeks of the shutdown, newly-online institutions invested heavily in student surveillance, including behavior tracking, restrictive software, and unproven forms of A.I. As of April, a majority of colleges and universities used student surveillance services, with nearly a quarter planning on doing so in the future, according to a survey by Educause.

Many K-12 schools are now following suit. Human proctors are being replaced in some cases with spyware that tracks test-takers, their homes, and even their loved ones. By claiming unprecedented access to students’ lives, both their digital devices and their physical spaces, testing officials are making students pay for their degrees with their privacy.

Remote student surveillance broadly fits into three categories. First, spyware monitors a student’s computer, identifying any other applications in use. This spyware can log every keystroke and mouse click a student makes, as well as scan the software already installed on a student’s machine. Second, many schools use a proprietary “lockdown browser” that provides heavily restricted internet access. Third, and most disturbing, some educators use a student’s own webcam to conduct persistent video and audio surveillance. These recordings not only capture the intimate confines of a student’s home, but they are frequently reviewed by biased computer vision software for signs of cheating.

The trend is sadly not confined to educational programs. Aspiring attorneys in New York provided a terrifying test case and a cautionary tale of how this same technology can be misused as part of government licensing exams. The New York State Board of Law Examiners used a suite of products from ExamSoft for its first online bar exam. First, students were forced to verify their identity using ExamID, a proprietary facial recognition tool. Afterward, they had to run ExamMonitor, deeply invasive software that records test-takers through their own camera and microphone, purporting to use artificial intelligence to detect “suspicious behavior.”

The bar’s use of facial recognition is disturbing and must not be repeated. The technology is biased, error-prone, and antithetical to the values that the legal profession claims to uphold. In some cases, employers have found that facial recognition simply would not recognize Black and transgender applicants. But facial recognition is not unique: every form of artificial intelligence is undermined by bias.

As seen with the United Kingdom’s A-levels fiasco, A.I. is shaped by countless human decisions, replicating and frequently amplifying the biases of the programmers who design the model and select its training data.

ExamSoft has released minimal information about its facial recognition and other artificial intelligence models, not even basic accuracy and error rates. Examinees and members of the public have no way to evaluate the reliability of such a testing scheme or how it reflects on test-takers’ fitness to practice law.

Even without A.I., recording footage of examinees’ homes is deeply invasive, prying into their most intimate spheres of life. The risks are far from theoretical. For example, proctors for the American Board of Surgery’s General Surgery Qualifying Exam reportedly contacted examinees on social media after the test.

Additionally, ExamSoft’s privacy policy states that it may share data with law enforcement, transforming the bar exam into a warrantless wiretap of test-takers’ homes. It is disturbing and deeply ironic that a cohort of lawyers was tested on the protections against government searches at the same moment its members were forced to use software that eviscerates those rights.

The bar exam fiasco is a cautionary tale of what is to come. It is essential that future exams of this kind offer alternative pathways, such as the requested “diploma privilege,” instead of subjecting law school graduates to unwarranted, unethical, and unnecessary forms of surveillance. Such measures can mitigate the fallout from comparable testing calamities, but they will be little comfort for the millions of other students still facing the threat of online testing and A.I. surveillance.

***
Albert Fox Cahn is the founder and executive director of the Surveillance Technology Oversight Project (S.T.O.P.) at the Urban Justice Center and a fellow at the Engelberg Center for Innovation Law & Policy at N.Y.U. School of Law. He writes the monthly Surveillance and the City column at Gotham Gazette and is on Twitter @FoxCahn.

Alice Beck holds master’s degrees in International Law and International Humanitarian Law and Human Rights.