The software behind the curtain: New York’s first attempt to let its people see the workings of government algorithms is woefully incomplete

“Pay no attention to the algorithm behind the curtain.” This updated warning isn’t for Dorothy, but for those who walk a far more inscrutable road through the land of municipal software and computer-aided decision-making formulas. These opaque systems increasingly control much of modern life, from where our children learn to how the streets are plowed, and even who is arrested. But at the very moment these systems have taken center stage for city officials, becoming the focus of municipal procurement and investment, they have been hidden from view. Until now.

[Photo: An NYPD surveillance camera in New York’s Times Square during the coronavirus pandemic, part of the camera network the NYPD is using to make sure people obey rules against congregating. (Luiz C. Ribeiro/for New York Daily News)]

Last week brought us a peek behind the curtain here in New York City, but not exactly a clear view. That’s when City Hall released its first public report detailing the automated systems used by city agencies. Only there weren’t many details. The report, months in the making, listed just 16 automated systems, many of which had already seen significant public pushback.

These tools include the NYPD’s facial recognition software, which recently came under international condemnation from Amnesty International, our organization S.T.O.P. and the Ban the Scan coalition. Another equally controversial NYPD tool: ShotSpotter, which uses microphones in communities of color to supposedly detect gunshots. Only it’s unclear if it actually works, and ShotSpotter faces a lawsuit for allegedly fabricating evidence.

[Photo: An NYPD sign on the Brooklyn Bridge warns people that they are under video surveillance, Jan. 17, 2021. (Theodore Parisienne/for New York Daily News)]

The most frustrating part is that this report wasn’t optional. This wasn’t information that city officials volunteered out of the goodness of their hearts. It was information they were required by law to report. Yet the city didn’t even list all the tools it uses.

The missing systems run the gamut. Many are relatively benign, raising the question of why the city would even try to hide them from public view. These include the city’s eDiscovery platform, the Department of Education’s student tracking database, the city’s digital vendor management system and 311’s speech recognition tools.

But other undisclosed systems are far more concerning.

For children caught up in New York’s criminal justice system, automated tools determine much of their fate. This includes software that decides a child’s level of supervision and what outcomes to recommend to the court. New York’s probation tools may have been better received than those that have failed in other cities, like the widely reviled COMPAS software, which was found to be biased.

But no matter how good or bad these algorithms are, their consequences are far too significant to be hidden from public view. The city reached out to the public when building its probation tools, so why hide them now?

Crucially, automated tools also play a gatekeeping role in who gets access to public benefits. Benefit screening has always meant the difference between eviction and a roof over New Yorkers’ heads, but in the middle of a pandemic, the consequences are truly life and death. Again, no matter how well-designed or well-intentioned the platform might be, its decisions are far too weighty to be hidden from the public.

The city’s benefits screener was redesigned with public feedback. Why wouldn’t the city now report this tool?

These are just a few of the many systems we know were left out of the city’s report. The disturbing truth is that there are certainly more. The systems we’ve listed had at least some prior public disclosure; others may be purchased and deployed entirely in secret.

New Yorkers need to know about the tools that control how we are policed and served. We need to know if these tools can cost our children their freedom. We need to know if a tool could threaten our parental rights.

These nightmare scenarios aren’t hypothetical; they’ve happened before. In 2013, tech CEO-turned-Michigan Gov. Rick Snyder decided to use AI to head off the risk of false unemployment claims. His solution: the Michigan Integrated Data Automated System (MIDAS). At first, the then-governor hailed the system as a success for raising tens of millions of dollars in new fines and fees. Then the horror stories came: tens of thousands of residents wrongly judged guilty of fraud by an algorithm they could never see, driven to bankruptcy or worse.

Just one faulty system can imperil the futures of thousands, maybe millions of New Yorkers.

Hiding our city’s algorithms from the public is no different from trying to hide our laws from public view. These are the rules that determine how our city is governed, and we deserve to see them. Like the rest of the city’s policies, New York City’s automated decision tools — all of them — must be made public.

Manis is research director at the Surveillance Technology Oversight Project (S.T.O.P.), a New York-based civil rights and privacy group. Cahn is the founder and executive director of S.T.O.P. and a fellow at the Engelberg Center for Innovation Law & Policy at N.Y.U. School of Law.