NY Daily News - Ban ‘digital stop-and-frisk’: Facial recognition technology too prone to errors

New Yorkers marched in the streets last summer demanding an end to biased law enforcement practices. Our government responded by passing new laws to root out systemic racism and unjust policing practices.

A debate is raging in New York right now about one of these unjust practices: facial recognition. Studies show this technology is inaccurate and error-prone, especially when dealing with Black, Latinx or gender nonconforming people. It exacerbates bias in law enforcement — and we must prevent its use in New York.

Let’s be clear: facial recognition errors are not some abstract harm. A facial recognition error could lead to a New Yorker being wrongfully stopped on the side of the road, taken from his or her family, put in a cage, and charged with a crime that he or she did not commit. Facial recognition means facing the risk of a police stop at the very moment that we know just how dangerous a police encounter can be. A facial recognition error could quickly escalate into not just a pair of handcuffs, but even a knee to the neck.

We ought to celebrate Minneapolis for protecting civil rights. When Minneapolis became the 14th city to outlaw facial recognition, it was particularly poignant. In the city where George Floyd was killed, biased police technology that put residents at risk was outlawed. New York needs to do the same.

The NYPD has proven that it can’t be trusted to use this sort of invasive spy tool. Even when used properly, facial recognition has real risks, but the NYPD has not used its system properly — not even close. In the past, this has meant officers ran facial recognition searches on everything from celebrity look-alikes to hand-drawn sketches. Can’t find a facial recognition match for someone who looks like the subject? That’s fine, just find an image of an actor who looks similar, as the NYPD has done.

While the NYPD did eventually prohibit look-alike searches, it still encourages officers to alter images, changing elements before running a search. If a suspect’s eyes are closed, the NYPD makes them appear open. If the mouth is open, officers paste it closed. Even entire jawlines have been copied and pasted from Google image searches when police images only capture the suspect’s profile.


In short, it’s complete pseudoscience. The way the NYPD uses facial recognition has all the scientific validity of now-discredited technologies like bite-mark and hair sample analysis. This evidence is putting New Yorkers, particularly New Yorkers of color, at risk of wrongful conviction. With prior forms of forensic science, we had to wait decades to exonerate the innocent, but we shouldn’t have to wait that long to ban facial recognition.

This is why we came together with Amnesty International and the Ban the Scan coalition recently to call for New York to immediately outlaw government use of facial recognition. This is work that both Albany and City Hall must take up.

On the state level, New York should expand last year’s landmark ban on facial recognition in schools by passing Senate Bill S79, which would ban all law enforcement use of facial recognition across New York State.

Working with Amnesty International, we want New York to go from one of the leading users of facial recognition to the most prominent opponent. Together, we can work with activists and lawmakers across the world, standing in solidarity against this terrifying technology.

The truth is that we have been down this road before. For years, New Yorkers accepted the lie that stop-and-frisk would protect us. Many believed the lies that the abusive tactic could be targeted and refined. Eventually, after hundreds of thousands of New Yorkers were traumatized by the tactic, we finally admitted our mistake. We can’t afford to take so long to fix another big mistake.

Hoylman is a state senator from Manhattan. Cahn is the founder and executive director of Surveillance Technology Oversight Project (S.T.O.P.), a civil rights and privacy group, and a fellow at the Engelberg Center for Innovation Law & Policy at N.Y.U. School of Law.