
November 11, 2020 

Hon. Laurie A. Cumbo
New York City Council Majority Leader
250 Broadway, Suite 1833
New York, NY 10007
via email

RE: Int. 1894-2020 - Sale of Automated Employment Decision Tools.

Dear Council Member Cumbo:

We, the undersigned civil rights, labor, and civil society organizations, commend you for your leadership in tackling the discriminatory threat of automated employment decision tools. We urge the Council to require employers and hiring technology vendors to proactively measure and remediate disparate impacts and to consider less discriminatory alternatives. While we are glad to see this issue getting much-needed attention, we are quite concerned that the current language of Int. 1894 could prove counterproductive in the fight against algorithmic discrimination.

We have flagged a number of concerns with the existing language below, and we’d welcome the opportunity to meet with you and your staff to discuss potential changes:

●      Definition of “automated employment decision tool”: Currently, this definition is underinclusive, capturing only a small portion of the technologies and processes that are or could be used in employment settings. We would recommend a more expansive definition that captures the full range of hiring technologies deployed in New York City, including applicant tracking systems, digital versions of psychological and personality assessments, and other complex procedures that do not fit cleanly within Int. 1894’s current scope.

o   One possible formulation is: “Automated Employment Decision Tools are any software, system, or process that aims to automate, aid, or replace human decision-making relevant to employment. Automated Employment Decision Tools can include tools that analyze datasets to generate scores, predictions, classifications, or recommended action(s) that are used by employers to make decisions regarding employees, contractors, and job candidates.”[1]

●      Definition of bias audit: Today, relatively little is publicly known about hiring technology vendors’ auditing processes.[2] Existing law and federal agency guidance also do not provide clear and robust standards for reviewing the discriminatory impacts of hiring tools and processes.[3] We are concerned that the current language would allow employers and vendors to comply with the law by conducting a pro forma, internal audit, without any meaningful opportunity for third-party review. In addition to mandating that annual bias audits be conducted by independent third parties, we recommend that workers be given the opportunity to audit any hiring process for bias. The Committee will need to work together with a range of stakeholders to define auditing procedures that include statistical testing, accessibility testing, and proactive consideration of less discriminatory alternatives.

●      Thorough disparate impact audits must involve both vendors and employers: Compliance with §8-107 cannot be established through a pre-sale audit alone. That law dictates that disparate impacts be measured with respect to the relevant applicant pool or available workforce for a particular job, and such measurement requires data from employers. Similarly, the “business objective” defense turns on the tool’s relationship to the particular job and employer.

●      Liability for biased tools: Currently, no provision of this bill would penalize the sale or use of an Automated Employment Decision Tool that is found to be biased. While such a tool may create liability for the vendor and employer under existing New York Human Rights Laws, we urge you to also establish liability under this bill.

●      Definition of “employment decision”: Currently, this definition is underinclusive, capturing only a small subset of the employment decisions that are made by automated employment decision tools.

●      Private right of action: We fear that even the best possible automated employment decision tool law will be little more than a dead letter in the absence of a private right of action. In addition to the existing civil penalties, we would urge you to include a private right of action for any employee, contractor, or applicant who is subjected to a biased automated employment decision tool.

●      Attorneys’ fees: To ensure that all New Yorkers are able to avail themselves of a private right of action under this law, we would also urge you to provide attorneys’ fees for prevailing plaintiffs. This will ensure that low-income employees, contractors, and job applicants will be able to have their day in court.

●      Non-exclusivity: We urge you to clarify that compliance with Int. 1894 does not preclude a private right of action or agency enforcement action under any other provision of New York City law. In short, compliance with Int. 1894 should be a floor, not a ceiling, for compliance with non-discrimination protections.

●      Reporting: We urge you to require mandatory reporting to the New York City Commission on Human Rights, disclosing the results of any Automated Employment Decision Tool audits. The Commission should provide audit results to the public to the full extent possible, as well as maintain a “banned list” of any Automated Employment Decision Tool found to be biased in the prior year.

●      Government hiring: We urge you to ensure that this legislation applies with full force to any Automated Employment Decision Tool used by New York City agencies. Government hiring must not be held to a lower standard of fairness than the private sector.

To reiterate, we are grateful for your leadership on this matter, and we hope that we can work with your office to draft language that ensures the spirit of this legislation is fully realized in the years ahead. Unfortunately, these concerns make it impossible for us to support passage of Int. 1894 as currently drafted.

Sincerely,

AI Now Institute at NYU
BetaNYC
Cryptoharlem
Data for Black Lives
The Legal Aid Society of NYC
NAACP Legal Defense and Educational Fund
National Employment Law Project
New York Civil Liberties Union
New York Communities For Change
OceanHill Brownsville Alliance
S.T.O.P. - The Surveillance Technology Oversight Project
Upturn

CC:      Int. 1894 Co-Sponsors
New York City Council Technology Committee Members

[1] See Rashida Richardson, ed., “Confronting Black Boxes: A Shadow Report of the New York City Automated Decision System Task Force,” AI Now Institute, December 4, 2019, p. 20, https://ainowinstitute.org/ads-shadowreport-2019.htm.

[2] See Manish Raghavan et al., “Mitigating Bias in Algorithmic Hiring: Evaluating Claims and Practices,” https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3408010.

[3] For example, the Uniform Guidelines on Employee Selection Procedures (UGESP) suggest using a four-fifths impact ratio as a general rule for measuring disparate impact; both the EEOC and OFCCP use additional measures, such as statistical significance tests, when investigating disparate impacts; and courts have refused to adopt a single arithmetic measure of discrimination, acknowledging that the right measurement depends on the context.
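As a purely illustrative note on the four-fifths rule referenced above, the following sketch shows how the impact ratio it describes is typically computed; the group sizes and selection rates below are hypothetical and are not drawn from the UGESP text itself.

\[
\text{impact ratio} \;=\; \frac{\text{selection rate of the less-favored group}}{\text{selection rate of the most-favored group}} \;=\; \frac{30/100}{50/100} \;=\; 0.60 \;<\; 0.80
\]

Under the four-fifths rule of thumb, a ratio below 0.80, as in this hypothetical, would generally be treated as evidence of adverse impact warranting further scrutiny.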

 
