Automated Decision System Regulation Survey 2023

 

I. Introduction

How do employers sift through enormous piles of resumes? How do schools, lenders, parole boards, and government agencies evaluate hundreds or thousands of applicants? Increasingly, they automate high-volume decisions. Schools and employers use Automated Decision Systems (“ADS”) to rank admissions and job candidates. Credit agencies use ADS to assess potential borrowers. Government agencies use ADS to determine individuals’ eligibility for benefits.

Some of these ADS automate calculations previously tabulated by office workers using spreadsheets. Others use machine learning (a type of artificial intelligence) to make decisions patterned on an organization’s previous decisions or other historical data. Some ADS replace human decision-making; others supplement it.
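
To make the distinction concrete, the sketch below contrasts the two forms in a few lines of Python. Everything in it is hypothetical and purely illustrative; the applicant fields, weights, and cutoff are invented and are not drawn from any system covered by this survey.

    # Illustrative only: hypothetical fields, weights, and cutoff.
    from dataclasses import dataclass

    @dataclass
    class Applicant:
        years_experience: float
        credential_score: float   # e.g., a normalized test or license score
        interview_score: float

    def spreadsheet_style_score(a: Applicant) -> float:
        """A fixed, human-authored weighting, like a clerk's spreadsheet formula."""
        return 0.5 * a.years_experience + 0.3 * a.credential_score + 0.2 * a.interview_score

    def learned_score(a: Applicant, weights: dict) -> float:
        """A machine-learning ADS uses the same arithmetic, but the weights are
        fit to an organization's past decisions rather than chosen by a person."""
        return (weights["years_experience"] * a.years_experience
                + weights["credential_score"] * a.credential_score
                + weights["interview_score"] * a.interview_score)

    applicant = Applicant(years_experience=4.0, credential_score=0.8, interview_score=0.7)
    fitted_weights = {"years_experience": 0.6, "credential_score": 0.1, "interview_score": 0.3}

    # In a fully automated system, no human reviews either score: the cutoff is the decision.
    print(spreadsheet_style_score(applicant) >= 2.5)        # False -> rejected
    print(learned_score(applicant, fitted_weights) >= 2.5)  # True  -> advanced

In either form, the number that comes out of the formula, rather than a human judgment, determines the applicant’s outcome.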

But whether simple or sophisticated, ADS have outpaced laws set up to protect job candidates, students, prospective borrowers, employees, benefits recipients—in short, citizens—from discrimination and other civil rights violations:

  • They can share individuals’ sensitive information with third parties, even law enforcement.

  • They undermine accountability. ADS frequently cut off due process, denying individuals explanations of decisions and avenues for redress. ADS licensed from private companies may even be hidden from the agencies using them under trade secret law.

  • Even transparent ADS frequently perpetuate bias by relying on racist, sexist, and otherwise biased historical data (see the sketch after this list).
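
A brief, hypothetical sketch illustrates the mechanism: a model fit to an organization’s past decisions reproduces the disparities embedded in those decisions, even though every step of the calculation is open to inspection. The data and zip codes below are invented; the zip code stands in for the kind of proxy variable that often carries bias in practice.

    # Illustrative only: invented historical data, with zip code acting as a
    # proxy for a protected characteristic.
    from collections import defaultdict

    # Past hiring decisions: (zip_code, hired). One neighborhood was hired at
    # an 80% rate, the other at 20%.
    history = ([("10001", True)] * 80 + [("10001", False)] * 20
               + [("10456", True)] * 20 + [("10456", False)] * 80)

    # "Training" here is simply learning the historical hire rate per zip code.
    counts = defaultdict(lambda: [0, 0])   # zip code -> [hires, total applicants]
    for zip_code, hired in history:
        counts[zip_code][0] += int(hired)
        counts[zip_code][1] += 1

    def predicted_hire_probability(zip_code: str) -> float:
        hires, total = counts[zip_code]
        return hires / total

    # The fully inspectable "model" now recommends applicants from one zip code
    # at four times the rate of the other.
    print(predicted_hire_probability("10001"))   # 0.8
    print(predicted_hire_probability("10456"))   # 0.2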

Legislators are catching up to the need to regulate ADS to prevent these harms. This database surveys states and key cities to identify laws relating to ADS that were proposed or enacted during those jurisdictions’ 2023 legislative sessions.[1] (There is currently no comprehensive federal legislation addressing the use and development of AI, though the Biden administration has published a Blueprint for an AI Bill of Rights and, as of September 2023, a bipartisan group of U.S. senators is drafting a plan to regulate AI.)

The database can be downloaded at the end of this page; summaries of the legislation appear below.

II. Enacted and Proposed Legislation

ADS bills typically take one of four forms: comprehensive privacy bills, sector-specific regulation, transparency and accountability requirements, and task force formation.

  • Comprehensive Privacy Bills. Twelve states (CA, CO, CT, DE, FL, IA, MT, OR, TN, TX, UT, and VA) have enacted comprehensive privacy laws since 2020. These frameworks limit ADS and other systems’ use of “sensitive data,” which typically includes biometric data, children’s data, and health data, and regulate automated processing of personal data to evaluate individuals (typically referred to as “profiling”). Comprehensive privacy bills typically grant consumers the right to opt out of certain automated decisions (e.g., the California Privacy Rights Act, the Colorado Privacy Act, and the Connecticut Data Privacy Act). They often also require notice prior to the collection of sensitive data (e.g., Utah’s Consumer Privacy Act and the Oregon Consumer Privacy Act). Some legislation additionally requires that data managers disclose how data is used and whether it is shared with third parties (e.g., Massachusetts S.2687/H.4514).

  • Sector-Specific Regulation (e.g., criminal procedure, traffic, employment). This regulatory approach limits itself to a particular sector in which ADS are deployed. Examples include employment regulation (e.g., New York City’s Automated Employment Decision Tools law); education regulation (e.g., New York’s moratorium on facial recognition in schools); automated traffic enforcement (e.g., Alabama SB 237 and Illinois SB 3423); insurance regulation (e.g., Colorado SB21-169); consumer finance regulation (e.g., New Jersey S1943); criminal justice regulation (e.g., Idaho Code 19-1910); and healthcare laws (e.g., Maine S656).

  • Transparency and Accountability. These laws require ADS users to be accountable to the public, and to impacted communities in particular. Most restrict government use and procurement (i.e., purchase) of ADS. Some do so by requiring inventories of the government ADS currently in use (e.g., Vermont H.236). Others require approval before the acquisition and use of surveillance technology, along with public disclosure of the type of surveillance deployed (e.g., San Francisco’s Stop Secret Surveillance Law). Some require impact assessments that anticipate negative impacts and unintended consequences of ADS before they are acquired (e.g., California AB-331). Others require agencies to produce public accountability reports (e.g., Washington SB 5116).

  • Task Forces. Some states and several cities regulate ADS by placing them under the supervision of task forces (variously referred to as working groups, boards, and councils). These groups are typically composed of government, academic, industry, and community representatives who review requests for ADS implementation and provide oversight of ADS usage (e.g., the Colorado Facial Recognition Task Force, New York City’s Automated Decision Systems Task Force, and Alabama’s Council on Advanced Technology and Artificial Intelligence).

III. Headwinds to Legislation

Many legislative attempts to stop ADS-related harms have failed. Some failed bills appear to be victims of competing legislative priorities and limited legislator time, with legislatures promising to revisit bills in the upcoming legislative session. Other bills fall to stiff opposition from industry groups and other parties resisting regulation. We summarize this pushback below to prepare proponents of ADS bills.

  • Overreach, Uncertain Reach. Proponents of ADS bills should specify what counts as a regulated ADS, such as by including a list of representative covered and non-covered tools.

    Why? Opponents of ADS bills frequently contend that the bills sweep too broadly or leave their scope uncertain. For example, they have argued that “automated decision systems” could include ordinary “spreadsheets” and that ambiguously worded laws create uncertainty and confusion.

  • Redundancy. Advocates should explain why existing law is insufficient to regulate ADS.

    Why? Opponents of regulation argue that civil rights violations due to ADS are already prohibited under various federal and state laws (e.g., ADS-specific anti-discrimination laws).

  • Conflicts with Privacy, Cybersecurity. Proponents of ADS bills that increase accountability to regulators and the public should be prepared to address claims that increasing accountability increases the collection and reporting of sensitive data.

    Why? Industry groups sometimes contend that laws requiring accountability reporting and related data retention force companies to disclose and expose sensitive data, or to disregard privacy laws’ data minimization requirements.

  • Compliance Costs. Proponents of ADS bills should be prepared to address traditional industry objections to regulation.

    Why? Critics of ADS bills often repeat familiar objections to regulation: that it stifles innovation, that laws requiring impact assessments impose heavy compliance costs, and that disclosure requirements expose proprietary business information.

 
 

[1] This database covers a representative sample of laws in depth. See EPIC’s “The State of State AI Laws: 2023” for additional laws governing ADS.

 

Download The Database

 

ADS Legislation