HUMAN CONTROL OF AI ACT

Session: 104th General Assembly
Year: 2026
Bill #: HB4980
Category: Artificial Intelligence and Data Centers
Position: Oppose
Mandate? Yes
Revenue Loss?
Authority Preemption?


Summary as Introduced

Creates the Meaningful Human Control of Artificial Intelligence Act. Provides that a public employer, including a contractor or subcontractor of the public employer, shall not use or apply, or authorize any procurement, purchase, or acquisition of any service or system using or relying on any automated decision-making system, directly or indirectly, without meaningful and continuing human review when performing specified functions. Provides that an employer seeking to use or apply an automated decision-making system shall conduct an initial impact assessment and additional impact assessments at least once every 2 years and prior to any material changes to the automated decision-making system. Sets forth provisions concerning anti-retaliation; enforcement by the Department of Labor; and private rights of action.

Staff Analysis

The introduction of House Bill 4980, known as the Meaningful Human Control of Artificial Intelligence Act, marks a significant legislative effort in Illinois to regulate the intersection of public administration and emerging technology. At its core, the proposed law seeks to prevent the unchecked delegation of government authority to automated decision-making systems. By mandating that humans remain "in the loop," the bill aims to preserve accountability and transparency in functions ranging from public employment to the distribution of essential social services. For county governments, which often serve as the primary interface between citizens and the state, this legislation introduces a complex set of new mandates, operational hurdles, and legal liabilities.

The scope of the Act is broad, defining automated decision-making systems as any software that uses algorithms or artificial intelligence to automate, support, or replace human judgment. Under this framework, a county government or its contractors would be prohibited from using such systems unless they ensure meaningful and continuing human review. This requirement is not a mere formality; it dictates that the human overseer must have the technical understanding to interpret the system's logic and the actual authority to override its conclusions. Furthermore, the bill explicitly forbids the use of these technologies to "predict" an individual’s personality or emotional state, a provision that targets the growing use of AI in hiring and psychological assessment.

For county administrative offices, the most immediate impact would be felt in the procurement and implementation phases of new software. Before any automated system can be deployed, the county would be required to perform a comprehensive initial impact assessment. This assessment must document the specific objectives of the system, the data sets used to train it, and a rigorous analysis of whether the tool could result in discriminatory outcomes. These reports would not be one-time requirements, as the bill mandates updated assessments every two years to account for "algorithmic drift" or changes in how the software processes local data. This creates a perpetual cycle of technical auditing that many smaller or mid-sized counties may not currently have the internal expertise to manage.

Labor relations and human resources within county governments would also face a fundamental shift. The Act stipulates that no public employer may implement an automated system that affects workers without first notifying and negotiating with relevant labor organizations. This effectively makes the use of AI a mandatory subject of collective bargaining. If a county department wishes to use software to track employee productivity or screen job applicants, it must prove to union representatives and the state that these tools do not infringe upon existing employee rights or safety standards. This layer of oversight is intended to protect public servants from "management by algorithm," where workers might otherwise be penalized by software they do not understand and cannot challenge.

The administration of public benefits represents another area of high stakes under HB 4980. Counties often manage local housing, health, and emergency assistance programs that rely on automated logic to determine eligibility. The Act classifies these as high-impact areas, requiring that any decision to deny, reduce, or terminate benefits be personally verified by a human employee. This employee must ensure the accuracy of the underlying data and be prepared to provide a clear, written explanation of the decision to the affected resident. While this provides a vital safeguard for vulnerable populations, it also risks slowing down the speed of service delivery, as automated systems that once processed applications in seconds will now require manual intervention at key decision points.

Finally, the legislation introduces a significant new layer of legal risk for county boards and taxpayers. By establishing a private right of action, the bill allows individuals who believe they have been harmed by a non-compliant automated system to sue the county for damages and attorney’s fees. This creates a powerful incentive for strict compliance but also opens the door to litigation over software glitches or "black-box" decisions that were not properly vetted. Consequently, Illinois counties would need to invest heavily in legal and technical compliance frameworks to avoid the high costs of defending their technological infrastructure in court. Ultimately, while the Act champions the rights of the individual against the machine, it places the burden of proof and the cost of caution squarely on the shoulders of local government.


