The County Lens: Do Counties Have a Stake in Artificial Intelligence Regulation?

5/13/2026 Joe McCoy

Artificial intelligence is rapidly becoming a prevalent tool in both the private and public sectors. As with many other technological innovations that have transformed society, the technology is outpacing the regulatory environment, leaving federal and state lawmakers scrambling to catch up. County officials would be wise to pay close attention to the proposed policies.

Illinois lawmakers continue to explore the regulation of artificial intelligence (AI) technologies with the introduction of several Senate floor amendments that would establish new consumer protections, transparency standards, data privacy requirements, and educational safeguards related to AI systems operating in the state. The proposals come as states across the country increasingly move to regulate AI technologies amid growing concerns over consumer protection, public safety, misinformation, and the impact of AI systems on children and vulnerable individuals.

The measures are part of a broader artificial intelligence package recently unveiled by members of the Illinois Senate Democratic Caucus. According to Senate Democrats, the legislative package is intended to establish safeguards addressing mental health concerns, identity security, price gouging protections, transparency requirements, and other emerging AI-related issues.

Supporters of additional oversight point to several high-profile incidents involving AI chatbots allegedly encouraging self-harm or dangerous behavior among teenagers. Representative Daniel Didech recently stated during a hearing on AI regulation that “the self-regulation of chatbot development has failed,” arguing that government oversight is necessary to address the risks associated with rapidly evolving AI technologies.

Senate Floor Amendment 1 to SB 315 (Senator Edly-Allen) would create the “Artificial Intelligence Safety Measures Act” and establish a regulatory framework for large frontier AI developers. The proposal would require covered developers to create and annually update AI safety frameworks addressing catastrophic-risk assessments, cybersecurity protections, governance practices, third-party evaluations, and internal-use risks. The amendment would also require transparency reports before deployment of new or substantially modified frontier AI models and mandate annual independent third-party audits. In addition, the legislation would establish reporting requirements for critical safety incidents and authorize the Illinois Emergency Management Agency and Office of Homeland Security, in consultation with the Illinois Attorney General, to administer reporting systems and issue implementing rules. The proposal also includes whistleblower protections for employees who disclose alleged violations and creates civil penalties for noncompliance. Certain records related to the Act would be exempt from disclosure under the Freedom of Information Act. The legislation would take effect January 1, 2027.

The proposal mirrors similar regulatory efforts already underway in states such as California and New York, which supporters say are attempting to establish baseline safety standards for advanced AI systems. Representative Didech has indicated that the legislation is intended to require developers to adopt safety plans designed to mitigate catastrophic risks, including misuse of AI systems, cybersecurity threats, and chatbot behavior that could encourage harmful conduct.

Senator Edly-Allen described the proposal as an effort to balance innovation with public safety concerns. “Artificial intelligence can be a powerful tool for good, but currently there are minimal guardrails in place. It's like the ‘Wild, Wild West.’ Illinois needs to create a roadmap for responsible innovation to prevent catastrophic risks,” Edly-Allen said. She further noted that the legislation is intended to require transparency, third-party audits, and whistleblower protections for employees reporting potential violations.

Senate Floor Amendment 1 to SB 316 (Senator Ellman) would create the “Artificial Intelligence Companion Model Safety Act.” The proposal focuses on AI companion systems and would require operators to implement protocols capable of detecting suicidal ideation or expressions of self-harm by users interacting with AI companion models. Operators would also be required to clearly disclose to users, in the language being used during the interaction, that the user is communicating with an automated system rather than a human. Additional protections would apply to minors, including requirements to implement reasonable measures to prevent AI companion systems from generating sexually explicit content for minors or encouraging sexually explicit conduct. Violations of the Act would constitute unlawful practices under the Consumer Fraud and Deceptive Business Practices Act. The amendment also makes conforming changes to the State Finance Act and Consumer Fraud and Deceptive Business Practices Act. The proposal would take effect January 1, 2027.

The amendment aligns with broader legislative efforts aimed at protecting children and vulnerable users from harmful AI interactions. Lawmakers have increasingly focused on concerns that AI chatbots may reinforce dangerous thoughts or behaviors because conversational AI systems are often designed to validate and continue user interactions. Representative Jennifer Gong-Gershowitz recently noted that, unlike trained mental health professionals, AI chatbots may fail to appropriately respond to warning signs involving suicidal ideation or psychosis.

Senator Ellman stated that AI technologies should not further isolate individuals experiencing emotional distress. “We are seeing more and more people, especially kids and teens, turn to AI companions for emotional support and conversation,” Ellman said. “Technology should support people, not isolate them further during moments of vulnerability.” She added that Illinois should take a leadership role in establishing “morally responsible guardrails” for AI systems.

Senate Floor Amendment 1 to SB 317 (Senator Ventura) would create the “Consumer Artificial Intelligence Notice Act.” The legislation would require businesses using conversational AI systems in chat interfaces to provide consumers with a clear and conspicuous disclosure at the beginning of an interaction that the consumer is communicating with an automated system and not a human being. Violations would be treated as unlawful practices under the Consumer Fraud and Deceptive Business Practices Act. The amendment would also expressly preempt home rule authority related to the subject matter of the Act and make conforming statutory changes.

The disclosure requirements contained in SB 317 reflect growing concern among lawmakers regarding the increasingly sophisticated nature of conversational AI systems and the difficulty consumers may face distinguishing between AI systems and human representatives. Senator Ventura has stated that AI technology has become advanced enough that consumers should receive clear notice when interacting with automated systems, adding that consumers “deserve to know whether they are communicating with a human or an AI system from the very beginning of an interaction.”

Senate Floor Amendment 1 to SB 318 (Senator Stadelman) would create the “Prohibition on Bots Purchasing Tickets Act.” The proposal would prohibit the use of bots or other methods to purchase tickets in excess of posted limits for online ticket sales, circumvent electronic queues or waiting periods, or use multiple accounts or internet protocol addresses to evade ticket purchasing restrictions. The legislation would also prohibit ticket resale marketplaces and resellers from falsely representing that they are affiliated with or endorsed by an artist, sports team, venue, or event organizer. In addition, venue operators or ticket issuers would be required to disclose the number of tickets withheld from public sale when tickets are offered for sale to consumers. Violations of the Act would constitute unlawful practices under the Consumer Fraud and Deceptive Business Practices Act, with enforcement authority granted to the Illinois Attorney General.

Senator Stadelman stated that the proposal is intended to address concerns that automated ticket purchasing systems unfairly drive up prices for consumers. According to Stadelman, the legislation seeks to promote transparency and create “a more honest and fair system that keeps consumers informed before making an online ticket purchase.”

Senate Floor Amendment 1 to SB 340 (Senator Murphy) would create the “Illinois Consumer Data Privacy Act.” The proposal would apply to certain businesses conducting operations in Illinois that collect or process significant amounts of consumer personal data. The legislation would establish numerous consumer privacy rights, including the right to access, correct, delete, and opt out of the processing or sale of personal data. The amendment would also regulate data controllers and processors, establish enforcement authority for the Attorney General and State’s Attorneys, and limit the concurrent exercise of home rule powers. Violations would constitute unlawful practices under the Consumer Fraud and Deceptive Business Practices Act. The proposal would take effect January 1, 2027.

Senator Murphy stated that Illinois “needs strong guardrails in place to protect consumers’ personal data from being collected by companies,” adding that the legislation is intended to empower consumers by allowing them to opt out of the sale and use of their data.

Senate Floor Amendment 1 to SB 415 (Senator Villa) would amend the School Code to restrict school districts from purchasing or acquiring biometric systems for use on students except for legitimate instructional purposes, as determined by the school district. The proposal would also require school districts, by the 2027-2028 school year, to ensure any biometric systems used on students are limited to legitimate instructional purposes. The amendment contains exceptions for systems in which biometric capabilities are disabled or where a district has no reasonable knowledge that purchased software contains biometric capabilities.

The proposal aligns with broader legislative discussions concerning student privacy and the use of biometric information in educational settings. Senator Villa stated that “no child should be subjected to invasive surveillance or risk having their sensitive personal data collected and misused,” emphasizing the importance of protecting student privacy and preventing discrimination involving biometric data.

Senate Floor Amendment 1 to SB 416 (Senator Martwick) would create the “Student Educational Technology Rights Act.” Under the proposal, beginning with the 2026-2027 school year, each school board would be required to adopt a policy prohibiting teachers from using artificial intelligence tools to assign numerical scores or grades for tasks requiring professional judgment. The legislation would also require school board approval for any artificial intelligence model used in relation to students or student work.

Senator Martwick stated that students “deserve evaluations that reflect human judgement, context and understanding – not automated systems.” According to Martwick, the proposal is intended to protect student privacy, maintain academic integrity, and ensure technology is used responsibly in educational settings.

Collectively, the measures represent a significant effort by Illinois policymakers to establish guardrails for emerging AI technologies while broader federal regulation remains under discussion in Congress. ISACo will continue monitoring these measures and provide updates as they move through the legislative process.

Potential County Impacts

While many of the proposals in the Senate Democratic AI package are primarily directed toward private-sector technology companies and consumer protection, several measures could have direct or indirect implications for counties, school districts, and other units of local government.

The most direct governmental impacts appear in legislation involving statewide oversight, home rule limitations, educational technology requirements, and consumer data privacy standards. Senate Bill 315 would grant administrative and rulemaking authority to the Illinois Emergency Management Agency and Office of Homeland Security, while also involving the Illinois Attorney General in enforcement and oversight activities related to AI safety reporting and incident response. Although the legislation does not directly impose operational mandates on counties, it could influence local emergency management operations, cybersecurity planning, public safety coordination, and future procurement standards for AI-related software and technology vendors utilized by county governments.

Several proposals also contain provisions limiting local regulatory authority. Both SB 317, involving AI chatbot disclosure requirements, and SB 340, involving consumer data privacy protections, include limitations on the concurrent exercise of home rule powers. These provisions could restrict the ability of home rule counties and municipalities, including Cook County, to establish separate or conflicting local regulations in these policy areas.

The package could also affect local governments as counties and municipalities increasingly utilize AI-enabled customer service systems, automated communication tools, and data-processing technologies. Local governments deploying conversational AI systems for public information, permitting, tax administration, or constituent services may eventually need to evaluate whether state disclosure requirements or data privacy obligations apply to those systems.

Additionally, several measures could affect local education governance and school district operations. SB 415 would establish statewide standards governing the use of biometric systems involving students, while SB 416 would require school boards to adopt policies regulating the use of artificial intelligence in grading and classroom settings. These proposals could create future policy, compliance, and technology review obligations for local school districts.

More broadly, the overall legislative package reflects a growing state-level effort to establish guardrails governing artificial intelligence, automated systems, consumer data, and digital technologies. As these standards evolve, counties and other local governments may experience indirect operational, procurement, legal, and compliance impacts involving technology vendors, public-facing communication systems, cybersecurity practices, and data management policies.

ISACo will continue monitoring these proposed policies to determine what impact they may have on county operations and local government authority.