Regulations on the Rise as Employers Experiment with AI
Workspan Daily
June 20, 2023
Key Takeaways

  • Avoiding discrimination. With the rapid rise of ChatGPT and AI in employment decisions, there is a growing regulatory focus on potential bias in AI and automated tools. 
  • On the state level. New York City recently published its final regulations implementing its first-in-the-nation ordinance that regulates the use of AI-driven hiring tools. Enforcement of the law will begin on July 5. 
  • On the federal level. The White House Office of Science and Technology Policy also recently announced that it would be releasing a public request for information to learn more about the automated tools used by employers to surveil, monitor, evaluate and manage workers. 

With more organizations using ChatGPT and artificial intelligence (AI) in employment decisions such as recruitment and hiring, there is a growing regulatory focus on potential bias in AI and automated tools. 

During the “Taming Artificial Intelligence in Employment Decision Making” panel at WorldatWork’s Total Rewards’23 Event, Victoria Lipnic, partner at Resolution Economics LLC, said this shift from Big Data to AI in employment has led to much public debate.  

“It’s amazing and frightening, but where is this all heading?” she asked the audience. 

According to Lipnic, the answer is more regulation at all levels of government. 

For example, New York City recently published its final regulations implementing its first-in-the-nation ordinance that regulates the use of AI-driven hiring tools (Local Law 144 of 2021, or NYC 144). Enforcement of the law is scheduled to begin on July 5.  

According to labor and employment law firm Littler Mendelson P.C., NYC 144 prohibits employers or employment agencies from using an automated employment decision tool (AEDT) to make an employment decision unless: the tool is annually audited for bias; the employer publishes a public summary of the audit; and the employer provides certain notices to applicants and employees who are subject to screening by the tool. 

To prepare for a bias audit, Ye Zhang, a director at Resolution Economics, said employers should know the AI/automated tool they are using and how they are using it (a brief sketch of the audit calculation itself follows this list). For instance:  

  • At what stage of the hiring process is the AI/automated tool being used? 
  • Is the tool replacing a part of the hiring process previously conducted by humans? Or is the tool adding a new component to their hiring process? 
  • Have they kept all the documentation and data the AI vendor used to set up, configure and train the algorithm? If not, can they obtain it from the vendor? 
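
At its core, the bias audit contemplated by NYC 144 comes down to arithmetic: for each demographic category, calculate the rate at which the tool selects candidates and compare it to the category with the highest rate (the impact ratio). The short Python sketch below illustrates only that calculation under assumed inputs; the category labels, sample data and function name are hypothetical, and the law itself does not prescribe a pass/fail threshold.

    # Minimal sketch of an impact-ratio calculation (illustrative only; not an
    # official NYC 144 audit). Input format and category labels are assumptions.
    from collections import defaultdict

    def impact_ratios(records):
        """records: iterable of (category, was_selected) pairs."""
        totals = defaultdict(int)      # applicants screened per category
        selected = defaultdict(int)    # applicants advanced per category
        for category, was_selected in records:
            totals[category] += 1
            if was_selected:
                selected[category] += 1

        # Selection rate = share of a category's applicants the tool advanced
        rates = {c: selected[c] / totals[c] for c in totals}
        top_rate = max(rates.values())
        if top_rate == 0:
            return {c: 0.0 for c in rates}  # nothing selected; ratios are moot

        # Impact ratio = category's selection rate / highest selection rate
        return {c: rate / top_rate for c, rate in rates.items()}

    # Hypothetical example: Category A is advanced at twice the rate of
    # Category B, so B's impact ratio comes out to 0.50.
    sample = [("Category A", True), ("Category A", True), ("Category A", False),
              ("Category B", True), ("Category B", False), ("Category B", False)]
    for category, ratio in impact_ratios(sample).items():
        print(f"{category}: impact ratio {ratio:.2f}")

Whichever tool performs the calculation, the documentation and training data described in the checklist above are what allow the audit to be reproduced and verified.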

Several other jurisdictions are considering regulation of AI-enabled and automated employment tools, according to the panelists. They include:  

  • Washington, D.C.: The Stop Discrimination by Algorithms Act, which would make it illegal to use discriminatory algorithms to make decisions about key areas of “life opportunity,” including employment, and would require companies using algorithmic processes/tools to conduct annual audits of those processes/tools.  
  • California: The California Civil Rights Council published draft modifications to its employment anti-discrimination laws, which would impose liability on companies or third-party agencies administering AI tools that have a discriminatory impact. 
  • New Jersey and Vermont: Both states are considering legislation regulating AI/automated employment tools.  

In addition, the White House Office of Science and Technology Policy (OSTP) announced in May that it would be releasing a public request for information (RFI) to learn more about the automated tools used by employers to surveil, monitor, evaluate and manage workers. And the Equal Employment Opportunity Commission (EEOC) met earlier this year to discuss the idea of requiring audits for AI tools. 

“The promise of AI is that [employers] will find gems, but the peril of using AI is the potential for discrimination,” Lipnic said. “So, if you are using an AI tool, somewhere in the employment life cycle, you will need to do an audit.”
