The Problem of Bias when Using AI in the Recruiting Process
In 2022, New York City passed a law regulating the use of artificial intelligence to assess candidates for hiring or promotion. Local Law 144, which went into effect July 5, 2023, requires employers to conduct a “bias audit” to determine if the AI tool discriminates against candidates in protected categories. The audit must determine the “selection rate” for candidates based on race, color, national origin, sex, age and other criteria.
Employers must post a summary of the audit results on their websites and give candidates at least 10 days’ notice that the AI tool will be used. The notice must also include information on the type and source of the data the AI tool uses, and the characteristics and job qualifications used to assess candidates. Candidates must be given an opportunity to request an alternative evaluation process or an accommodation.
The law applies to jobs performed, at least part time, in a New York City office and to fully remote jobs associated with a New York City office. It also applies to employment agencies located in New York City.
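For context, the “selection rate” at the center of the bias audit is a simple calculation: the share of applicants in each category whom the tool selects, often compared across categories as an impact ratio. Below is a minimal sketch of that arithmetic; the group names and numbers are purely illustrative, not drawn from any actual audit.

```python
from collections import defaultdict

def selection_rates(applicants):
    """Compute the selection rate for each demographic category.

    `applicants` is a list of (category, selected) pairs, where
    `selected` is True if the AI tool advanced the candidate.
    """
    totals = defaultdict(int)
    selected = defaultdict(int)
    for category, was_selected in applicants:
        totals[category] += 1
        if was_selected:
            selected[category] += 1
    return {cat: selected[cat] / totals[cat] for cat in totals}

def impact_ratios(rates):
    """Compare each category's selection rate to the highest rate;
    values well below 1.0 suggest possible adverse impact."""
    best = max(rates.values())
    return {cat: rate / best for cat, rate in rates.items()}

# Hypothetical example: the tool selects 40% of one group, 20% of another.
data = [("group_a", True)] * 40 + [("group_a", False)] * 60 \
     + [("group_b", True)] * 20 + [("group_b", False)] * 80
rates = selection_rates(data)
print(rates)                 # {'group_a': 0.4, 'group_b': 0.2}
print(impact_ratios(rates))  # {'group_a': 1.0, 'group_b': 0.5}
```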
Why AI Can Be Biased
Illinois and Maryland have also passed laws regulating the use of AI tools in recruitment, and the Equal Employment Opportunity Commission has issued guidance on their use. Other jurisdictions are likely to follow. These regulations recognize that AI tools can make biased decisions.
Machine learning bias occurs when a system makes discriminatory predictions, either because the developer’s prejudices are reflected in the algorithm or because inaccurate, incomplete or discriminatory data was used to train it. When machine learning is used in recruitment, that bias can affect how the tool rates a candidate’s potential performance.
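To make the training-data point concrete, here is a hedged toy sketch using synthetic data, not any vendor’s actual model: a simple classifier trained on historical hiring decisions that favored one group learns to reproduce that preference, even for candidates with identical skill.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000

# Hypothetical historical data: 'skill' genuinely predicts performance,
# but past hiring decisions also favored group 0 over group 1.
skill = rng.normal(size=n)
group = rng.integers(0, 2, size=n)  # protected attribute (0 or 1)
hired = (skill + 1.5 * (group == 0) + rng.normal(scale=0.5, size=n)) > 1.0

# The model is trained on the biased outcomes, with group as a feature.
X = np.column_stack([skill, group])
model = LogisticRegression().fit(X, hired)

# Two candidates with identical skill but different group membership
# receive different predicted "hire" probabilities.
same_skill = np.array([[1.0, 0], [1.0, 1]])
print(model.predict_proba(same_skill)[:, 1])  # group 0 scores higher
```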
Bias can also be created by confidence barriers associated with the candidate’s cultural background. Some people answer certain kinds of questions more confidently than others simply because the question carries cultural assumptions they may not share. That difference changes the way the AI recruiting tool rates the individual, and the tool may eliminate perfectly capable candidates for reasons of cultural, social or even political bias.
More of What You Have
Language barriers can affect the AI recruiting tool’s predictions as well. Someone who speaks English as a second language may need more time to read and comprehend the question than a native English speaker. The AI recruiting tool may give that individual a lower score based on response time.
Some AI recruiting tools convert a candidate’s speech to text and compare it to the speech of the organization’s existing employees. A candidate who uses similar language and phrasing is rated higher than one who doesn’t. In essence, the tool is looking for candidates who sound like the existing staff, but that similarity doesn’t necessarily reflect how well the person will perform the job.
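As a hedged illustration of that kind of scoring (vendors’ actual methods vary and aren’t public; the transcripts below are hypothetical), a TF-IDF and cosine-similarity sketch shows how a candidate who echoes the team’s phrasing outranks one who doesn’t, regardless of ability.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical transcripts of existing employees' interview answers.
employee_answers = [
    "We ship an MVP early and gather stakeholder feedback every sprint.",
    "I work the sprint backlog and sync stakeholder feedback with the team.",
]

# Two candidates answering the same question in different styles.
candidate_answers = [
    "I ship an MVP early and gather stakeholder feedback each sprint.",
    "I prefer to research the problem thoroughly before committing to a design.",
]

# Score each candidate by average cosine similarity to current staff:
# the candidate who echoes the team's phrasing scores higher, which says
# nothing about how well either person would actually do the job.
vectorizer = TfidfVectorizer().fit(employee_answers + candidate_answers)
emp_vecs = vectorizer.transform(employee_answers)
cand_vecs = vectorizer.transform(candidate_answers)
scores = cosine_similarity(cand_vecs, emp_vecs).mean(axis=1)
print(dict(zip(["candidate_1", "candidate_2"], scores.round(2))))
```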
The reality is that AI recruiting tools seek candidates who resemble existing staff. This is the antithesis of diversity: the AI recruiting tool looks for more of what you already have.
The DeSeMa Difference
DeSeMa does not use AI recruiting tools in our talent appraisal process. Our expert hiring managers review the results of our comprehensive skills and personality testing. We set benchmarks that enable us to evaluate the candidate in the context of the position and the organization as a whole.
Our experienced architects also conduct deep technical interviews and half-day working sessions with each candidate. Interviews with our team-building coaches and mock working sessions with other consultants help us gauge how the candidate performs in a real-world environment.
Governments are regulating the use of AI in recruitment because AI can be biased, creating challenges and risks for organizations that use these tools. DeSeMa’s talent appraisal process identifies the candidate who will best meet your needs and objectives rather than someone who is similar to those already on your team.