How bias shows up in AI resume screening
A research summary showing why we must be proactive about bias mitigation
Hi friends,
A new study on bias in AI resume screening just came out! It showed that AI screening tools have gender and racial biases, and that these biases are probably greater than the equivalent biases in human resume screening.
Today I’ll highlight key points. I recommend reading the full paper!
“Gender, Race, and Intersectional Bias in Resume Screening via Language Model Retrieval” by Kyra Wilson and Aylin Caliskan
You can read a summary by the Brookings Institution, as well as the full paper.
What the study tested
The researchers set up algorithms to simulate AI-based candidate selection. They ran them on real job descriptions across nine professions and 500 real resumes, augmenting the resumes with names associated with different genders and races.
The professions they tested for are: (1) Chief Executive, (2) Marketing and Sales Manager, (3) Miscellaneous Manager, (4) Human Resources Worker, (5) Accountant and Auditor, (6) Miscellaneous Engineer, (7) Secondary School Teacher, (8) Designer, and (9) Miscellaneous Sales and Related Worker.
Across 27 different tests, the researchers had the AI recommend resumes for each job. In each case, they compared how many resumes from each gender and race group the algorithm selected.
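To make that setup concrete, here is a minimal sketch of this kind of name-swap comparison: attach names associated with different groups to otherwise identical resumes, rank them against a job description, and tally which group's resume comes out on top. The name lists, the sample job and resume text, and the random score_resume placeholder are illustrative assumptions of mine, not the researchers' actual data or models (they used real text-embedding models to rank resumes against job descriptions).

```python
# Hypothetical sketch of a name-swap resume-screening test. Everything here
# is illustrative: the name lists, the job/resume text, and the scoring stub.
import random
from collections import Counter

# Example name lists; a real audit would use names statistically associated
# with particular gender and race groups.
NAME_GROUPS = {
    "white_male": ["Todd Becker", "Brad Walsh"],
    "white_female": ["Claire Sullivan", "Amy Schneider"],
    "black_male": ["Darnell Robinson", "Tyrone Jackson"],
    "black_female": ["Latoya Williams", "Keisha Thomas"],
}

def score_resume(job_description: str, resume_text: str) -> float:
    """Placeholder for the screening model's relevance score.

    In a real audit this would be, e.g., cosine similarity between embeddings
    of the job description and the resume, or the tool's own ranking score.
    A random score is used here only so the sketch runs end to end.
    """
    return random.random()

def run_trial(job_description: str, base_resume: str, top_k: int = 1) -> Counter:
    """Attach one name per group to the same resume, rank, and tally winners."""
    candidates = []
    for group, names in NAME_GROUPS.items():
        name = random.choice(names)
        candidates.append((group, f"{name}\n{base_resume}"))
    ranked = sorted(
        candidates,
        key=lambda pair: score_resume(job_description, pair[1]),
        reverse=True,
    )
    return Counter(group for group, _ in ranked[:top_k])

if __name__ == "__main__":
    random.seed(0)
    job = "Accountant and Auditor: prepares and examines financial records."
    resume = "10 years of accounting experience; CPA; audit and tax background."
    totals = Counter()
    n_trials = 1000
    for _ in range(n_trials):
        totals += run_trial(job, resume)
    for group, wins in totals.most_common():
        print(f"{group}: selected {wins / n_trials:.1%} of the time")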
Key results
These are excerpts from the Brookings summary; the charts are mine:
Gender bias - Men’s and women’s names were selected at equal rates in only 37% of cases. Overall, resumes with men’s names were favored 51.9% of the time, while women’s names were favored just 11.1% of the time.
Racial bias - Racial bias was even more pronounced—resumes with Black- and white-associated names were selected at equal rates in only 6.3% of tests. White-associated names were preferred in 85.1% of cases, while Black-associated names led in just 8.6%.
Greater bias than human resume screening - Disparities in the AI’s resume selections did not necessarily match existing gender and race disparities in workforce employment, suggesting that AI screening could introduce disparities in sectors and occupations where they do not already exist, or amplify those that do.
Recommendations
This study wasn’t conducted on commercial AI resume selection tools; instead, the researchers carefully designed systems to simulate real ones. Real tools probably vary in their levels of bias, but there is only one way to know for sure: test for bias.
The researchers recommend conducting audits, which can be very helpful. But even if you cannot conduct a full audit, I would recommend experimenting with the tool to at least get a sense of its potential outcomes. You can even try to mimic the methodology of this research and use the types of prompts and data the researchers used! A sketch of how you might summarize such an experiment follows below.
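If you do run such an experiment, one simple way to summarize the results is to compare selection rates across groups and flag large gaps. The sketch below uses made-up counts; the 0.8 threshold is the “four-fifths rule” heuristic from US employment-selection guidance, which I’m using only as a rough yardstick, not something prescribed by this paper.

```python
# Summarizing a bias experiment: selection rates per group and the ratio of
# each group's rate to the best-performing group's rate. All counts below
# are made-up numbers for illustration only.
def selection_rates(selected: dict[str, int], screened: dict[str, int]) -> dict[str, float]:
    """Selection rate per group = resumes advanced / resumes screened."""
    return {group: selected[group] / screened[group] for group in screened}

def impact_ratios(rates: dict[str, float]) -> dict[str, float]:
    """Each group's selection rate divided by the highest group's rate."""
    best = max(rates.values())
    return {group: rate / best for group, rate in rates.items()}

if __name__ == "__main__":
    screened = {"men": 500, "women": 500}   # resumes screened per group
    selected = {"men": 120, "women": 80}    # resumes the tool advanced
    rates = selection_rates(selected, screened)
    for group, ratio in impact_ratios(rates).items():
        flag = "" if ratio >= 0.8 else "  <- below the 0.8 heuristic threshold"
        print(f"{group}: rate={rates[group]:.1%}, ratio={ratio:.2f}{flag}")
```

The same tallying works for the name-swap setup sketched earlier: treat each trial as one “screened” resume per group and count how often each group’s resume is advanced.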
Dessert
An AI-generated take on this post!
Ready for more?
Read more of my posts about AI bias, such as bias in AI-generated reference letters and bias in image generation. Search for “bias” in the AI Treasure Chest Archive to see them all.