Study shows how AI exacerbates recruitment bias against women


A new study from the University of Melbourne has demonstrated how hiring algorithms can amplify human gender biases against women.

Researchers from the University of Melbourne gave 40 recruiters real-life resumés for jobs at UniBank, which funded the study. The resumés were for roles as a data analyst, finance officer, and recruitment officer, which Australian Bureau of Statistics data shows are respectively male-dominated, gender-balanced, and female-dominated positions.

Half of the recruitment panel was given resumés showing each candidate's actual name, and thus their gender. The other half was given the exact same resumés, but with traditionally female and male names interchanged; for instance, "Mark" might become "Sarah" and "Rachel" might become "John."

The panelists were then instructed to rank each candidate and collectively pick the top and bottom three resumés for each role. The researchers then reviewed their decisions.
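The study's manipulation, swapping gendered first names between otherwise identical resumés, can be sketched in a few lines of Python. The name list and helper below are illustrative assumptions for clarity, not the researchers' actual materials or tooling:

```python
import re

# Illustrative name pairs (assumed, drawn from the article's examples):
# each traditionally male name maps to a female one and vice versa.
SWAPS = {"Mark": "Sarah", "Sarah": "Mark", "Rachel": "John", "John": "Rachel"}

def swap_names(resume_text: str) -> str:
    """Return a copy of the resumé text with gendered first names exchanged."""
    pattern = re.compile(r"\b(" + "|".join(SWAPS) + r")\b")
    return pattern.sub(lambda m: SWAPS[m.group(1)], resume_text)

original = "Mark Smith, Data Analyst. Referee: Rachel Lee."
print(swap_names(original))
```

Applying the function twice returns the original text, which is what makes the design counterbalanced: every resumé appears once under a male name and once under a female name, so any ranking gap can only come from the name itself.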


They found that the recruiters consistently preferred resumés from the apparently male candidates, even though these had the same qualifications and experience as the women's. Both male and female panelists were more likely to rank men's resumés higher.
