Using artificial intelligence to reduce bias in recruitment

The results are in and clear – diverse organisations do better. Using audited AI in the process can help avoid workplace bias.

Money is not everything in life. Past a certain threshold, money will not make people happier. Below that threshold, money can make a huge difference. Too often, societal groups defined by gender, ethnicity, race, sexuality or neurodiversity are overrepresented among those below that threshold. In the search for and allocation of good-quality jobs, we are failing to be representative.

The question for recruiters is why this failure is taking place. Recruitment measures are outdated, the understanding of psychology behind them is rudimentary, and processes are led by a 2D review of resumes, often screened by artificial intelligence with no ability to contextualise.

A 3D approach to job applicants takes soft skills into account. These range from decision making to numerical agility, from risk tolerance to generosity. Not only do soft skills signal inherent potential; they are also equally distributed across groups and therefore less prone to bias. Around 92% of hiring managers weight soft skills at least as highly as resume qualifications, yet 75% of companies have difficulty assessing the soft skills of graduate applicants. Human beings are modular. There is no one-size-fits-all approach.

Human beings are modular. There is no one-size-fits-all approach.
Frida Polli, Founder and CEO, pymetrics

pymetrics provides three ways of overcoming the soft-skills gap. Scientific innovation provides a better basis for measuring soft skills; tests can be administered with the support of artificial intelligence; and the methodology is fully audited and transparent to remove ethnic and gender bias.

There is a danger that AI replicates bias found in the wider world. This can be seen in how some hiring tools have treated resumes from women's colleges compared with co-educational ones. Some of the largest companies in the world have had to shut down AI hiring tools after finding them biased. This does not mean we should give up on AI; quite the opposite. Human-based recruitment processes are also biased, but unlike AI, humans cannot be audited, explained, or reprogrammed. Algorithms should be designed so that they can be continuously and dynamically checked for bias.
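The event summary does not describe a specific auditing mechanism, but one simple illustration of a continuous bias check is the 4/5ths (adverse impact) rule commonly used in employment screening. The sketch below is an assumption-laden example, not pymetrics' actual methodology: the group labels, data format and function names are invented for illustration.

```python
# Minimal sketch of a recurring fairness audit over a hiring model's outputs.
# The 4/5ths (adverse impact) rule is a common heuristic, used here purely
# as an illustration; group labels and data are hypothetical.
from collections import defaultdict

def selection_rates(candidates):
    """Return the share of candidates recommended by the model, per group."""
    totals, selected = defaultdict(int), defaultdict(int)
    for group, recommended in candidates:
        totals[group] += 1
        if recommended:
            selected[group] += 1
    return {g: selected[g] / totals[g] for g in totals}

def adverse_impact_check(candidates, threshold=0.8):
    """Flag any group whose selection rate falls below `threshold`
    times the highest group's rate (the 4/5ths rule)."""
    rates = selection_rates(candidates)
    best = max(rates.values())
    return {g: (rate, rate / best >= threshold) for g, rate in rates.items()}

# Example audit over (group, was_recommended) pairs from one screening cycle.
sample = [("A", True), ("A", True), ("A", False),
          ("B", True), ("B", False), ("B", False)]
for group, (rate, passes) in adverse_impact_check(sample).items():
    print(f"group {group}: selection rate {rate:.2f}, passes 4/5ths rule: {passes}")
```

Re-running a check like this after every scoring cycle is one way an algorithm can be audited continuously and dynamically rather than reviewed once and forgotten.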

The AI used by pymetrics has increased the number of job offers made to women by firms using the service by 74%, and has doubled the number of job offers made to ethnic minority candidates.

Algorithms should be designed so that they can be continuously and dynamically checked for bias.
Frida Polli, Founder and CEO, pymetrics

And why is diversity important? A 2019 study by Gartner suggests that gender-inclusive and diverse teams can outperform their non-diverse counterparts by up to 50%. The difference in employee performance between diverse and non-diverse organisations is estimated at 12%.

Transparent and audited AI is one of the ways to achieve that diversity.

Summary based on Swiss Re Institute event: Algorithms for hope.

Algorithms for hope

Case studies with a positive global impact as we move from human to augmented intelligence. Swiss Re Institute in partnership with Gottlieb Duttweiler Institute and IBM Research.
