How to Use AI as a Force for Diversity

Why embrace diversity? As it turns out, diversity is good for business: the 20 most diverse companies in the S&P 500 achieve higher long-term profitability than their less diverse counterparts.

According to Vildan Stidham, Divisional VP of Global Talent Acquisition at Abbott, “[Diversity and Inclusion] can bring innovation, creative thinking, and different perspectives that are essential in our growing business.” By 2025, advancing gender equality in the workplace could add $12 trillion to global GDP.

Despite the benefits of a diverse workforce, 48% of businesses either aren’t on track to meet their diversity goals or have no goals at all. Even companies enthusiastic about the subject can find it difficult to overcome bias in recruiting.

On average, human recruiters spend just seven seconds reviewing an individual resume. At that speed, biases such as the similarity effect (ranking candidates who share one’s demographics or interests more highly) or the contrast effect (exaggerating the qualifications of whoever follows a weak candidate, or vice versa) are sure to influence their decisions.

Of course, not all recruiting is done by humans anymore. The vast majority of companies, large and small, rely on software to filter applications, but bias exists in recruiting software as well. Search results that rely on exact keywords reflect a candidate’s ability to write a keyword-laden resume, not their actual qualifications. Allowing synonyms into the search results reduces some of that bias, but it is far from a perfect solution: 81% of HR professionals rate their current practices as average or worse.
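To make the keyword problem concrete, here is a minimal sketch of synonym-expanded screening. The synonym table and resume text are illustrative assumptions, not any vendor’s actual implementation; real applicant-tracking systems work on parsed, unstructured documents.

```python
# Hypothetical synonym table: each required term maps to accepted variants.
SYNONYMS = {
    "software engineer": {"software developer", "programmer"},
    "managed": {"led", "supervised", "directed"},
}

def expand(term: str) -> set:
    """Return the term plus any known synonyms."""
    return {term} | SYNONYMS.get(term, set())

def matches(resume_text: str, required_terms: list) -> bool:
    """True if every required term (or a synonym of it) appears in the resume."""
    text = resume_text.lower()
    return all(
        any(variant in text for variant in expand(term))
        for term in required_terms
    )

resume = "Led a team of five programmers building data pipelines."
print(matches(resume, ["software engineer", "managed"]))  # True via synonyms
print(matches(resume, ["kubernetes"]))                    # False: exact gap remains
```

Note how the second check still fails: synonym expansion only helps when the table anticipates the candidate’s wording, which is why it reduces, but does not eliminate, keyword bias.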

Some companies take recruiting software further and incorporate artificial intelligence into their process. Let Amazon serve as a cautionary tale: in 2018, the company scrapped its recruiting AI because it had taught itself to prefer male applicants and to penalize resumes that included the word “women” or the names of all-women’s colleges. AI is trained by processing data and identifying patterns, so if the humans using the system show bias in whom they select, the AI incorporates that bias into its algorithm.

How, then, can AI be used to hire without bias? Resumes fed into the system should be stripped of bias-contributing factors like age, gender, and names, leaving factors like interests, traits, and skills for consideration. The world doesn’t change until the people inhabiting it do, and that starts with tackling bias in recruiting.
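The stripping step described above can be sketched as a simple redaction pass. This is a minimal illustration assuming resumes arrive as structured records with hypothetical field names; production systems must also scrub demographic signals embedded in free text.

```python
# Hypothetical field names for bias-contributing attributes to remove
# before a screening model ever sees the record.
BIASING_FIELDS = {"name", "age", "gender", "photo", "date_of_birth"}

def redact_resume(resume: dict) -> dict:
    """Drop demographic fields, keeping skills, traits, and interests."""
    return {k: v for k, v in resume.items() if k not in BIASING_FIELDS}

candidate = {
    "name": "Jordan Smith",
    "age": 29,
    "gender": "F",
    "skills": ["Python", "SQL"],
    "interests": ["open source", "mentoring"],
}

print(redact_resume(candidate))
# {'skills': ['Python', 'SQL'], 'interests': ['open source', 'mentoring']}
```

The design choice here is an explicit blocklist of fields rather than an allowlist; either works, but an allowlist of approved fields is safer when upstream data sources add new attributes unpredictably.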
