The A.I./human partnership: Addressing unconscious bias and “assessment resistance”

Picture an organisation where every hire is based on a candidate’s competence and potential – on the contents of the book and not the cover – and never influenced by conscious or unconscious biases. This is the vision for leaders everywhere, but it’s a goal that we as organisations – and as human beings – all too often fall short of reaching.
Human resources teams have long focused on hiring the very best person for each job from the most diverse candidate pool possible. Pre-hire written assessments were created decades ago to incorporate scientifically backed data, such as aptitude and skills scores, into the screening process. These assessments have had a positive impact on recruiting and are still considered essential for many types of positions.

“Mini-me,” preemptive assessments, and resume blindness

Yet there are still significant hurdles for both recruiters and candidates as we work toward greater diversity and consistent selection of best-fit candidates. Candidates sometimes drop out when faced with a two-hour written assessment, often required early in the process, while they are still exploring the company and are not yet committed.
Recruiters still sift through paper resumes, sometimes at a volume and pace that makes them all begin to look and sound alike to the reviewer. When time is short and the stack of resumes is high, is it any wonder that we sometimes cull candidates based on conscious or unconscious biases such as grade point average (GPA), the prestige of the schools attended, or gaps in employment? Surprisingly, research shows that all three of those factors – GPA, employment gaps, and school attended – are poor predictors of future job success.
Another form of unconscious bias is one to which most of us have likely fallen prey at one time or another: the natural inclination of managers to hire "mini-me" candidates similar to themselves, rather than basing selection purely on qualifications or potential.
Finally, and perhaps most surprisingly, even standardised tests show some bias when pass rates are examined across groups. Clearly, a better way is needed to lessen bias as much as possible in the hiring process.

Expert technology + expert people: A powerful combination

A combination of recruiter expertise and scalable A.I. screening and decision support may help us clear these stumbling blocks and get much closer to that ideal hiring environment, in which everyone is considered on a level playing field based on their qualifications and potential to perform the job. Both people and technology are needed, with a productive division of labour: repetitive, high-speed decisions involving large volumes of data are handed to the A.I. "assistant", while decisions requiring interpersonal skills and professional judgment on a more relaxed timeline are best made by skilled hiring-team members.
One children’s hospital faced an ongoing challenge: too many applicants were applying for jobs for which they were not a good fit, and many did not have a sufficient understanding of medical terminology. The traditional process of scheduling face-to-face screening interviews and bringing candidates to the hospital was time-consuming and costly. To address this, the hiring team introduced a program in which nursing graduates could submit a general video profile, giving recruiters a better understanding of candidates as whole people and helping match them to job openings.
After hiring one candidate with impressive enthusiasm and a clear desire to help the hospital’s child patients, the hiring team discovered that she had previously applied to work at the hospital eight times and had never been contacted, much less called in for an interview. They realised that their previous hiring process was actually weeding out talent with some of the very qualities they were searching for. A resume or online application form rarely offers an opportunity to express a desire to help and serve others, but these traits came across clearly in the candidate’s video.
Dynamic interviews like those the hospital now offers, which begin the process with video submissions, can help organisations expand their interview pool and consider more diverse candidates. Much like widening the aperture of a camera, firms using this technology have been able to expose hidden talent that might otherwise have gone unnoticed in an initial applicant tracking system (ATS) scan.
In addition to letting candidates express their personalities and demonstrate their interpersonal skills on video, some companies are also employing A.I.-driven assessments that evaluate these video interviews against thousands of characteristics and factors shared by their current top performers in particular roles.

What about algorithmic bias?

Recently, a spate of headlines has introduced the idea of algorithmic bias, driven by concern that bad data science is creating algorithms that simply replicate our human biases and make them more scalable. While it is true that people can all too easily build their biases into machine learning/A.I. algorithms, a commitment to ethical development of those algorithms, backed by scientific oversight and ongoing validation, can prevent this from occurring.
For example, after an algorithm is created to provide first-screen decision support for recruiting professionals, a validation study of the interview data should be performed as a matter of course. If bias becomes apparent during that study, data scientists and industrial-organisational (IO) psychologists can work together to identify the data points that caused the discrepancy and remove them from consideration.
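To make that workflow concrete, here is a minimal sketch, not any vendor's actual method, of how a team might check pass rates and surface candidate data points for review. It assumes a hypothetical pandas DataFrame of scored interviews with made-up column names ("score" from the screening model, "group" used only for auditing, plus a list of feature columns), and it uses the common "four-fifths" rule of thumb as an adverse-impact flag; a real validation study would involve far more rigorous statistics and IO-psychologist judgment.

```python
# Minimal sketch of a post-hoc bias check on scored interview data.
# Assumes hypothetical columns: "score" (model output), "group" (audit-only),
# and a list of feature columns; none of these names come from the article.
import pandas as pd

FOUR_FIFTHS = 0.8  # common adverse-impact rule of thumb


def selection_rates(df: pd.DataFrame, threshold: float) -> pd.Series:
    """Share of each group whose score clears the screening threshold."""
    passed = df["score"] >= threshold
    return passed.groupby(df["group"]).mean()


def adverse_impact_ratio(df: pd.DataFrame, threshold: float) -> float:
    """Lowest group's selection rate divided by the highest group's."""
    rates = selection_rates(df, threshold)
    return rates.min() / rates.max()


def flag_suspect_features(df: pd.DataFrame, features: list[str], top_n: int = 5) -> list[str]:
    """Rank features by how far apart their group means sit -- a crude proxy
    for the data points a data scientist and IO psychologist would review
    and potentially remove from consideration."""
    group_means = df.groupby("group")[features].mean()
    separation = group_means.max() - group_means.min()
    return separation.sort_values(ascending=False).head(top_n).index.tolist()


# Hypothetical usage:
# df = pd.read_csv("scored_interviews.csv")
# if adverse_impact_ratio(df, threshold=0.6) < FOUR_FIFTHS:
#     print("Review these features:", flag_suspect_features(df, FEATURE_COLUMNS))
```

After suspect features are dropped, the model would be re-trained and the same check re-run, repeating until pass rates no longer show a meaningful discrepancy.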
By beginning the hiring process with this type of neutral, data-based assessment, we can substantially reduce bias in essential first-pass screening decisions. Inside top global companies today, the A.I./human partnership in HR is casting a wider net, reducing over-reliance on the resume, and making it possible to discover hidden talent that might otherwise go unnoticed.
Cover image: Shutterstock

This article was contributed by HireVue.


Nathan will be sharing more details on how machines can complement human recruiters to create an ideal hiring environment at the upcoming ATC2018. If building a world-class workforce is important to you, this is a session not to be missed. Limited tickets remain, so find out more here.
