With the proliferation of AI and gamified assessments in the HR tech arena in recent years, it's easy to be dazzled by their promise to enhance candidate experience, increase process efficiency and drive improved quality of hire.
But if you are thinking it all sounds too good to be true, this may be one case where the hype does meet reality, provided you do your homework.
Here are the three questions you should be asking when choosing your next assessment provider:
First, ask about reliability and validity. An assessment is still an assessment, so the traditional rules around validity and reliability that psychologists use to evaluate the rigour behind an assessment still apply.
Whilst gamification and AI capture and use data differently, the measures still need to be reliable, meaning they are stable over time, and valid, meaning there is evidence that they measure what they claim to measure. Without this, we could be using unreliable, not particularly meaningful data to make some pretty important decisions.
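To make those two properties concrete, here is a minimal sketch of how you might sanity-check a vendor's scores yourself, assuming a hypothetical export with two scores per person taken some weeks apart plus a later performance rating; the column names and the rule-of-thumb thresholds in the comments are illustrative, not standards.

```python
# Illustrative only: the file and column names ("score_t1", "score_t2",
# "job_performance") are hypothetical placeholders.
import pandas as pd
from scipy.stats import pearsonr

scores = pd.read_csv("assessment_scores.csv")  # hypothetical vendor export

# Test-retest reliability: how stable are scores when the same people
# are assessed twice, some weeks apart?
reliability, _ = pearsonr(scores["score_t1"], scores["score_t2"])

# Criterion validity: do scores relate to an outcome we care about,
# e.g. a later job-performance rating for hired candidates?
validity, _ = pearsonr(scores["score_t1"], scores["job_performance"])

print(f"test-retest reliability r = {reliability:.2f}")  # often expected around 0.7+
print(f"criterion validity r = {validity:.2f}")          # even ~0.3 can be meaningful
```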
Second, ask what data the algorithm is trained on. All algorithms need to learn from a data set, and the data set an assessment provider trains their algorithm on is absolutely crucial. Algorithms work best when they are customised to your organisation and trained on a data set that reflects where the model will be used and the challenges you are trying to solve.
For example, if you are building a model for graduate hiring and you know you get a spike in turnover at the two-year mark, you want to build the model on data captured from your current high-performing grads who have stayed beyond that mark. This way the model captures what is unique about that population, so you select more high-performing graduates who are likely to stay.
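As a rough sketch of that idea, assuming a hypothetical table of past graduate hires with assessment-derived features plus performance and tenure fields, you might label "success" as a high performer who stayed past two years and check that a simple model trained on it generalises before it touches live candidates:

```python
# A minimal sketch; the data file, feature names and rating scale are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

grads = pd.read_csv("past_graduate_hires.csv")  # hypothetical historical data

# Label the outcome we actually want more of: high performers who stayed
# past the two-year turnover spike.
grads["success"] = (
    (grads["performance_rating"] >= 4) & (grads["tenure_months"] > 24)
).astype(int)

# Train only on assessment-derived features, not CV or demographic fields.
feature_cols = ["attention_score", "planning_score", "risk_score"]  # hypothetical
model = LogisticRegression(max_iter=1000)

# Check the model generalises before anyone relies on it.
auc = cross_val_score(model, grads[feature_cols], grads["success"],
                      cv=5, scoring="roc_auc").mean()
print(f"cross-validated AUC = {auc:.2f}")
```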
Also, keep in mind that algorithms have a shelf-life. They need to be refreshed at least annually to ensure their ongoing relevance. Does the vendor actually provide this service?
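One simple way to frame that refresh conversation with a vendor, assuming you recorded the model's validation performance at launch and can score a recent cohort, is a drift check along these lines (the figures and names are hypothetical):

```python
# Flag a deployed model for retraining if its performance on the latest
# cohort has drifted meaningfully below what was validated at launch.
from sklearn.metrics import roc_auc_score

ORIGINAL_AUC = 0.78   # hypothetical figure recorded at deployment

def needs_refresh(recent_labels, recent_scores, tolerance=0.05):
    """Return (refresh_due, current_auc) for the most recent cohort."""
    current_auc = roc_auc_score(recent_labels, recent_scores)
    return current_auc < ORIGINAL_AUC - tolerance, current_auc

# Example usage with a hypothetical recent-cohort DataFrame:
# due, auc_now = needs_refresh(cohort["success"], cohort["model_score"])
```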
Finally, be careful about building models from CV or demographic information such as age, gender, postal address, or data collated from social media profiles. These types of data are inconsistent and can introduce unwanted bias into your selection process.
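A blunt but effective safeguard, sketched below with hypothetical column names, is to strip demographic fields and obvious proxies from the feature set before any model ever sees them:

```python
# Drop demographic fields and likely proxies (e.g. postcode, school) so they
# can never feed the model, even indirectly via a CV parse.
EXCLUDED = {"age", "gender", "postcode", "name", "school", "social_handle"}

def safe_features(df):
    """Return only the columns that are not on the excluded list."""
    return df[[c for c in df.columns if c not in EXCLUDED]]
```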
Third, ask how the vendor mitigates bias. Bias in algorithms is a topic that has gained significant momentum in the media recently, with bodies such as the Australian Human Rights Commission, Singapore's Personal Data Protection Commission and the World Economic Forum all working to develop standards and guidelines for the ethical use of AI.
For example, you will want them to consider the diversity of the data set upon which they build their models, to front-end test so that models are de-biased before they are deployed, and to back-test for adverse impact.
You will also want to understand which tools the vendor uses to help eradicate bias from their algorithms. With open-source bias-detection tools such as Audit-AI freely available, they should have a clear plan for keeping their algorithms balanced and fair.
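To make "adverse impact" concrete, here is a minimal, hand-rolled sketch of one common check, the four-fifths rule, which compares each group's selection rate to the best-off group's rate; tools such as Audit-AI automate checks of this kind, and the column names below are hypothetical rather than any particular tool's API.

```python
# Flag groups whose selection rate falls below four-fifths (80%) of the
# highest group's rate, a conventional signal of adverse impact.
import pandas as pd

def four_fifths_check(outcomes: pd.DataFrame, group_col: str, passed_col: str):
    """Return the groups whose selection-rate ratio is below 0.8."""
    rates = outcomes.groupby(group_col)[passed_col].mean()
    ratios = rates / rates.max()
    return ratios[ratios < 0.8]

# Example: a candidates table with a 'gender' column and a boolean 'recommended' flag.
# flagged = four_fifths_check(candidates, "gender", "recommended")
```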
The use of gamification and AI in assessment, whilst a relatively new science, is here to stay. We look forward to its continued evolution, and see its true potential not simply in making society as it stands more efficient, but in making the world a fairer place – and that's something we can all be proud of.
Cover image: Shutterstock
This article is contributed by pymetrics.