Breaking down bias: How recruitment tech is making hiring more fair
In the weeks leading up to the U.S. presidential election, race and gender bias are on nearly everyone’s minds. Women still don’t earn as much as men. Plus-sized people are still treated badly at work. The LGBT community still faces discrimination. Bias still occurs in America, often in subtle ways, and employee recruitment is one of its most apparent forms.
Making workplaces better by reducing bias in recruitment
Technology is paving the way to reduce bias in hiring, and those gains can filter down to the way organizations are run so that all working people can experience true equality. Better recruitment practices, in other words, can help eliminate bias in the modern workplace.
In September, HR Dive covered several stories about bias in recruitment and how it’s usually unintentional. Companies are investing in technologies, like artificial intelligence-enabled applicant tracking systems and machine learning, that reduce the human element in hiring. It’s ironic that the very human process of recruitment needs a system of non-human checks to reduce bias. When used right, however, technology can not only help reduce unconscious bias but can also help people discover their own biases so they can be mindful of the problem.
Recently, at New York’s Advertising Week, media agency MEC Global launched an initiative challenging multiple industries to celebrate human differences and increase the visibility of diversity. Using Harvard University’s Implicit Association Tests, the agency had counselors on hand to facilitate its Brave your Bias Industry Challenge, highlighting the often unconscious bias that all people carry to some degree.
Recruitment is an intimate experience for both recruiters and candidates, and it can be difficult to leave personal values, thoughts, and attitudes at the door. Even though recruiters and HR professionals are held to higher standards and must adhere to employee equality laws, tiny influences can be present on a subconscious level. Science calls this implicit bias, and it stems from deep associations learned during childhood. A study published in the Journal of Neuroscience found that “the brain’s neurons designed to respond to sex, race, and emotion are linked by stereotypes, which distort the way people see others.” Much of this goes on without our conscious brain ever noticing.
What can recruiters do to reduce bias in recruitment processes?
“Recruitment systems are outdated,” says Rob Biederman, founder and CEO at Catalant, a software-as-a-service company that connects industry experts with Fortune 1000 companies. “Often the best looking, most visible candidates get noticed for career opportunities, when there are many others who have track records of outstanding performance that don’t fit pre-conceived molds.”
Catalant’s approach is a candidate-matching system that focuses on what matters most: true candidate merit. Biederman also advises that recruiters not neglect freelance contractors, a large sector of the workforce that has much to offer but may be ignored by traditional 9-to-5 companies.
Recruiters can reduce bias in hiring when they stop basing decisions on their feelings and instead base them on the work-related value each candidate brings to the table. In the information technology market, for example, the work typically attracts a large number of introverts. An unprepared recruiter may completely dismiss a candidate’s successful track record merely because he or she doesn’t make good eye contact or laugh at the recruiter’s jokes during the interview.
An example of this type of work succeeding is Microsoft's program for interviewing and recruiting candidates with autism. A combination of technology and empathetic recruiting processes that spread out the typically demanding interview schedule to accommodate candidates' needs led to great success in recruiting talented new employees.
Technology can help remove much of the bias in recruitment, too, as previously mentioned. Recruitment systems use computer algorithms to screen candidates and match them to certain job types. If recruiters use them properly, they can end up with a more accurate, less biased list of candidates to bring in for interviews.
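As a toy illustration of the kind of merit-focused screening described above (a hypothetical sketch, not any vendor's actual algorithm), a matcher might rank candidates solely on the overlap between their listed skills and the job's requirements, with no demographic fields for bias to act on:

```python
# Hypothetical merit-based screening sketch: candidates are scored only
# on how many required skills they have. The candidate records carry no
# name, age, gender, or photo, so the ranking cannot act on them.

def score_candidate(candidate_skills, required_skills):
    """Return the fraction of required skills the candidate has (0.0 to 1.0)."""
    required = {s.lower() for s in required_skills}
    if not required:
        return 0.0
    matched = required & {s.lower() for s in candidate_skills}
    return len(matched) / len(required)

def shortlist(candidates, required_skills, threshold=0.5):
    """Rank candidates by skill-match score, dropping those below the threshold."""
    scored = [
        (c["id"], score_candidate(c["skills"], required_skills))
        for c in candidates
    ]
    return sorted(
        [(cid, score) for cid, score in scored if score >= threshold],
        key=lambda pair: pair[1],
        reverse=True,
    )

# Example run with made-up candidate data.
candidates = [
    {"id": "A", "skills": ["Python", "SQL", "ETL"]},
    {"id": "B", "skills": ["Java", "SQL"]},
    {"id": "C", "skills": ["Python", "SQL", "ETL", "Spark"]},
]
print(shortlist(candidates, ["Python", "SQL", "ETL"]))
# → [('A', 1.0), ('C', 1.0)]  (B scores 1/3 and falls below the threshold)
```

Real applicant tracking systems are far more sophisticated, but the design point is the same: the inputs to the ranking are limited to job-relevant attributes, which is what removes room for an interviewer's gut feel to creep in.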
Bias will likely never go away completely. But with the proper tools, unintentional harmful bias can be reduced dramatically, opening up more opportunities for those previously left out.
Follow Tess Taylor on Twitter