- Hiring managers favor candidates from higher socioeconomic backgrounds, according to the findings of Yale University researchers.
- In one of five studies on which the research was based, people with hiring experience listened to audio recordings or read transcripts of 20 candidates interviewing for an entry-level management position in a Yale lab. Recruiters who listened to the recordings were more likely than those who read transcripts to detect applicants' socioeconomic backgrounds. They also were more likely to rate candidates they perceived as upper class as the most suitable and the best fit for the job, and they assigned those candidates higher salaries and sign-on bonuses than other applicants, all without reviewing credentials or resumes.
- "Our study shows that even during the briefest interactions, a person's speech patterns shape the way people perceive them, including assessing their competence and fitness for a job," Michael Kraus, assistant professor of organizational behavior at the Yale School of Management and co-author of the research, said in a news release. "While most hiring managers would deny that a job candidate's social class matters, in reality, the socioeconomic position of an applicant or their parents is being assessed within the first seconds they speak — a circumstance that limits economic mobility and perpetuates inequality."
It's widely recognized that everyone has unconscious biases. In fact, individuals are biased about many things other than socioeconomic status and protected characteristics like race and sex; a hiring manager may hold a bias against a candidate's coffee order or shirt color. To address unconscious bias, some employers are turning to technology. One of the latest tools is driven by artificial intelligence, which developers at Penn State University and Columbia University said can detect discrimination based on race and gender in hiring, policing, pay practices, academic admissions and consumer financing.
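The article doesn't describe how the Penn State/Columbia tool works, but one long-standing statistical check that discrimination-detection software in hiring often builds on is the EEOC's "four-fifths rule": if any group's selection rate falls below 80% of the highest group's rate, the process is flagged for possible adverse impact. A minimal sketch of that check (illustrative only; the group names and numbers are hypothetical):

```python
# Sketch of the EEOC "four-fifths rule" adverse-impact check.
# Not the Penn State/Columbia tool's actual method, which the
# article does not detail; this is one classic baseline test.

def selection_rates(outcomes):
    """outcomes: dict mapping group name -> (hired, applied)."""
    return {g: hired / applied for g, (hired, applied) in outcomes.items()}

def four_fifths_flags(outcomes, threshold=0.8):
    """Flag any group whose selection rate is under 80% of the top rate."""
    rates = selection_rates(outcomes)
    top = max(rates.values())
    return {g: rate / top < threshold for g, rate in rates.items()}

# Hypothetical applicant pools: 50/100 hired vs. 30/100 hired.
outcomes = {"group_a": (50, 100), "group_b": (30, 100)}
print(four_fifths_flags(outcomes))
# {'group_a': False, 'group_b': True}  (0.30/0.50 = 0.6 < 0.8, so flagged)
```

A real tool would add statistical significance testing on top of this ratio, since small samples can trip the threshold by chance.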
But such tech isn't reserved for a far-off future; employers are beginning to adopt it now. Fast-food giant McDonald's, for example, began using Textio earlier this year to speed up the recruiting and hiring process and make it more inclusive. The partnership uses Textio's augmented writing platform to help McDonald's hiring managers write on-brand corporate language and draft gender-neutral job postings. Bias, or the perception of it, often shows up first in job descriptions. To root it out, experts say employers should avoid words or terms that signal bias against applicants based on race, sex, national origin, age or ability.
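Textio's model is proprietary, but the simplest version of the wording check such tools perform is a scan of a posting against lists of gender-coded words drawn from published research on job-advertisement language. A hedged sketch, with short illustrative word lists that are assumptions rather than any vendor's actual lexicon:

```python
# Sketch of a gender-coded wording scan for job postings.
# The word lists are tiny illustrative samples, not Textio's model.
import re

MASCULINE_CODED = {"aggressive", "dominant", "rockstar", "ninja", "competitive"}
FEMININE_CODED = {"supportive", "collaborative", "nurturing", "interpersonal"}

def flag_coded_words(posting):
    """Return the masculine- and feminine-coded words found in a posting."""
    words = set(re.findall(r"[a-z]+", posting.lower()))
    return {
        "masculine": sorted(words & MASCULINE_CODED),
        "feminine": sorted(words & FEMININE_CODED),
    }

posting = "We want an aggressive, competitive rockstar with collaborative spirit."
print(flag_coded_words(posting))
# {'masculine': ['aggressive', 'competitive', 'rockstar'], 'feminine': ['collaborative']}
```

A production system would go well beyond keyword matching, weighing phrasing in context and suggesting neutral rewrites, but the flag-and-replace loop starts from the same idea.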