Do People Discriminate More than Algorithms?

There have been some high-profile cases arguing that the use of tests in hiring can be discriminatory. The New York Times recently ran an interesting article on when algorithms discriminate, followed by an interview with a researcher on the subject.

In the interview, Cynthia Dwork mentions two striking examples of algorithms discriminating. In one study, women were shown an ad for career coaching for jobs paying over $200,000 far less often than men. And, hypothetically, a university that was originally segregated might discriminate against minority applicants if it used an admissions algorithm built on historical data.

When algorithms reduce bias

More often, though, algorithms are a tool for reducing discrimination. We saw this time and again in Stellar’s work in Latin America (where we launched as Farolito) before beginning our US operations.

Our Latin America operations offer a unique point of comparison because anti-discrimination laws there are more weakly enforced. It’s not unusual for job descriptions to specify the age range, marital status, and preferred gender for a position. While Stellar never rejected applicants based on such characteristics, we did work with companies that had a history of doing so.

In one case, we had a client who specified a preference for salespeople who were 18 to 23 years old, and who told us they did not hire single mothers. Turnover was very high: 52% of all their hires failed to complete 3 months on the job, which meant that nearly every position had to be filled twice in a given calendar year.
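To see why, a quick back-of-the-envelope model helps. The sketch below is our illustration, not the client’s actual data: assume a failed hire leaves right at the 3-month mark and is replaced immediately, so each seat has up to four hiring windows per year.

```python
# Back-of-the-envelope turnover model (illustrative assumption: a failed
# hire leaves exactly at the 3-month mark and is replaced immediately,
# so a seat has at most 4 hiring windows per calendar year).
P_FAIL = 0.52  # share of hires who left within 3 months

# Expected hires per seat per year: 1 + p + p^2 + p^3 (geometric series)
expected_hires = sum(P_FAIL ** k for k in range(4))
print(f"Expected hires per seat per year: {expected_hires:.2f}")  # ~1.93
```

Under those assumptions, each seat needs roughly 1.9 hires per year, which is where the “filled twice” figure comes from.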

When we began evaluating all applicants with our proprietary evaluation and filter, we found that if our client instead hired based on psychometric profile and skill set, they could raise the share of workers staying at least 3 months from 48% to 80%. Notably, the client could capture those gains simply by choosing differently from among the applicants they already received.
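Our evaluation itself is proprietary, so the following is only a hypothetical sketch of the general idea; the feature names, model choice, and synthetic data are all invented for illustration. The approach: score applicants on job-relevant features alone and rank them by predicted 3-month retention.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Past hires, described only by job-relevant features (invented names:
# conscientiousness, sales skill, numeracy). Protected attributes such
# as age, gender, and marital status are deliberately never included.
X_past = rng.normal(size=(500, 3))
stayed_3_months = (X_past @ np.array([1.2, 0.8, 0.4]) + rng.normal(size=500)) > 0

model = LogisticRegression().fit(X_past, stayed_3_months)

# Score the applicant pool the client already receives; hire from the top.
X_applicants = rng.normal(size=(50, 3))
retention_scores = model.predict_proba(X_applicants)[:, 1]
print("Highest-scoring applicants:", np.argsort(retention_scores)[::-1][:10])
```

The design choice is the point: protected attributes never enter the model, so the ranking can only reflect characteristics that actually predict staying on the job.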

Why bias occurs

Why would a company fall into an overtly discriminatory hiring process? Most likely because of two behavioral biases, “salience” and “vividness”: people wrongly attribute an outcome to a characteristic that stands out (salience) simply because the pairing is easy to remember (vividness).

In the case of our client, it was easy to remember the one single mother who was unsuccessful in the position, because having a child was a memorable characteristic that our client already suspected was correlated with failure. Because the client had few employees older than 23, it was also easy to remember the one older worker who did not stay on the job for long. Those employees were actually leaving because they were not a good fit for the job, but the client misattributed their departures to those employees’ salient characteristics.

In reality, neither age nor having a child was correlated with success on the job. By setting aside the discriminatory requirements and replacing them with a filter that evaluated the characteristics that actually mattered, the employer created job opportunities for people who had been victims of discrimination in the past.

Algorithms, when implemented properly, can create opportunities for people who would not otherwise have them.

Sara Nadel is a co-founder of StellarEmploy.