Can you automate prejudice?

Big data is, well, big.  Its application is creeping into more and more of our daily lives as companies look to utilise the masses of data we generate to improve, and in many instances automate, their dealings with us.

Recruitment is one area that has certainly taken this approach on board.  Companies are increasingly making use of algorithms to sort and sift through job candidates, weeding the list down significantly before a human being even enters the fray.

A British university, however, has a salient lesson to teach us on how such an approach can go badly wrong.  Staff at St George’s Hospital Medical School wanted to improve their admissions process, so they wrote an algorithm to automate the first round.  The algorithm was built on historical patterns: data from previously rejected candidates was used to form an identikit of the people it should filter out at the first stage.

Whilst there are clear concerns about candidates being selected by computers rather than people, a more worrying problem emerged: the admissions data used was biased against both female applicants and those with non-European-sounding names.
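
To see how easily this happens, here is a minimal, purely illustrative sketch in Python (using pandas and scikit-learn).  The data, column names and model are invented for the purpose of the example and are not St George’s actual system; the point is simply that a screening model trained on past human accept/reject decisions learns whatever prejudice shaped those decisions as if it were a legitimate pattern.

    import pandas as pd
    from sklearn.linear_model import LogisticRegression

    # Hypothetical historical decisions: whatever prejudice shaped the
    # "rejected" labels is exactly what the model is trained to reproduce.
    history = pd.DataFrame({
        "grade_score":    [78, 85, 62, 90, 71, 66],
        "surname_origin": ["european", "non_european", "non_european",
                           "european", "non_european", "european"],
        "rejected":       [0, 1, 1, 0, 1, 0],   # past human decisions
    })

    X = pd.get_dummies(history[["grade_score", "surname_origin"]])
    model = LogisticRegression().fit(X, history["rejected"])

    # New applicants are then filtered before any human reads their forms.
    applicants = pd.DataFrame({
        "grade_score":    [88, 84],
        "surname_origin": ["non_european", "european"],
    })
    X_new = pd.get_dummies(applicants).reindex(columns=X.columns, fill_value=0)
    applicants["auto_reject"] = model.predict(X_new)
    print(applicants)

Nothing in the code mentions discrimination, yet because surname origin correlated with rejection in the historical data, the model can end up penalising it in new applicants too.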

It’s a cautionary tale at a time when the likes of LinkedIn are using similar algorithms to recommend candidates to recruiters based upon that recruiter’s past hiring choices.

Whilst there may be practical reasons to hand part of the recruitment process over to computers, cases like this serve as a reminder that this most human of endeavours still requires the human touch.
