Growing Legal Peril When Using Automation In Hiring
One of the most common complaints I hear from job seekers is the lack of transparency when they go to a site and fill out a job application. Where does that information go? To some, it can feel like a black hole. The algorithms being used to screen candidates may be disqualifying some very good talent. As a hiring manager, I experienced this firsthand: companies routinely use these tools to help streamline the selection process, sometimes to their detriment.
So it is not surprising that a company that is actually paying attention, like Amazon in the article above, found that the automated hiring tool it was attempting to build was biased against women. If a machine learns your culture to predict whether an applicant can be successful in your environment, and biases exist in that culture, the technology will only replicate those biases. Makes sense to me.
However, what intrigues me is the effect of automation on labor and employment law. For years there has been a lack of transparency in how companies use automation in their hiring practices. Who is actually developing the algorithms that are ultimately making these decisions? Are the corporations using or developing these tools paying attention to the software's susceptibility to bias? How do you fight bias in your results? Is it enough that the automation produces biased results to prove discrimination?
Labor and employment attorneys are now going to have to ask some very technical questions about software development. As the article points out, it is time to lift the hood and figure this out now.