While the pandemic caused a massive spike in the unemployment rate, it has recently fallen below 6 percent again. A low unemployment rate may be good news for job hunters, but it’s always a struggle for employers and recruiters: the fewer people looking for work, the fewer top-tier candidates you can expect to find.
On top of that, many employees go into a position expecting to change jobs sooner rather than later. That normal churn puts even more pressure on HR departments to find more and better candidates. To help combat these challenges, some businesses use AI-driven recruiting software to streamline the process.
Unfortunately, that software may end up creating legal trouble for you down the road. Keep reading to find out how.
AI-Driven Bias
One of the early selling points for automated and AI-driven software was the promise that machines don’t express bias. With study after study showing that hiring managers are more likely to hire people who look and sound like themselves, that promise sounds fantastic.
The problem is that software development is a field dominated by white men, which means the people writing the code can end up programming their own biases into it.
That bias gets buried inside algorithms spread across countless lines of code, which makes the resulting discrimination hard to spot without careful analysis of the outcomes.
Unintentional Discrimination
In some cases, the discrimination can prove subtle: you might see different pass-through rates for candidates with different group characteristics. In other cases, the problem arises from settings that are well-meaning but discriminate unintentionally.
Let’s say you want local candidates, so you set a maximum distance in the software between candidates’ homes and the office. What if you set that radius too narrowly?
You could exclude low-income neighborhoods or areas with large immigrant populations. With that one setting, you can actively discriminate with zero malice.
The software itself just excludes based on the inputs it receives, since it can’t contextualize the behavior.
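To make that concrete, here’s a minimal Python sketch with entirely made-up data. The group labels, distances, and cutoff are hypothetical; the point is that a neutral-looking distance filter can produce pass-through rates lopsided enough to fail the EEOC’s four-fifths rule of thumb for adverse impact.

```python
# Minimal sketch with hypothetical data: a neutral-looking distance filter
# can produce very different pass-through rates across groups.

MAX_DISTANCE_MILES = 10  # hypothetical cutoff chosen by a recruiter

# Made-up candidate pool: (group label, distance from office in miles)
candidates = [
    ("group_a", 4), ("group_a", 7), ("group_a", 9), ("group_a", 12),
    ("group_b", 11), ("group_b", 8), ("group_b", 14), ("group_b", 16),
]

def pass_through_rate(group):
    """Fraction of a group's candidates who survive the distance filter."""
    pool = [dist for g, dist in candidates if g == group]
    passed = [dist for dist in pool if dist <= MAX_DISTANCE_MILES]
    return len(passed) / len(pool)

rate_a = pass_through_rate("group_a")
rate_b = pass_through_rate("group_b")

# Four-fifths rule of thumb: if one group's selection rate falls below 80%
# of the highest group's rate, regulators often treat that as evidence of
# adverse impact.
ratio = min(rate_a, rate_b) / max(rate_a, rate_b)
print(f"group_a: {rate_a:.0%}, group_b: {rate_b:.0%}, ratio: {ratio:.2f}")
if ratio < 0.8:
    print("Potential adverse impact: revisit the distance setting.")
```

With this sample data, group_a passes at 75 percent and group_b at 25 percent, a ratio of 0.33, far below the 0.8 threshold, even though nobody configured the filter with any group in mind.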
You’re Probably Still Breaking the Law
Let’s say you use biased software that discriminates through subtle coding issues or careless settings. There’s a good chance you’ve still violated anti-discrimination laws, because such claims generally don’t require intent: a neutral practice with a discriminatory effect can still run afoul of the law. While the court process is uncertain, it’s not in your best interest to become the face of discriminatory software.
If you’re considering AI-driven recruiting software, ask for data that backs up any claims that it doesn’t discriminate. You should also loop in your legal department to make sure you cover your bases.
Avoiding AI-Driven Legal Trouble
AI-driven software can streamline your recruiting process, but it can also open the door to legal headaches later.
Understand the risk of AI bias and require proof that a vendor’s software doesn’t suffer from that problem. Then add human oversight: spot-check a percentage of rejected candidates as a quality control.
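One simple way to implement that spot check is to randomly sample a fixed fraction of software-rejected applications for manual review. Here’s a minimal sketch; the candidate IDs and the 10 percent review rate are assumptions, not any vendor’s API:

```python
# Minimal sketch: flag a random 10% of software-rejected candidates
# for human review. The IDs and the review rate are hypothetical.
import random

rejected_ids = [f"cand-{i:03d}" for i in range(1, 201)]  # made-up IDs
REVIEW_FRACTION = 0.10

sample_size = max(1, round(len(rejected_ids) * REVIEW_FRACTION))
to_review = random.sample(rejected_ids, sample_size)
print(f"Flagged {len(to_review)} of {len(rejected_ids)} rejections for review")
```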
Finally, get your lawyers looped into the process early on so they can offer some legal sanity checks before you fully commit.
Qualstaff Resources specializes in recruiting for San Diego-area businesses. If you’re not sold on AI-driven software, contact Qualstaff today for your staffing needs.