We’ve said it before: hiring is hard. Not only is it hard, but it’s also time-consuming and important to get right. No pressure.

In an effort to ease the burden of getting the right person in the right seat, some harried hiring managers are turning to Artificial Intelligence, or AI, for parts of the hiring process. Sorting software sifts through all of the resumes submitted for a position and presents the most promising ones for a human to vet.

This could save quite a bit of time. Hiring managers are often faced with many resumes submitted by candidates who have zero qualifications for a position. Each resume takes time to review, even if it gets rejected almost immediately.

AI Software Can Discriminate

However, a Harvard Business School paper states that 88% of employers using an Applicant Tracking System (ATS) say that qualified candidates were rejected by the software because they didn’t match the exact criteria in the job description. The software also uses benchmarks such as a college degree or a gap in employment to further pare the list of viable candidates. In the world of hiring after the pandemic lockdowns, that means even more qualified candidates are rejected before a human ever sees their resumes.
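To make that failure mode concrete, here is a minimal, hypothetical sketch of the kind of rigid exact-match screen an ATS might apply. The criteria, field names, and candidate records are illustrative assumptions, not the logic of any particular product.

# Hypothetical sketch of a rigid, exact-match resume screen.
# All criteria, field names, and candidate records are illustrative assumptions,
# not the logic of any real Applicant Tracking System.
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    skills: set
    has_degree: bool
    employment_gap_months: int

def rigid_screen(candidates, required_skills, require_degree=True, max_gap_months=6):
    """Keep only candidates who match every criterion exactly."""
    passed = []
    for c in candidates:
        if not required_skills.issubset(c.skills):
            continue  # missing even one listed keyword -> rejected
        if require_degree and not c.has_degree:
            continue  # college degree used as a proxy benchmark
        if c.employment_gap_months > max_gap_months:
            continue  # a long gap (e.g., pandemic-era caregiving) -> rejected
        passed.append(c)
    return passed

pool = [
    Candidate("A", {"python", "sql", "etl"}, has_degree=True, employment_gap_months=0),
    # Arguably the strongest candidate, but a nine-month gap filters them out:
    Candidate("B", {"python", "sql", "etl", "airflow"}, has_degree=True, employment_gap_months=9),
    # Fully skilled, no degree -> filtered out by the proxy benchmark:
    Candidate("C", {"python", "sql", "etl"}, has_degree=False, employment_gap_months=1),
]

shortlist = rigid_screen(pool, required_skills={"python", "sql", "etl"})
print([c.name for c in shortlist])  # ['A'] -- B and C never reach a human reviewer

With a screen this literal, candidates B and C are rejected before any human judgment is applied, which is exactly the pattern the Harvard study describes.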

A report by the Brookings Institution on auditing employment algorithms for discrimination found that speech recognition models demonstrated a clear bias against African Americans and had problems with dialects and regional speech variations. Facial recognition software likewise showed disparities across skin color and can be problematic for people with disabilities.

AI Software Can Still Be Helpful

Sifting software and predictive algorithms can be used fairly. Large companies can continually vet the software they are using to make sure that viable candidates aren’t being sidelined, but smaller companies may not have that luxury.

There are steps companies can take to avoid these AI traps.

Review and change the inputs that are fed into the hiring programs and algorithms. Make sure the inputs are job-related and that they promote diversity. Review the outputs to be sure that they follow privacy and data governance rules. Create standards to follow to ensure that the algorithms are moving toward neutral, unbiased results.

Consider auditing your automated tools, either in-house or through a third party, to ensure that the algorithms are adhering to the standards you have set up (a minimal sketch of one such check follows these suggestions).

Be transparent and let applicants know that AI will be used to analyze their applications, resumes, and interviews.
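As an illustration of what such an audit check can look like, here is a minimal sketch in Python that compares selection rates across applicant groups using the widely cited "four-fifths rule" heuristic: a group selected at less than 80% of the rate of the best-performing group is flagged for review. The group labels and counts are invented for the example and are not drawn from any real data.

# A minimal, illustrative audit check: compare selection rates across groups
# using the "four-fifths rule" heuristic. Group labels and counts are made up.
from collections import Counter

def selection_rates(outcomes):
    """outcomes: iterable of (group_label, was_selected) pairs."""
    applied = Counter(group for group, _ in outcomes)
    selected = Counter(group for group, ok in outcomes if ok)
    return {group: selected[group] / applied[group] for group in applied}

def adverse_impact_flags(rates, threshold=0.8):
    """Flag groups selected at less than 80% of the best-performing group's rate."""
    best = max(rates.values())
    return {group: rate / best for group, rate in rates.items() if rate / best < threshold}

# Hypothetical screening outcomes: 100 applicants per group.
outcomes = (
    [("group_x", True)] * 30 + [("group_x", False)] * 70 +  # 30% selected
    [("group_y", True)] * 12 + [("group_y", False)] * 88    # 12% selected
)

rates = selection_rates(outcomes)
print(rates)                        # {'group_x': 0.3, 'group_y': 0.12}
print(adverse_impact_flags(rates))  # {'group_y': 0.4} -- well below the 0.8 threshold

A flag like this does not prove discrimination on its own, but it shows which of your standards the tool is drifting away from and where a deeper review should focus.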

Algorithms and AI can be helpful to hiring managers and to candidates alike if they are used responsibly and audited regularly to reduce systemic bias. Recruiting agencies that specialize in the type of talent you want to hire may also be more proficient in using software tools to screen candidates without bias. Employers will find that there is less of a labor shortage when the unfilled positions are fully open to all potential candidates.