AI technology is a double-edged sword in most use cases. Within recruiting, AI can help introduce efficiencies and eradicate certain time-consuming tasks. However, the software can also create new – sometimes serious – challenges to be aware of: 

  • AI needs a lot of data to be accurate 

Machine learning (the component of AI that lets algorithms improve from experience) requires a lot of data to accurately mimic human intelligence. For example, AI that's used to screen applications might need to process hundreds of thousands of resumes for a specific role to match the accuracy of a human recruiter. Its intelligence is always limited by the data available, so at first the AI tool may be less than helpful, and even potentially biased. 
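To make the point concrete, here is a toy sketch (with made-up data and a deliberately naive approach) of a keyword-based resume screener. With only a few past hires to learn from, its vocabulary is too narrow to recognize a qualified candidate; with more training examples, its profile broadens and the same candidate scores much better:

```python
# Toy illustration with hypothetical data: a naive screener that scores a
# resume by how many of its words appear in the resumes of past hires.
from collections import Counter

def train(hired_resumes):
    """Build a keyword profile from the resumes of past hires."""
    profile = Counter()
    for text in hired_resumes:
        profile.update(text.lower().split())
    return profile

def score(profile, resume):
    """Fraction of the candidate's words that match the learned profile."""
    words = resume.lower().split()
    return sum(1 for w in words if w in profile) / len(words)

small_data = ["python developer"]
large_data = ["python developer", "java engineer", "sql analyst",
              "cloud architect", "data scientist"]

candidate = "experienced java engineer with sql and cloud skills"
print(score(train(small_data), candidate))  # 0.0 - profile too narrow
print(score(train(large_data), candidate))  # 0.5 - more data, broader profile
```

A real screening model is far more sophisticated, but the dependency is the same: its judgment can only be as broad as the data it was trained on.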

  • AI can learn bias from previous data 

Companies that create AI recruitment software often claim that AI can eliminate bias from the hiring process by relying on factual information rather than the subjective, and sometimes biased, judgments found in human evaluations. However, saying AI can eliminate bias overlooks a large part of how AI works: it's trained to find patterns in previous behavior. As mentioned above, AI extracts insights from large amounts of data, then makes predictions based on its findings. This is what makes AI recruiting so powerful, but it also makes its algorithms heavily susceptible to learning past biases. 

For example, if a company has more male than female employees, an AI-powered tool can easily favor male candidates to match the company's current makeup, unless safeguards such as fairness constraints are built in to stop it. In a harder-to-detect example, say many employees graduated from the same university. This could be due to its proximity, or because of a referral program. The AI software could notice this trend and learn to favor graduates of that university or those with similar backgrounds. This pattern could end up being highly discriminatory towards non-college grads and the demographic groups that were less likely to attend that specific university. 
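The university example above can be sketched in a few lines. This is a deliberately simplified, hypothetical model: it "learns" only how often each university appears among past hires, then scores candidates by that frequency, which is enough to show how historical skew becomes a scoring bias:

```python
# Hypothetical illustration: a model that scores candidates by similarity to
# past hires reproduces any skew in the historical data. Here most past hires
# came from one university, so its graduates score higher regardless of merit.
from collections import Counter

past_hires = [
    {"university": "State U"}, {"university": "State U"},
    {"university": "State U"}, {"university": "Tech College"},
]

# "Training": learn how often each university appears among past hires.
freq = Counter(h["university"] for h in past_hires)

def similarity_score(candidate):
    """Score = share of past hires who share the candidate's university."""
    return freq[candidate["university"]] / len(past_hires)

print(similarity_score({"university": "State U"}))       # 0.75
print(similarity_score({"university": "Tech College"}))  # 0.25
print(similarity_score({"university": "Other"}))         # 0.0
```

Nothing in the data says State U graduates perform better; the model simply mistakes a historical accident for a hiring signal.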

  • AI lacks the human touch 

It goes without saying, but humans are complex. AI can screen a candidate's skills and abilities relevant to the role, but the system would struggle to analyze the aspects of a candidate's emotional intelligence that could help them succeed at the company. For example, an AI interviewing platform that analyzes facial expressions and tone of voice alongside the candidate's response can't determine exactly what a smile and a formal tone mean. Is the candidate sincere and serious? Or are they trying to be friendly while their tone makes them seem distant? Perhaps it also depends on the question asked. AI doesn't yet have the ability to fully understand the nuances of social cues, and cannot reliably map these cues to the presence or absence of specific skill sets. 

Secondly, AI cannot build a rapport with a candidate. As we're currently experiencing a candidate-driven market, companies need to be able to truly connect with top talent; failure to do so could result in high candidate drop-off. To win candidates over, recruiters need to show interest and empathy, and remember details from previous conversations. Even if AI could replicate these traits, the result would lack authenticity. 

  • AI can misinterpret human speech 

AI recruiting tools that screen, interview or evaluate applicants will use automated speech recognition (ASR) software that’s also used in voice recognition services. This software listens to the applicant’s spoken response and converts the voice data into computer-readable text data. In theory, this allows companies to rely on AI to capture a candidate’s complete response and evaluate them fairly and objectively. 

However, anyone who's used a leading voice recognition service, such as Alexa, Siri, or Google Assistant, will know that not every word is interpreted correctly. 

If the leading ASR systems can't always recognize and contextualize voice commands, how can an AI software company, with far less funding, create an algorithm that can properly analyze lengthy and often complex interview responses? Unfortunately, they can't. Even a leading AI-driven interviewing provider states that its software has a word error rate (WER) of 'less than 10%' for native American English speakers, meaning roughly 1 in every 10 words is transcribed incorrectly. 

This means that in an AI-powered interview, the software may fail to correctly transcribe close to 10% of a candidate's response, and could misinterpret up to a quarter of the response from a non-native English speaker. 
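For readers curious what a "word error rate" actually measures: it is the word-level edit distance (substitutions + insertions + deletions) between what the candidate said and what the ASR system transcribed, divided by the number of words actually spoken. A minimal sketch, using invented example sentences:

```python
# Word error rate (WER): edit distance between the reference transcript and
# the ASR output, counted in whole words, divided by reference length.
def wer(reference, hypothesis):
    ref, hyp = reference.split(), hypothesis.split()
    # Standard dynamic-programming edit distance over words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + cost) # substitution
    return d[len(ref)][len(hyp)] / len(ref)

reference  = "i have five years of experience leading remote teams"
hypothesis = "i have five ears of experience leading remote teams"
print(wer(reference, hypothesis))  # 1 error in 9 words, about 0.111
```

Note that even one wrong word ("ears" for "years") can change the meaning of an answer, which is why a near-10% WER matters in an evaluation context.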

  • It can be hard to get buy-in 

Not everyone is interested in using AI within the recruitment process – many companies are comfortable with traditional, or less intrusive, hiring methods and aren’t looking for a change. Additionally, candidates are often hesitant to complete an AI-based interview: 

  • Push-back from HR teams 

Whenever people are asked to adopt new technology in their processes, even when they're told it will make their lives easier, some push-back is inevitable. Any change requires additional training and new processes. Additionally, recruiters may be hesitant to embrace AI out of fear that their jobs will become automated and, ultimately, obsolete. 

  • Failing to win over candidates 

There's a lot of conflicting information about how candidates feel about AI in recruiting. One survey from 2016 (still frequently cited by AI-powered recruitment companies today) asked 200 candidates how they felt about AI recruitment: 58% were comfortable interacting with AI technologies to answer initial questions. However, a 2018 survey of 2,000 Americans reported that 88% would feel uncomfortable if an AI interview application were used during their candidate screening process. 

For an authentic, up-to-date response, it doesn’t take long to find direct comments on forums such as Reddit, with the vast majority providing negative feedback. Users call AI interviewing “dehumanizing”, “the worst interviewing experience”, and “a pure waste of time”. 

The full blog post was written by Vid Cruiter and published on their website.