
Newsroom

Dr. Chihmao Hsieh’s contribution to the Maeil Business Newspaper

Author: Administrator | Date: 2022.01.13 | Hits: 619

AI and job interviews: Leveling the playing field

 

Written by Chihmao HSIEH

 

In the last few years, artificial intelligence has been implemented to handle some of the filtering involved in assessing job interviews. A growing number of companies are requiring job applicants to answer questions online via video camera, after which computer programs process the video clips and audio transcripts. Assessments of facial expressions, eye movements, voice intonation, and word choice are combined to produce a set of scores for each applicant. For now, it appears that most companies using this technology use it only for the initial cut, easily filtering applicant pools from tens of thousands down to mere dozens. Some AI interview software works by having current employees answer the questions on video, and then evaluating candidates on how closely they match those employees.
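To make those mechanics concrete, here is a minimal sketch in Python of how such a first-pass filter might work, ranking applicants by how closely their feature scores resemble those of current employees and keeping only a small number. The feature set, the cosine-similarity matching, and the cutoff size are illustrative assumptions on my part, not any vendor's actual method.

from dataclasses import dataclass
from math import sqrt

@dataclass
class ApplicantFeatures:
    """Illustrative per-applicant scores, each assumed normalized to [0, 1]."""
    facial_expression: float
    eye_movement: float
    voice_intonation: float
    word_choice: float

    def as_vector(self):
        return [self.facial_expression, self.eye_movement,
                self.voice_intonation, self.word_choice]

def cosine_similarity(a, b):
    """Similarity between two feature vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def initial_cut(applicants, employee_profiles, keep=12):
    """Rank applicants by their closest match to a current employee and
    keep only the top few -- the kind of first-pass filter described above."""
    scored = []
    for name, features in applicants.items():
        best_match = max(cosine_similarity(features.as_vector(), e.as_vector())
                         for e in employee_profiles)
        scored.append((name, best_match))
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return scored[:keep]

Real systems would of course derive such features from video and audio models rather than take them as given; the point of the sketch is only the ranking-and-cutoff logic that reduces a huge applicant pool to a shortlist.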

 

The criticism has been swift and wide-ranging. Beyond the obvious objections to tying voice intonation and facial expressions to personality or expected job performance, applicants also point to the unfairness of the unilateral nature of AI interviews, in which they cannot correct an AI’s perception the way they can correct a human interviewer face-to-face. Career development staff at universities lament that AI does not know how to comprehend applicants’ value systems, or appreciate the work motivations stemming from their aspirations or passion. Researchers have found that AI systems are prone to making decisions that reflect racism or sexism, and have called for AI to be regulated to ensure transparency and accountability. AI also has a severely limited ability to recognize humor, which is valuable within creative, entrepreneurial organizations. As a result, some companies, such as Korea Airports Corp., have dropped AI interviews because managers were unsure exactly how and what the AI was evaluating. Overall, policymakers have asked whether the cost savings offered by this technology are worth discounting human value and dignity to this degree.

 

As with many technological advances, there are tradeoffs in using AI to assess job interviews. On the surface, the cost savings are obvious and significant. As communication technology improves and more work goes remote, the physical boundaries of the workplace will open up. HR managers are less confined to recruiting from a local labor market, and companies can benefit from more candidates for each of their open positions. In processing all these extra applications, AI does not suffer from fatigue or from the broadest set of human biases. As many AI software chiefs have commented, AI’s purpose here is to complement human-led decisions, not to substitute for them completely. Maybe in the future we will trust AI’s job-placement ability just as much as we trust Google’s search capability today.

 

But let’s take a step back: the ultimate goal here is to accurately match job seekers to open positions. We can design the system around two strategies. In the first scenario, we facilitate the assessment of job applications. That is what AI technology currently offers us. In the second scenario, we help job seekers develop a taste for all the large and small companies hiring around the world. But how? I suggest that technology should scrutinize companies as much as companies want to scrutinize job applicants. For example, a system could collect 24-hour real-time electronic data on employee behavior at companies and score each company’s ability to manage and support communication and collaboration. It might track the actual daily communication between employees, use that data to score the quality and challenge of each workgroup’s goals and projects, and offer weekly scores of the company’s level of creativity and bureaucracy. In this way, the burden falls on companies to be more transparent about the quality of their daily internal work environment and the work life of their employees, rather than simply relying on technology that robotically sifts through hundreds of thousands of resumes and video interviews.
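Purely as a thought experiment, such company-side scoring might look something like the sketch below. The inputs (messages per employee, the share of cross-team threads, approval-chain length, new project proposals) and the weightings are invented stand-ins for whatever behavioral data a real system would actually collect and disclose.

from dataclasses import dataclass

@dataclass
class WeeklyActivity:
    """Hypothetical behavioral data a company might disclose for one week."""
    messages_per_employee: float      # volume of internal communication
    cross_team_thread_share: float    # fraction of threads spanning teams, 0..1
    avg_approval_chain_length: float  # sign-offs needed for a typical decision
    new_project_proposals: int        # bottom-up ideas raised this week

def collaboration_score(week):
    """Score communication/collaboration support on a 0..100 scale."""
    volume = min(week.messages_per_employee / 50.0, 1.0)   # saturate at 50 msgs
    return round(100 * (0.5 * volume + 0.5 * week.cross_team_thread_share), 1)

def bureaucracy_score(week):
    """Higher means more bureaucratic: long approval chains, few new ideas."""
    chain = min(week.avg_approval_chain_length / 6.0, 1.0)  # saturate at 6 levels
    ideas = min(week.new_project_proposals / 10.0, 1.0)
    return round(100 * (0.7 * chain + 0.3 * (1.0 - ideas)), 1)

# Example: one week of made-up data published by a company.
week = WeeklyActivity(messages_per_employee=34.0,
                      cross_team_thread_share=0.42,
                      avg_approval_chain_length=4.0,
                      new_project_proposals=3)
print(collaboration_score(week), bureaucracy_score(week))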

 

These two strategies are not mutually exclusive. But the point is that we give some power back to job applicants and level the playing field. Although companies today have more applicants to choose from, applicants today also potentially have more companies to choose from. If a medium-sized company in the USA has opened up its boundaries to the world, then the world should also be able to identify and assess it. We need more employment policies and entrepreneurship acknowledging that job seekers deserve to identify the best companies for them, not just the other way around. Perhaps one day, job applicants can use AI to help them identify the thousands of companies around the world that are currently hiring, and then see a score that estimates the probability that they won’t like, or are too talented for, any given company.
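Again only as an illustration, such a fit score could be as simple as the following sketch, which turns a preference gap and an over-qualification gap into a probability. Every input scale and weight here is a hypothetical choice of mine, not a validated model.

from math import exp

def mismatch_probability(applicant_skill, applicant_pref_creativity,
                         company_creativity, company_role_level):
    """Illustrative estimate of the chance an applicant won't like a company
    or is over-qualified for it; inputs are assumed to be on a 0..1 scale."""
    # Gap between what the applicant wants and what the company offers,
    # plus how far the applicant's skill exceeds the role's level.
    preference_gap = abs(applicant_pref_creativity - company_creativity)
    overqualification = max(applicant_skill - company_role_level, 0.0)
    signal = 4.0 * preference_gap + 4.0 * overqualification - 2.0
    return 1.0 / (1.0 + exp(-signal))   # squash to a probability via logistic

# Example: a highly skilled applicant who values creativity, looking at a
# fairly bureaucratic company with a mid-level opening.
print(round(mismatch_probability(0.9, 0.8, 0.3, 0.5), 2))

A low number would suggest a promising match; a high number would warn the applicant away before either side invests in an interview.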

 

Companies could simultaneously get the chance to send a strong signal about the quality of their philosophy towards humanity. Perhaps some companies would be willing to pay for the right to be certified in this kind of applicant-friendly system. They could be proud not only that they welcome shortlisted finalists to interview face-to-face at their offices, but also that they are confident and humble enough to respect job applicants by helping them decide whether the company is good enough for them in the first place. On the other hand, if job applicants paid a nominal subscription fee to use such AI, it would have the added benefit of helping companies identify those job seekers who signal seriousness about their job hunt (as we might observe with LinkedIn’s subscription service today). These kinds of business models bring some balance back to the equation. If companies use AI job interview software to assess candidates but are not willing to share daily or weekly AI data regarding their own internal environment or their own company culture, then they are being largely hypocritical.

 

These kinds of issues regarding the direction of innovation were not matters of concern in the past. However, times are changing. As AI advances further and further, part of the system should still ensure that job candidates are treated with some basic standard of dignity, from a human perspective. But at the very least, candidates deserve to be on a similar playing field to companies. If that requires giving job applicants the power to leverage their own AI in assessing the desirability or fit of the companies they might work for, so be it.

 

 

Note:

News story regarding Korea Airports Corp dropping AI interviews: http://www.koreaherald.com/view.php?ud=20211004000214

 

 
