Artificial intelligence promises to help streamline the hiring process, but not everyone is buying in.

Your next job interview may feel a lot like an audition tape.
On a screen, a manager pops up in a recorded video and asks a question like: “Why do you want a career in this field?”
You have 10 seconds to think, and just a minute and a half to record your response on webcam.
Everything you say, how you say it — even where your eyes wander — is being analyzed. And not by a human, but by an algorithm. It evaluates whether you have the necessary skills and rates you on how well you fit the job.
Cold and calculated, yes — but potentially fairer than how humans hire now.
Companies have thousands of applicants knocking on their door, says Jahanzaib Ansari, CEO of Knockri, a Toronto startup that uses artificial intelligence (AI) to help companies like IBM hire new talent.
“It’s just not humanly possible for them to go through each and every one, [so] a lot of organizations resort to a lottery process. A process that actually misses out [on] great talent.”
Knockri claims its algorithms help fill jobs faster, more efficiently and with less unconscious bias.
What’s in a name?
Ansari believes he was overlooked when job-hunting a few years ago. Broke and desperate, he sent out resumes but didn’t get any callbacks. Frustrated, he was given blunt advice: Try a different name.
“I changed my name to Jason for a couple of the resumes, to Jay … just to, like, test it out,” Ansari says. “And within four weeks, I got a job.”
Ansari was both surprised and disappointed the tactic worked. But Canadian research suggests name-based discrimination is very real, across big and small employers alike.
A 2017 study re-examined the findings of a large-scale Canadian employment audit in which more than 12,000 resumes were sent to 3,225 job postings. It found Asian names — some examples in the study included Tara Singh and Lei Li — were 28 per cent less likely to get a callback than Anglo-Canadian names such as Emily Brown, even with equivalent qualifications.
In fact, the research suggests Tara Singh needs an extra degree to even the odds.
Blind recruitment works well in some cases. Blind auditions at major symphony orchestras, for example, have brought more women into male-dominated ensembles. But it’s easier to evaluate a musical performance than a job candidate’s emotional qualities and problem-solving skills.
AI shortlist
In the end, Ansari — under the names “Jason” and “Jay” — took the job he was offered. But he soon left, disillusioned by the problem of discrimination in hiring.
Along with two partners — Maaz Rana and Faisal Ahmed — he co-founded Knockri.
The company creates short lists for employers looking to fill soft-skill roles, such as consulting and client-facing positions. Its AI tool analyzes video responses using facial and speech analysis, rating candidates on how well their answers fit certain job attributes, such as confidence or collaboration.
The short list is given to employers with no names or faces attached — only scores. Knockri says its lists are more diverse, with 17 per cent more people of colour and six per cent more women, compared to traditional hiring practices.
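To make the idea concrete, here is a minimal Python sketch of how a score-only, anonymized short list could work in principle. It is not Knockri’s system: the Candidate class, attribute names and weights are hypothetical, and the per-attribute scores are assumed to have already been produced by the upstream video analysis.

    # Illustrative sketch only, not Knockri's model: attribute names,
    # weights and scores here are hypothetical, and per-attribute scores
    # are assumed to come from upstream facial/speech analysis.
    from dataclasses import dataclass, field

    @dataclass
    class Candidate:
        name: str                                    # kept internal, never shown to the employer
        scores: dict = field(default_factory=dict)   # attribute -> score in [0, 1]

    def shortlist(candidates, weights, top_n=5):
        """Rank candidates by a weighted attribute score and return
        anonymized entries: scores only, no names or faces attached."""
        def overall(c):
            return sum(weights[a] * c.scores.get(a, 0.0) for a in weights)
        ranked = sorted(candidates, key=overall, reverse=True)[:top_n]
        return [{"rank": i, "scores": c.scores, "overall": round(overall(c), 3)}
                for i, c in enumerate(ranked, 1)]

    pool = [
        Candidate("Tara Singh", {"confidence": 0.82, "collaboration": 0.91}),
        Candidate("Emily Brown", {"confidence": 0.75, "collaboration": 0.70}),
    ]
    print(shortlist(pool, {"confidence": 0.5, "collaboration": 0.5}, top_n=2))

The design choice that matters is the last step: the employer sees only ranks and scores, never a name or a face.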
Knockri isn’t alone. Companies like Pymetrics and HireVue have gained traction in the U.S. with big clients like Unilever and Hilton.
Still, not everyone buys the futuristic promise of AI in hiring.
“It seems very pernicious,” says Solon Barocas, an assistant professor in the department of information science at Cornell University, “to expect that the kind of signals we affect with our face [are] a reliable indicator of some fundamental human truth about our competencies and capabilities.”
Beyond doubts about facial expressions, there’s a risk in the way we teach a machine what’s “desirable.”
“Many of these systems are based on trying to learn patterns from historical examples,” Barocas says. “And because these examples are going to come from some previous process involving human discretion and human choices, [they] are going to probably feed forward many of the same kinds of biases that this system is ostensibly supposed to eliminate.”
For example, you may teach an AI algorithm to define a “good” employee by looking at past top performers. But if that group includes few women or minorities, the machine may end up qualifying only white men.
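A toy example, with entirely invented data, makes the mechanism visible: two groups of candidates with identical qualifications, but a historical process that hired one group far more often. A model trained only to mimic those past decisions reproduces the disparity.

    # Invented data: two groups with identical qualifications, but a
    # historical (human) process that hired group A far more often.
    from collections import Counter, defaultdict

    history = ([("A", "qualified", "hired")] * 80 + [("A", "qualified", "rejected")] * 20
             + [("B", "qualified", "hired")] * 40 + [("B", "qualified", "rejected")] * 60)

    # "Training" here just memorizes the most common past outcome for
    # each (group, skill) pattern, the simplest possible pattern-learner.
    model = defaultdict(Counter)
    for group, skill, outcome in history:
        model[(group, skill)][outcome] += 1

    def predict(group, skill):
        return model[(group, skill)].most_common(1)[0][0]

    print(predict("A", "qualified"))  # -> 'hired'
    print(predict("B", "qualified"))  # -> 'rejected'

Nothing in that code looks at merit; the gap comes entirely from the labels it learned from, which is exactly Barocas’s point.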
Knockri says a diverse team, along with a diverse data set, has trained its AI to catch major and subtle behavioural differences.
“We’ve made sure that in our data set, we have a wide variety of gender, race, ethnicity, accents, sexuality,” Ansari says.
“We’ve also implemented checks and balances,” partner Maaz Rana adds, “where if the algorithm begins analyzing something that seems foreign or unfamiliar to it, it will red flag those videos so that a person can take a look at it.”
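Rana doesn’t say how that check works internally. One plausible reading, sketched below with hypothetical function names and thresholds, is a confidence gate: when the model’s confidence in its own analysis drops below a cutoff, the video is routed to a human reviewer instead of being scored automatically.

    # Hypothetical reading of the "red flag" check, not Knockri's code:
    # a simple confidence gate that sends uncertain cases to a human.
    def route_video(attribute_scores, model_confidence, threshold=0.6):
        """Auto-score a video only when the model is confident;
        otherwise flag it for human review."""
        if model_confidence < threshold:
            return {"status": "flagged_for_human_review",
                    "why": f"confidence {model_confidence:.2f} below {threshold}"}
        return {"status": "auto_scored", "scores": attribute_scores}

    print(route_video({"collaboration": 0.7}, model_confidence=0.9))   # auto-scored
    print(route_video({"collaboration": 0.7}, model_confidence=0.35))  # flagged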
Outsourcing responsibility
In the end, though, Knockri doesn’t make the final determination of who gets an interview. The employer uses the short list to choose the candidates it wants to meet.
And even a balanced and well-taught AI is still subject to the wants of the business world, where efficiency and corporate identity can often overtake more responsible priorities.
“I think that there are also instances where organizations don’t really care about being biased,” says author and technology critic Sara Wachter-Boettcher.
Companies could be using these tools simply so that there’s “a perception that they’re being less biased,” she says.
“You’re outsourcing the responsibility to the machine, right? It’s like, ‘I didn’t personally do anything … the machine decided not to move forward with the candidate.’ And so there’s an easy way to distance yourself from having a responsibility for any harm that can be caused.”
And while a company could use AI to fight bias in hiring, some critics say it’s irresponsible to bring new employees into a work environment that hasn’t taken the steps to foster inclusivity.
“Women, particularly, may be brought in as a result of this,” warns tech entrepreneur Saadia Muzaffar. “But have you done the work of revising pay equity? Have you done the work of making sure that if there are issues of abuse of power or sexual harassment, that there are policies in place where these women can go?”
For their part, Knockri’s founders seem to understand the messy, complicated nature of inclusivity in the workplace — an issue that affects people before, during and after the hiring process.
As Muzaffar puts it: “All of these things are people things — and software is not going to help with that.”