A day after interviewing for a part-time job at Target last year, Dana Anthony received an email informing her that she didn’t make the cut.
Anthony doesn’t know why – a common situation for most job seekers at one point or another. But she also had no idea how the interview itself had gone, because her interviewer was a computer.
More job seekers, including some professionals, may soon have to accept impersonal online interviews in which they never speak to another human being – or know how the artificial-intelligence system behind the scenes is shaping hiring decisions. Demand for online hiring services, which interview job applicants remotely via laptop or phone, soared during the COVID-19 pandemic and remains high amid a perceived worker shortage as the economy reopens.
These systems claim to save employers money, sidestep the hidden biases that can sway human recruiters and expand the range of potential candidates. Many now use AI to assess a candidate’s skills based on what they say.
Anthony likes to look an interviewer in the eyes, but in these sessions she saw only her own face on screen. “I interview better in person because I’m able to develop a relationship with the person,” she said.
But experts question whether machines can accurately and objectively judge a person’s character traits and emotional cues. Algorithms tasked with figuring out who is best suited for a job can also entrench bias if they take their cues from industries where racial and gender inequalities are already prevalent.
And when a computer screens out some candidates and elevates others without explanation, it’s hard to know whether it is evaluating them fairly. Anthony, for example, couldn’t help wondering whether her identity as a Black woman influenced the decision.
“If you apply for a job and are rejected because of a biased algorithm, you won’t know for sure,” said researcher Aisleen Kelly-Leith of Oxford University. In a face-to-face interview, by contrast, a job seeker may at least pick up discriminatory cues from the interviewer, she said.
New rules proposed by the European Union would subject such AI hiring systems to stricter regulation. Advocates have pushed for similar measures in the U.S.
One of the leading companies in the field, Utah-based HireVue, gained notoriety in recent years for using AI to assess cognitive ability from an applicant’s facial expressions during interviews. After criticism centered on the scientific validity of those claims and the potential for racial or gender bias, the company announced an end to the practice earlier this year.
But its AI-based assessments, which rank applicants’ skills and personalities to flag the most promising for further review, still factor speech and word choice into their decisions.
The privately owned company helped create a market for “on-demand” video interviews. Its known customers include retailers such as Target and Ikea, major tech companies such as Amazon, banks such as JPMorgan and Goldman Sachs, oil giants, restaurant chains, supermarkets, airlines, cruise lines and school districts. The Associated Press reached out to a number of brand-name employers using the technology; most declined to discuss it.
HireVue CEO Kevin Parker says the company has worked hard to ensure that its technology does not discriminate based on factors such as race, gender or regional accent. Its system, which translates speech to text and scans it for clues about teamwork, adaptability, dependability and other job skills, can screen out the biases of human interviewers, he said.
“What we’re trying to replace is people’s gut instinct,” he said – naturally, in a video interview.
HireVue says it conducted more than 5.6 million interviews worldwide in 2020. Supermarket chains used it to screen thousands of applicants a day in the midst of a pandemic-fueled hiring rush for cashiers, stockers and delivery crews, Parker said.
Providers of broadly hiring-focused software, such as Modern Hire and Outmatch, have begun to offer their own video interviewing and AI assessment tools. On its website, Outmatch cited its ability to measure “the soft skills your candidates and employees need to be successful.”
HireVue notes that most customers don’t actually use the company’s AI-based assessments. The Atlanta school district, for example, has used HireVue since 2014 but says it relies on 50 human recruiters to review recorded interviews. Target said the pandemic prompted it to replace in-person interviews with HireVue interviews, but the retail giant told the AP that it relies on its own employees – not HireVue’s algorithms – to view and rate the pre-recorded videos.
None of this was clear to Anthony when she sat in front of a screen to interview for a seasonal job last year. She dressed up for the occasion and settled into a comfortable spot. The only hint of a human presence came in a pre-recorded introduction explaining what to expect – for example, that she could delete an answer and start over.
But she had no way of knowing what kind of impression she was making. “We are unable to provide specific feedback regarding your candidacy,” Target’s rejection email said. She was rejected again in December after completing a HireVue interview for a different job.
Anthony, who received her master’s degree in strategic communications last year at the University of North Carolina at Chapel Hill, said, “I understand companies or organizations are trying to be more mindful of the time they spend recruiting and the money they spend.” Still, the one-sided interviews made her uneasy about who – or what – was evaluating her.
Kelly-Leith said this opacity is one of the biggest concerns about the rapid spread of complex algorithms in recruiting.
In one infamous example, Amazon developed a resume-scanning tool to recruit top talent, but abandoned it after finding that it favored men for technical roles – in part because it was comparing job candidates against the company’s own male-dominated tech workforce. A study released in April found that Facebook serves different job ads to women and men in a way that may violate anti-discrimination laws.
Governments in the U.S. and Europe are weighing potential checks on these hiring tools, including requirements for external audits to ensure they do not discriminate against women, minorities or people with disabilities. Proposed EU rules, unveiled in April, would force providers of AI systems that screen or evaluate job candidates to meet new requirements for accuracy, transparency and accountability.
HireVue began phasing out its face-scanning tool, which analyzed expressions and eye movements and which academics ridiculed as “pseudoscience” reminiscent of phrenology, a discredited and racist 19th-century theory. The Electronic Privacy Information Center filed a complaint with the Federal Trade Commission in 2019, citing a HireVue executive who said that 10 percent to 30 percent of a candidate’s score was based on facial expressions.
“The value it was adding wasn’t worth the controversy it was creating,” Parker told the AP.
HireVue also released parts of a third-party audit that examined fairness and bias issues surrounding its automated tools. A published summary recommended minor changes, such as revising the weight given to short answers, which minority candidates disproportionately gave.
Critics welcomed the audit but said it was only a start.
“I don’t think the science really supports the idea that speech patterns would be a meaningful assessment of one’s personality,” said Sarah Myers West of New York University’s AI Now Institute. For example, she said, such systems have historically had trouble understanding women’s voices.
Kian Betancourt, 26, who is pursuing his doctorate in organizational psychology at Hofstra University, completed a remote HireVue interview for a consulting position earlier this year. He admitted that he tried hard to predict how the system would evaluate him, shaping his answers to include keywords he thought could boost his score.
While Betancourt favors a “structured interview” involving a standard set of questions, he is troubled by the ambiguity of automated systems.
“Tell people how they’re being evaluated, even if it’s something as simple as, ‘This is an AI interview,’” he said. Even that basic information, he said, can affect how people present themselves.
Credit: www.nbcnews.com