Date: June 21st, 2025 10:12 AM
Author: scholarship
A.I. Sludge Has Entered the Job Search
Candidates are frustrated. Employers are overwhelmed. The problem? An untenable pile of applications — many of them generated with the help of A.I. tools.
Katie Tanner, a human resources consultant in Utah, knew the job would be popular: It was fully remote, at a tech company and required only three years of experience.
But she was still shocked by the response on LinkedIn. After 12 hours, 400 applications had been submitted. By the 24-hour mark, there were 600. A few days later, there were more than 1,200, at which point she removed the post. Three months later, she’s still whittling down candidates.
“It’s crazy,” she said. “You just get inundated.”
The number of applications submitted on LinkedIn has surged more than 45 percent in the past year. The platform is clocking an average of 11,000 applications per minute, and generative artificial intelligence tools are contributing to the deluge.
With a simple prompt, ChatGPT, the chatbot developed by OpenAI, will insert every keyword from a job description into a résumé. Some candidates are going a step further, paying for A.I. agents that can autonomously find jobs and apply on their behalf. Recruiters say it’s getting harder to tell who is genuinely qualified or interested, and many of the résumés look suspiciously similar.
“It’s an ‘applicant tsunami’ that’s just going to get bigger,” said Hung Lee, a former recruiter who writes a widely read newsletter about the industry.
Enter the A.I. arms race. One popular method for navigating the surge? Automatic chat or video interviews, sometimes conducted by A.I. Chipotle’s chief executive, Scott Boatwright, said at a conference this month that its A.I. chatbot screening and scheduling tool (named Ava Cado) had reduced hiring time by 75 percent.
HireVue, a popular A.I. video interview platform, offers recruiters an option to have A.I. assess responses and rank candidates.
But candidates can also use A.I. to cheat in these interviews, and some companies have added more automated skill assessments early in the hiring process. For example, HireVue offers A.I.-powered games to gauge abilities like pattern recognition and working memory, and a virtual “tryout” that tests emotional intelligence or skills like counting change. Sometimes, Lee said, “we end up with an A.I. versus A.I. type of situation.”
Applicants using fake identities pose another problem. In January, the Justice Department announced indictments in a scheme to place North Korean nationals in IT roles working remotely at U.S. companies. Emi Chiba, a human resources technology analyst at Gartner, told DealBook that reports of candidates who used fake identities had been “growing and growing and growing.”
A report that Ms. Chiba published with other Gartner analysts in April estimated that by 2028, about one in four job applicants could be fake. Among its recommendations was that companies deploy more sophisticated identity-verification software.
Some recruiters say posting jobs isn’t worth it. To address the problem, LinkedIn recently added tools to help both candidates and recruiters narrow their focus, including an A.I. agent, introduced in October, that can write follow-up messages, conduct screening chats with candidates, suggest top applicants and search for potential hires using natural language.
A feature that shows potential applicants how well their qualifications match up with a job description, which LinkedIn introduced to premium subscribers in January, reduced the rate at which they apply to “low match” jobs by 10 percent, according to the company.
Hazards abound. Concerns that using A.I. in hiring can introduce bias have led to lawsuits and a patchwork of state legislation. The European Union’s A.I. Act puts hiring in its high-risk category, which carries the most stringent restrictions. No U.S. federal law specifically addresses A.I. use in hiring, but general antidiscrimination laws can come into play if a hiring process produces discriminatory results.
“You’re not allowed to discriminate, and of course most employers are trying not to discriminate, but easier said than done,” said Marcia Goodman, a partner at Mayer Brown who primarily represents employers.
Is this a perpetual cycle? The problem is less that candidates are using A.I. — a skill many employers say they want — than it is that they’re being sloppy. Alexa Marciano, the managing director of Syndicatebleu, a recruiting agency, said job seekers were reacting to recruiters’ use of automated screening. “It’s really frustrating for the candidates because they spend all this time creating very catered cover letters, very catered résumés,” she said.
Jeremy Schifeling, a career coach who regularly conducts technology-focused job-search training at universities, said he could see this back-and-forth going on for a while. “As students get more desperate, they say, ‘Well, I have no choice but to up the ante with these paid tools to automate everything.’ And I’m sure the recruiters are going to raise the bar again.”
He argues the endgame will be authenticity from both sides. But, he said, “I do think that a lot of people are going to waste a lot of time, a lot of processing power, a lot of money until we reach that realization.”
https://archive.is/xlRMg