The Human Cost: What the Broken Jobs Market Does to Real People

2026-02-24 · 7 min read

In the first part of this series, we looked at the structural forces breaking the jobs market — AI displacement, signalling collapse, and the feedback loops making everything worse. But systems don't suffer. People do.

This is what the broken market actually looks like when you zoom in from the macro to the personal. When you stop talking about “labour market dynamics” and start talking about the person sitting at their kitchen table at 11pm, tailoring their forty-seventh application that week, wondering if anyone will ever read it.

The application black hole

Here is what looking for a job feels like in 2026.

You find a listing. You read the description carefully. You update your CV — again — to emphasise the specific skills they've asked for. You write a cover letter, trying to sound enthusiastic but not desperate, confident but not arrogant. You fill out an online form that asks you to re-enter every piece of information your CV already contains: your employment history, your education, your contact details. You upload the CV anyway, because the form demands it. You click submit.

Then nothing.

No acknowledgement. No timeline. No indication that a human being has so much as glanced at what you sent. You do this again tomorrow. And the day after that. And the day after that.

Job seekers routinely report sending hundreds of applications with response rates as low as 2–3%. That means for every hundred carefully prepared submissions — each one representing hours of research, writing, and emotional investment — ninety-seven or ninety-eight disappear into absolute silence.

The psychological toll of this is not trivial. It is corrosive. Job searching has become a full-time job in itself, one that pays nothing and offers no feedback whatsoever. The emotional weight of being ignored at scale — of pouring genuine effort into applications that vanish without acknowledgement — accumulates. It settles into your chest. It makes you question whether you're good enough, whether you're doing something wrong, whether the market has simply decided you don't matter.

This hits hardest where people are already vulnerable. The worker made redundant after fifteen years, suddenly competing with people half their age for roles that didn't exist when they started their career. The parent re-entering the workforce after a career break, facing an application process that treats gaps in employment as character defects. The graduate stepping into a market that demands three years of experience for an entry-level role and offers unpaid internships as a consolation prize.

These are not abstractions. These are your neighbours, your family members, the person sitting across from you on the train who looks fine but hasn't slept properly in weeks.

The people AI left behind

The modern hiring process was built on a specific assumption: that the ability to write persuasively about yourself correlates with the ability to do a job well. This has always been a questionable premise. Now it's an actively harmful one.

Consider the master electrician with twenty years of experience. She can diagnose a fault in a distribution board by listening to it. She's wired commercial buildings, trained apprentices, handled complex three-phase installations without incident. Her competence is profound and deeply practical. But her CV reads like it was written in a hurry, because it was. She's brilliant at her work and mediocre at writing about it, because these are entirely different skills.

Now she's competing with candidates whose applications have been polished by AI into gleaming, keyword-optimised documents that read like they were produced by a marketing agency. The playing field hasn't been levelled. It's been tilted further against her.

Or think about the care worker whose empathy and skill are evident in every interaction — the way she notices when a resident is withdrawing, the calm authority she brings to a crisis, the small acts of dignity she preserves for people in their most vulnerable moments. None of this translates to a two-page document. Her value is relational, embodied, present. The application form has no field for “makes people feel safe.”

These people aren't failing at their jobs. They're failing at a game that was never designed to assess what they're actually good at.

The bitter irony is that AI tools were supposed to help. They were meant to democratise access to better applications, to give everyone the same polished starting point. Instead, they've created a new literacy barrier. If you know how to prompt ChatGPT effectively, if you understand the conventions of a “good” application, if you're comfortable navigating the tools — you gain an edge. If you don't, you fall further behind.

The people who most need help presenting themselves are the people least likely to benefit from tools that require you to be articulate about your skills in order to generate an articulate description of your skills. The gap widens.

The employer side suffers too

It would be convenient to frame this as a problem that only affects job seekers, with employers sitting comfortably on the other side of the table. That's not what's happening.

Recruiters are drowning. The combination of higher application volumes and AI-polished uniformity has made their job significantly harder, not easier. Every CV looks competent. Every cover letter hits the right notes. The human signal — the quirky career path, the genuine enthusiasm, the unconventional background that might be exactly what the team needs — is being algorithmically smoothed away before it reaches a human reviewer.

Recruiters have been reduced to verification detectives. Instead of spending their time understanding candidates and assessing fit, they're trying to determine which applicants actually possess the skills their applications claim. It's forensic work masquerading as talent acquisition.

The numbers tell the story:

• 70% of recruiters cite finding candidates with the right skills as their top challenge — not finding candidates, but finding the right ones in the noise
• Average time to hire has stretched to between 4.9 and 8 weeks, depending on the role
• 60% of companies reported increased time-to-hire in 2024, despite unprecedented investment in hiring technology
• Cost-per-hire has risen even as the tools supposedly making hiring more efficient have proliferated

And here's the outcome that should alarm everyone: employers are making hires that look good on paper but are profoundly misaligned in practice. The candidate who interviewed brilliantly but can't actually do the work. The perfect-on-paper hire who clashes with the team's culture within a month. The technically qualified person who has no interest in the problems the company is actually trying to solve.

This misalignment drives turnover, which drives more hiring, which drives more cost, which drives more pressure to hire quickly, which drives more misalignment. The cycle feeds itself.

The fundamental purpose of hiring — connecting the right person to the right role for mutual benefit and growth — is failing on both sides of the equation.

The trust deficit

Perhaps the most damaging consequence of all this is the collapse of trust.

Candidates don't believe anyone reads their applications. They're largely right. Employers don't believe candidates wrote their applications. They're increasingly right about that too. Both sides have entered the process assuming the other is operating in bad faith, or at least through a filter so thick that authenticity can't penetrate it.

Think about what's actually happening in a typical interaction now. A candidate uses AI to write an application for a role described in an AI-written job posting. An AI screening system decides whether the application proceeds. If the candidate gets through, they prepare for the interview using AI-generated practice questions based on the AI-written job description.

At what point in this process is a human being actually evaluated? At what point does the employer get a genuine sense of who this person is, how they think, what they care about? At what point does the candidate get a genuine sense of what the job actually involves, what the team is like, what success looks like in this specific context?

The answer, increasingly, is: not until it's too late. Not until someone has been hired and both sides discover the mismatch that the process was supposed to prevent.

This trust deficit has a compounding effect. When candidates assume the process is broken, they invest less authenticity in their applications — which makes the process more broken. When employers assume applications are AI-generated, they discount the content — which makes the application less useful as a signal. Each side's rational response to the other's behaviour makes the overall system worse.

We've arrived at a place where the hiring process actively obscures the information both sides need to make good decisions. It's not just inefficient. It's adversarial. Candidates and employers are nominally trying to find each other, but the machinery between them is optimised for volume, not connection.

Something has to give

The status quo isn't sustainable. Not economically — the costs are staggering. Not psychologically — the toll on people is real and mounting. Not practically — the misalignment between what the process measures and what actually matters is too wide to ignore.

But acknowledging the problem is only the first step. The harder question — and the more interesting one — is what a genuinely different approach would look like. Not a better version of the same broken system. Not another layer of AI on top of a process that's already drowning in AI. Something fundamentally different.

That's what we'll explore next.