
Are cyber jobs being stretched to breaking point by AI?

6/05/2026 Article

Don't get crushed... A woman stands between two closing walls, each littered with an ever-growing pile of things to do.

How much change can one tired person metabolise before we stop calling it resilience or strength and start calling it what it is? Because that seems to me like a big chunk of the AI conversation that we all keep politely stepping around.

I’m sure this applies to lots of industries, but I know that cyber security people are being told to adapt, upskill, experiment, automate, govern, prompt, evaluate, supervise, and somehow remain calm while the floor moves under their role. Again.

I sort-of get it too. Some of that advice is right. Hiding from AI is not a plan. Pretending it is a fad is not a strategy. Sitting under the desk with a packet of biscuits and a thousand-yard stare is emotionally understandable, but probably not a long-term winning tactic.

A lot of people were already stretched.

Cyber security already had a short half-life. A decent tester was already learning constantly. So was the incident responder, malware analyst, cloud engineer, detection engineer, AppSec consultant, OT specialist, and the poor security architect whose job is mostly translating reality into PowerPoint without crying.

All of these people were already carrying too much context, too many systems, too many alerts, too many meetings, too many half-funded controls, too much inherited risk, and too little recovery time. Now they are being told that the responsible thing to do is learn a fast-moving general-purpose technology that is also changing the shape of their job while they are still doing the job.

I don’t think that is a “normal” training requirement. It strikes me more as unpaid homework.

The market is not collapsing, but it sure is changing

The UK cyber security labour market still has depth. The UK Government’s Department for Science, Innovation and Technology (DSIT) 2025 labour market research estimated roughly 143,000 people working in cyber security roles across the UK economy. It also reported 32,370 core cyber job postings across the UK, apparently a 33% decrease from the previous year, and noted that demand for candidates with less than one year of experience fell from 25% in 2022 to 17% in 2024.

I take all these stats and figures with a bucket of salt. In fact, when I first read the report I tore holes in it for logical inconsistencies and a variety of other mistakes. But even so, these numbers still show a trend. That trend isn’t towards the end of cyber security, but towards a market with so much uncertainty that very few organisations are willing to risk expansion. My guess is that they are all hoping the magical AI thing-o-matic will save the day.

The good news is that the industry still needs people who can reason about risk, understand systems, test assumptions, handle ambiguity, and explain consequences without turning everything into an interpretive dance about “business enablement”.

The bad news is that it may need fewer people doing some of the low-context manual tasks that used to form the bottom rung of the ladder: first-draft summaries, making “management porn”, generic vulnerability prose, alert enrichment, boilerplate remediation, basic configuration checking, routine evidence collation, and repetitive triage of incident tickets.

Tools usually compress tasks before they replace roles. When enough tasks inside a role are compressed, the role changes. When the role changes, the person in it either moves up the value chain, changes direction, or becomes easier to substitute.

This is not just a skills gap, it is a coping gap

The phrase “skills gap” doesn’t even get close to what is really going on.

Sometimes there is a genuine skills gap. Someone needs to learn a tool, a platform, a protocol, a cloud service, a testing method, a regulatory requirement. Fine. That is the job. Nobody becomes useful in cyber security by having a permanently frictionless afternoon.

A skills gap is a defined thing: it’s the combination of traits that make an individual capable of delivering something that requires skill. But what AI is doing to our work model makes it impossible to pin down what the skills gap even is, because the target moves faster than we can focus on it.

What we have now, if it isn’t a skills gap, isn’t one new thing either. It is a whole new layer appearing across many old things, and the layer keeps vibrating.

Humans do not adapt to infinite novelty for free. Yes, even though this sounds like a utopian dream for us ADHDers, we do eventually burn out. We all spend “tokens” of attention, sleep, confidence, time, and social bandwidth, and these are finite resources. You can pretend they are not, all you like. But you know as well as I do that eventually reality comes round uninvited, knocks on the door with a clipboard and a stack of unintelligible paperwork, and stops you from leaving the house till you sacrifice enough spinning hard disks to your deity of choice.

So when someone says security professionals need to be proactive, I agree.

But proactive with what capacity? With whose time? Against which existing deadlines? At what cost to judgement, family, health, curiosity, and, don’t forget, the basic ability to do deep thinking and careful work?

The current narrative often sounds like this:

“AI is a huge shift. Nobody fully understands where it is going. It may reshape your role, your value, your team, and the junior staff around you. Please investigate it in your spare time whilst maintaining everything else. Oh and keep doing it for a few more years until hopefully, maybe, it all settles down a bit.”

And they say there is a mental health epidemic…

AI lands in an already tired profession

Cyber security has this weird cultural assumption that learning capacity is infinite. But it isn’t.

To be a good penetration tester, you were already expected to understand web application security, infrastructure, identity, operating systems, cloud platforms, exploit development, Active Directory, APIs, mobile apps, source review, reporting, scoping, client communication, tooling, legal constraints, and whatever deranged technology stack wandered into the next engagement wearing a lanyard.

If you work in IoT or OT, the learning burden is even bigger. You need to understand embedded Linux, firmware analysis, electronics, RF, industrial protocols, safety constraints, hardware interfaces, vendor-specific behaviour, and enough physical restraint to not treat a live environment like a really expensive CTF box.

AI therefore lands on people who were already adapting. Some of them were already close to the edge. ISC2’s 2025 workforce research reported that 48% of respondents felt exhausted from trying to stay current on cyber threats and emerging technologies, and 47% felt overwhelmed by workload.

A tired woman stares at one of many computer screens in front of her, each showing an overwhelming array of things demanding her attention. All the while it is sunny outside, but she has not noticed.

The answer cannot be heroic adaptation

There is a version of this conversation that quietly demands heroism as if it were perfectly acceptable.

Be more technical. Be more strategic. Be better at automation. Understand AI. Govern AI. Use AI. Detect AI misuse. Review AI output. Reassure leadership. Mentor juniors. Maintain delivery. Keep your certifications current. Stay employable. Stay positive. Stay hydrated. Go to the gym more. Eat your five a day. Floss daily. Be kinder to the world. Recycle everything. Have infinite patience. Don’t stand out from the crowd. Be novel and innovative and agile. Complete your astronaut training. Become a lighthouse.

If this were a personal relationship, you would call it gaslighting. Professionally, this is how we turn the people we have into plant fertiliser.

You do not need to become fluent in every AI tool, every model release, every agent framework, every vendor claim, and every breathless LinkedIn thread written by someone who discovered Python last Tuesday.

So if it isn’t heroic adaptation, what is it? Yep, you guessed it: you also now have to be your own long-term career strategist. Sorry… I couldn’t help making fun of the situation. In all seriousness though, it probably is a good idea to spend a little time each week thinking about which parts of your job can go, which bits actually have intrinsic value, which bits can be automated easily now, and which bits you could look for opportunities to automate in the near future.

You need to work out three things:

  • Where AI can reduce load without reducing quality
  • Where AI creates risk, noise, or false confidence
  • Where your human judgement remains valuable

Choose the story you tell yourself

If your story is “I am falling behind because I cannot keep up with everything”, you will eventually start treating exhaustion as evidence of personal failure. A better story is: “The system is changing quickly, and my job is to make deliberate choices about where I adapt.”

You cannot track every model, every tool, every possible use case, every vendor platform, every AI safety argument, every automation pattern, and every labour-market signal. You cannot. Nobody serious can.

The people pretending otherwise are either selling something, wearing a veneer of confidence, or have outsourced their sleep to a more junior member of staff. Or just maybe, the AIs are already walking amongst us.

I have increasingly been asking myself these questions:

  • What do I need to understand well enough to remain safe, useful, and honest in my role?
  • What can I ignore for now?
  • What can I test in a low-risk way?
  • What work am I doing repeatedly that a tool could help with?
  • What work must not be handed to a tool because the judgement is the point?
  • What specialist skills do I have / can I get, that will set me apart from the masses and the machines?

A woman climbs a ladder towards a brighter patch of sky. The ladder beneath her is being changed as she climbs. It is not clear whether she is being supported towards the top, or chased unwillingly away from the bottom.

Overloading staff is not the same thing as a Cyber Transformation Programme

If AI adoption only means already-stretched specialists are expected to deliver more, learn more, review more, and absorb more risk, then the organisation is myopic.

Is the goal of this “transformation” speed? Quality? Coverage? Consistency? Margin? Better training? Reduced burnout? Maintaining the status quo? Those are all very different goals. Pretending they are the same or pretending you can have them all is short sighted and shows a lack of strategic maturity.

We need to be asking harder questions: what failure modes are we introducing? What is the long-term cost of using and relying on these tools? What are we choosing not to do right now? What are we choosing to do in the future? What do we need people to actually do in the future, and what skills does that imply?

This is not glamorous and may well not win you friends or influence. It will not produce a viral screenshot of a 47-step autonomous workflow that allegedly replaces an entire department and definitely was not cherry-picked. But asking these awkward questions compounds, and it preserves the thing cyber security depends on: functioning human judgement.

My prediction: The future favours skills and leverage, not exhaustion


AI will not remove the need for cyber security professionals.

It will expose passive careers, shallow work, weak evidence, brittle teams, and organisations that have been using individual goodwill as a control.

But it will also expose something more uncomfortable: the limit of how much change people can absorb when every structural shift is repackaged as personal responsibility.

So yes, be proactive.

But do not let “proactive” become a polite word for permanently available, permanently anxious, and permanently behind.

Use AI where it reduces low-value load. Challenge it where it creates noise. Learn enough to test it, govern it, and call out nonsense. Deepen expertise where real-world context matters. Protect your personal resources like they are part of the security architecture, because they kinda are.

And tell yourself a better story than “I must keep up with everything”.


During a storm a woman places a boundary marker showing that humans have rights too.
