
And the winner is… ‘lowest compliance effort’

If it connects to the internet, the compliance label won’t save you. Especially if it doesn’t encourage vulnerability disclosure for all embedded products.

5/02/2026 Podcast

A tacky trophy on a plinth. The trophy looks cheap, like bulk-order acrylic or hollow gold plastic. The plaque on the plinth reads: “And the winner is… LOWEST COMPLIANCE EFFORT”

The internet-connected world isn’t neatly divided into “consumer,” “enterprise,” and “industrial.” In practice, devices drift across categories constantly—and that’s one of the reasons IoT security is still playing catch-up.

In a recent conversation on You Gotta Hack That, Felix spoke with David Rogers of Copper Horse about the latest edition of their annual report, The State of Vulnerability Disclosure Usage in Global Consumer IoT. The report focuses mainly on consumer products, but David makes a compelling point: the moment we rely on categories too much, we create loopholes—exactly the kind of cracks insecurity loves to slip through.

The “grey area” problem: it’s all chips, code, and connectivity

Some of the messiest examples show up in health tech. Apps and devices can stray into “medical” territory without being regulated as medical devices—because, frankly, regulation is expensive and restrictive. But from a security perspective, the components don’t care what label is printed on the box. As David puts it: there’s hardware, software, and an internet connection. That’s the core.

The same drift happens in the real world of businesses. A small company might buy a consumer printer from a retail store and put it into an office environment. A factory might run serious operational technology on the floor—but also have “ordinary” IT systems, consumer-grade devices, or even improvised internal builds that become the weak link. In other words: segmentation by category can be tidy on paper, but chaotic in reality.

When categories become a compliance escape hatch

David highlights a subtle but important risk: definitions can give organisations an “easy compliance option.” If the rules vary by category, some manufacturers will inevitably aim for the lowest-bar interpretation—especially if it reduces cost and paperwork.

This becomes more than a philosophical issue when you’re thinking about vulnerability disclosure. If a company doesn’t provide a clear, dedicated way for researchers to report security issues, vulnerabilities can linger, users stay exposed, and the internet gets noisier and riskier for everyone.

The CRA: ambitious, but vulnerable to “tick-box” security

The EU’s Cyber Resilience Act (CRA) aims to raise the security baseline across connected products. The intent is positive. The concern—based on David’s reading and related standards work—is that the implementation may be messy in places, especially around vulnerability disclosure requirements.

One example: instead of requiring a dedicated security contact point (the thing researchers actually need), drafts may allow a generic customer service page. That’s a recipe for reports being lost, delayed, or mishandled. David also flags a more serious issue: mixing up two distinct ideas—reporting vulnerabilities to authorities versus coordinated vulnerability disclosure with the person who found the bug. When experts start confusing those concepts, it’s a sign the wording may be too ambiguous.
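
To make the contrast concrete: one widely used way to publish a dedicated security contact point is a security.txt file served at /.well-known/security.txt, as defined in RFC 9116. The sketch below is illustrative only; it is not the report’s methodology and not something the CRA drafts mandate, and the vendor domain is a placeholder. It simply checks whether a manufacturer’s website advertises a machine-discoverable security contact.

    # Illustrative sketch: probe a domain for an RFC 9116 security.txt file,
    # a common marker of a dedicated security contact point.
    import urllib.error
    import urllib.request

    def has_security_txt(domain: str) -> bool:
        """Return True if https://<domain>/.well-known/security.txt exists and names a contact."""
        url = f"https://{domain}/.well-known/security.txt"
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                body = resp.read(4096).decode("utf-8", errors="replace")
                # RFC 9116 requires at least one Contact field.
                return resp.status == 200 and "Contact:" in body
        except (urllib.error.URLError, TimeoutError, ValueError):
            return False

    if __name__ == "__main__":
        for vendor in ["example.com"]:  # placeholder domain, not a vendor from the report
            status = "security contact found" if has_security_txt(vendor) else "no dedicated contact point"
            print(f"{vendor}: {status}")

A generic customer service page offers a researcher no equivalent route straight to the right team, which is exactly the gap David is worried about.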

The risk is a familiar one: security teams buried in compliance spreadsheets, self-certification allowing weak products through, and testing capacity turning into a bottleneck—while the end goal (secure products) gets diluted into “technically compliant products.”

A realistic hope: progress, pressure, and a two-tier future

Despite the worries, the conversation isn’t doom-and-gloom. David’s team is optimistic about some fundamentals improving: better security building blocks are widely available, common vulnerabilities are being found and fixed constantly, and defensive capability is increasing—especially as AI changes how vulnerabilities are discovered.

But a two-tier world may emerge: serious manufacturers raising the bar, while low-quality “rubbish” devices persist. And the consequences don’t stay local. Cheaper insecure products often land hardest in poorer countries, fuelling fraud and harm and, ultimately, creating a global pool of compromised devices that can be used to attack anyone.

Want to explore the data yourself? Copper Horse publishes the dataset as open data under a CC 4.0 licence—so you can reuse it with attribution. Check the report via the IoT Security Foundation and Copper Horse websites, and if you’re building, buying, or regulating connected products, let’s talk about what “good” looks like in practice.
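
If you do pull the dataset, a few lines of Python are enough for a first pass. The sketch below is hypothetical: the CSV filename and column name are placeholders rather than the real schema, so adjust them to whatever the published files actually use.

    # Hypothetical first pass over the open dataset: count manufacturers by
    # whether they offer a vulnerability disclosure route. The filename and
    # column name are placeholders, not the published schema.
    import csv
    from collections import Counter

    def disclosure_summary(path: str) -> Counter:
        with open(path, newline="", encoding="utf-8") as f:
            return Counter(row["has_disclosure_policy"] for row in csv.DictReader(f))

    if __name__ == "__main__":
        print(disclosure_summary("consumer_iot_vuln_disclosure.csv"))  # placeholder filename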

Transcript

Felix (00:02)
Hello and welcome to You Gotta Hack That, the show all about the cybersecurity behind the Internet of Things and operational technology. Today, I’m continuing my discussion with David Rogers from Copper Horse. He’s just published the eighth edition of their annual report entitled, The State of Vulnerability Disclosure Usage in Global Consumer IoT.

In our previous episode, David told us that a staggeringly high number of IoT manufacturers had no way for security researchers to report vulnerabilities. Whilst things have improved since the report’s first edition, there is still a long way to go. We also looked into how this report goes a step further and explores how IoT manufacturers and retailers both have responsibility to drive up cybersecurity standards to protect us all.

To carry on our conversation, I asked David if his report covers just consumer IoT devices, or if it covers industrial IoT, medical IoT, or whether it’s a bit more of a grey area.

David (00:56)
Yeah, it is grey. The easiest grey area would be things like health apps that stray into the medical area. Things like period apps for women. They get it wrong and then they get pregnant at the wrong time. You know, these things have happened. Or even heart rate monitoring, things like that. And I think it’s in the interests of some of these companies to make sure those things are not defined as medical devices, because that means regulation.

But what I always say to everybody is, underpinning all of this is a PCB with chips on it and then there’s some software running on it. That’s it. It connects to the internet. And it’s really about how that’s built properly or not built properly.

But there’s a lot of commonalities there. You can only build products in so many ways. There’s only so many usable technologies out there. So you see a lot of commonality. I’ve worked on OT in the past. You see the same techniques and the same problems. And you also see a lot of these devices in different spaces.

We do have a section, actually, on what would be called enterprise IoT. But having thought about this stuff for a very long time, I feel like defining between product categories often gives companies a get out of jail free card, because then they’ll sort of bid down to the lowest level category.

Felix (02:23)
Easy compliance option, basically.

David (02:25)
Yeah, exactly. And maybe I’ve been a little bit unfair there, but small businesses are going to go into Currys and they’re going to go and buy a printer, which is aimed at consumers. That’s just reality, right?

Yeah. Yeah. And then, if you take, straying outside of our report, but it’s something we’ve thought about, is, okay, you’ve got a factory, loads of OT stuff on the factory floor, but then you’ve got a bunch of other stuff as well. Maybe some homebrew Raspberry Pis or something, that does happen, you know, self-developed internally. But then you’ve got a bunch of IT equipment. You’ve got consumer grade printers on the factory floor. That happens, right? Yeah. You’ve got back offices doing personnel and finance.

And actually those have often been the ways in, as we found out with some of these third party providers in the past few years. They seem to be the weakest link, but they’re providing back office services for printing invoices or for doing payslips and stuff.

So when it comes to securing the business, and then you’re sort of picking out bits of it and going, well, all the enterprise IoT in your business should be X and be compliant, and then all the consumer IoT, I feel like dividing those segments like that is a bit of a fool’s errand and it’s just arbitrary lines. I mean, realistically, a lot of the arbitrary lines between security governance or security regulation policy in governments is by government department. You know, transport looks after cars and trucks and stuff. So the same people will work on that, and then other departments will work on other things, you know.

So, I mean, we’re quite lucky in the UK that we’ve got the NCSC and they’re sort of the glue that glues all the government departments together, and some very smart people there. I think having those sort of what would be classed as horizontal standards, or horizontal expectations requirements, are good because it eliminates some duplication.

Where it does come a little bit unstuck is where you’re trying to deal with a safety critical OT product, so a factory line product, and trying to apply the same consumer grade set of requirements against something that’s got interlocks and so on. So I see clashes there. Yeah. And also even when you start to talk about the user, well, the user’s an entirely different person. It might be a skilled maintenance technician, not just Joe Bloggs.

Felix (04:51)
Yeah, I suppose there’s no taking into account how you use it, which is essentially that consumer printer in an office environment corollary, but taken further, isn’t it? Because the kind of heart monitor type devices, if you’re using it because you’re trying to get fit, that’s one thing, but if you’re using it because you’ve got heart disease, that’s a completely different use case and actually could be much more critical.

The same applies there to OT versus IoT conceptually, but less well-defined, I suppose.

David (05:17)
Yeah, yeah. And so I sort of touch on this a little bit in this report. So to answer your original question, we’re only concentrating on consumer, and we’ve a little bit of enterprise in there. There’s, boy, there were enough consumer devices to look at. You know, crazy things. You know, stuff that you never thought would be invented, and then by two reports later, surprise, surprise, the company’s died because nobody wants to buy it. Oh, brutal. But you can have a look.

But then other sectors like automotive and stuff, I suspect are pretty bad as well. If we even took other segments, say just connected agriculture or just farm equipment, that would be quite interesting. It’s something you’d like to do in the future, just to get a full understanding of what’s going on in those sectors.

The CRA, which is the EU legislation, is now in place, with the regulations coming in broadly from 2027 and some bits next year. It covers all connected products. And from what I’ve seen, I feel like they’ve bitten off more than they can chew. And when they’re standardising it, they’re trying to almost micromanage the use cases and the products. And that was actually the opposite of what I’d done with the UK’s Code of Practice for Consumer IoT, where we’d looked at an outcome-focused list of principles and requirements. And we tried to help foster innovation and to not constrain the product design, because that could lead to greater insecurity.

And I’m really worried right now that the CRA, having looked at the vulnerability disclosure bits, the wording is all over the place. They stumble. They use the same types of words for stuff. There’s a lot of get out of jail free cards for manufacturers.

Let’s just take the contact point. So we would expect that you have a dedicated page for security researchers because it’s super important. It gets the right person. That’s the basics of it. Yeah. Well, they’ve now got a requirement in there that says you can just have a customer service page. Now that is as far away from good vulnerability disclosure as you can possibly get. In the CEN-CENELEC draft standard, that’s what it says. But what they’ve also done is made a fatal error.

Felix (07:22)
Not much CVE either, right? Disclosure.

David (07:41)
Where they stumbled over themselves. So the EU Commission has this requirement for vulnerability reporting to sort of national authorities and to ENISA. I don’t want to go into the ins and outs of that, but they’ve mixed up vulnerability reporting and the reporter of a vulnerability as part of coordinated vulnerability disclosure in the same spec. And I’ve seen experts in this area mix the two up already, and that is really bad. It’s like such a known gaffe that I’m a bit worried.

And I haven’t seen the technical specs on the other requirements, but anecdotally, I’m hearing that it’s a bit messy, and this is not good for us as European citizens because what’s going to happen is a load of insecure stuff is going to slip through. There’ll be bottlenecks for the testing houses. All this lower end stuff, that’s just going to go through, as they already have done. There’s a basic self-certification, right? So a lot of stuff’s going to get through. And then for the higher end stuff, well, it’s going to be a compliance nightmare.

It’s very much heading towards tick box compliance rather than what we want, which is secure products. Yeah. And it’s going to drag security teams, product security teams, into more spreadsheets. And God knows there’s enough compliance to be done against other legislation and regulation that I fear that it’s not going to be a good couple of years for people.

Felix (09:17)
No, that’s not good news at all if that trend does apply across the board. I didn’t realise that that was likely to be happening at all. Some of these things feel like they’re subjects that we’ve put to bed already once. We seem to be doing it again. Is that kind of your take?

David (09:32)
Yeah, I don’t understand that. I mean, my own take on it is it feels like too many cooks in the kitchen spoil the broth. And I think this definitely feels to be the case here. You seem to have conflicts as well between people with backgrounds in the OT world versus, it seems like very few security researchers, if any, have been involved in this work.

The draft vulnerability disclosure standard means something. It’s not easy. I don’t want to sort of throw them under the bus too much. There’s some difficult topics here. And of course they’re looking to the future. So they want to perhaps be more demanding and be more structured about, for example, how you connect the software bill of materials of a product, and the hardware bill of materials, to a vulnerability that security researchers discovered. I get that, but the point at which we’re coming from is that the very basics aren’t being done.

So I think it’s a real stretch to expect these companies to even know what an SBOM is or to get any of it right. Yeah, we’ll have to see.

Felix (10:44)
For 2026, the You Gotta Hack That team has two training courses. On March the 2nd, we start this year’s PCB and electronics reverse engineering course. We get hands-on with an embedded device and expose all of its hardware secrets, covering topics like defeating defensive PCB design, chip-to-chip communications, chip-off attacks, and the reverse engineering process.

On June the 8th, we launch the Unusual Radio Frequency Penetration Testing course. We dig into practical RF skills so that you can take a target signal and perform attacks against it in a safe and useful way.

Both courses are a week long. They are a deep dive. They’re nerdy. And we provide everything you need other than your enthusiasm. As the Unusual RF Penetration Testing course is brand new, you can be one of our beta testers and get £1,000 off. There’s more information available on our website at yougottahackthat.com/courses, and we recommend booking straight away as we have to limit the spaces to ensure the best learning experience.

But for now, let’s get back to today’s topic.

David (11:40)
[inaudible]

Felix (11:43)
I’m sort of hearing that this is potentially such a mammoth piece of legislation, and the progress that is clearly being made in terms of that improvement from, forgive me, is it 9% to about 40% or so coverage. These things all sound quite positive on the surface of it, but actually there’s potential here for us to be tying ourselves in knots. Maybe even the industry might be pacing itself a little bit in terms of the number of new products and companies that are now coming up, versus actually doing things intrinsically correctly in the first place. Is that fair, or is that too much of a statement?

David (12:19)
Yeah, yeah. So I think my conclusion to the report, and it wasn’t just me, it was a team of people, four of us working on it this year, that actually is quite optimistic. So that retailer figure, for example, is quite optimistic. It’s not so good on the online marketplaces, but in all of these 14, 15 years on this topic alone, obviously the technology has been moving forward. And the PCBs and chipsets and everything that give you the foundational elements for hardware security, they’re all available and relatively easy to adopt. And we had all that good practice out there for years.

So I feel like we have definitely moved forward, and there’s a lot that’s there for free for developers now. So I would like to think that some of the fundamental principles like don’t use default passwords are, on the whole, being solved, and product security on the whole is getting better. But obviously, attackers don’t stand still either.

I would say we’re also seeing, from a positive angle, a lot of vulnerabilities being flushed out every single day out of common operating systems, Linux-based systems, and so on. Every single time that happens, the attack surface reduces.

And we’re entering a really new era right now with AI-assisted vulnerability discovery, maybe some more interesting types of attacks for the future. But we can also use those on the defensive side as well. So what we might see is a kind of like two tier system where you’ve got the major manufacturers that really know what they’re doing, and then you’ve just got all this rubbish out there that still persists. And then eventually somebody has to decide what to do with it.

I get worried that poorer countries in the world, where people are on a lot less money a day, they’re going to be the victims of all of these products that are just imported into those countries and sold really cheaply and have no kind of standards for security. Those people would be victims of lots of fraud and stuff like that. The flip side of that is also that those products can again be targeted against the richer countries and so on. It’s not like the West becomes immune, for example, to attack just because they pass the CRA. The internet is totally open.

Felix (14:48)
It’s interesting because you’re almost arming other areas of the world. It’s as simple as that. If you end up just with a population density of weaker devices, then all you’re doing is giving attackers tools, but just placing them somewhere else.

We’ve clearly got loads we can dig into more, David. Maybe there’s another recording you need to do some time in the future, once your team’s report has really taken off and everyone’s heard about it. I’m keen to hear if you’ve got any further thoughts, and I suppose in particular about the future and where this goes. It would be lovely to hear some rosy stuff.

David (15:27)
Coming up. Well, okay. We’re on report number eight. So the obvious thing is to carry on to report 10, isn’t it? A nice round number. But in all seriousness, so that would take us to 2027, which is when the CRA compliance should be in place. So that will be a really interesting year to see where we are at that point. So yeah, we’ll at least carry on for another couple of years, I think.

All of this data, by the way, we make it open data. So it’s under a CC 4.0 licence, which means you can do whatever you want with it, as long as you attribute it to us. And a lot of people have. And also you can check our homework if you want. If we got something wrong, we’re always open to feedback. So I think we’re big fans of transparency. You can see the data, you can draw your own conclusions, and we’ll continue to do that.

Felix (16:24)
Where would anybody find that particular report then, David, and your data?

David (16:27)
Okay, so two places. You can go to the IoT Security Foundation website, iotsecurityfoundation.org, and it’s on there, and also on the Copper Horse website, copperhorse.co.uk.

Felix (16:42)
All right, lovely. I’ll make sure I put a link to those in the show notes. And so thank you very much for joining me today, David. It’s been a pleasure chatting to you. As always, I’m looking forward to our next interesting project to do.

David (16:53)
Thanks Felix, and thanks for having me on. It’s been great to chat.

Felix (16:57)
Thank you everybody for listening today. I hope you have enjoyed the show. An interesting and insightful report, there’s clearly loads more to cover. In the meantime though, please subscribe and give us 5 star reviews, and maybe even get in touch with us and ask for a particular topic to be covered.

Don’t forget you can find us on social media, things like LinkedIn. If you search for You Gotta Hack That, you’ll find us.
