Weaponized Before Disclosure: Rethinking Vulnerability Intelligence for Embedded Systems

June 26, 2025
 

Threat actors aren’t waiting for public CVE announcements—they’re exploiting vulnerabilities long before organizations even know they exist. In this episode of Exploited: The Cyber Truth, Patrick Garrity, Security Researcher at VulnCheck, joins Joe Saunders, CEO of RunSafe Security, to explain how attackers are moving faster than ever—and why embedded systems are especially vulnerable.

The conversation unpacks the reality of “weaponization before disclosure” and how legacy hardware, open source components, and supply chain complexity all contribute to visibility gaps. With embedded devices often built from third-party components and maintained over decades, organizations can’t afford to rely on reactive patching or guesswork.

You’ll also hear how SBOMs, build-time insights, and Secure by Design development practices can help teams stay ahead of threats and why transparency, due diligence, and security maturity are no longer optional.

Speakers: 

Paul Ducklin: Paul Ducklin is a computer scientist who has been in cybersecurity since the early days of computer viruses, always at the pointy end, variously working as a specialist programmer, malware reverse-engineer, threat researcher, public speaker, and community educator.

His special skill is explaining even the most complex technical matters in plain English, blasting through the smoke-and-mirror hype that often surrounds cybersecurity topics, and helping all of us to raise the bar collectively against cyberattackers.

LinkedIn 


Joe Saunders:
Joe Saunders is the founder and CEO of RunSafe Security, a pioneer in cyberhardening technology for embedded systems and industrial control systems, currently leading a team of former U.S. government cybersecurity specialists with deep knowledge of how attackers operate. With 25 years of experience in national security and cybersecurity, Joe aims to transform the field by challenging outdated assumptions and disrupting hacker economics. He has built and scaled technology for both private and public sector security needs. Joe has advised and supported multiple security companies, including Kaprica Security, Sovereign Intelligence, Distil Networks, and Analyze Corp. He founded Children’s Voice International, a non-profit aiding displaced, abandoned, and trafficked children.

LinkedIn


Guest Speaker – Patrick Garrity, Security Researcher at VulnCheck

Patrick Garrity is a security researcher at VulnCheck, where he focuses on vulnerabilities, vulnerability exploitation, and threat actors. He is a seasoned cybersecurity professional with over 15 years of experience across solutions engineering, product, and security research roles, helping build and scale security startups including Duo Security, Censys, Blumira, Nucleus Security, and VulnCheck.

LinkedIn

 

Key topics discussed: 

  • Why exploitation often occurs before vulnerabilities are publicly disclosed
  • The hidden risks in embedded systems and opaque supply chains
  • How SBOMs and build-time visibility can help mitigate inherited risk
  • The importance of security maturity in long-lifecycle product environments
  • What organizations can do today to reduce risk and increase resilience
Episode Transcript

Exploited: The Cyber Truth, a podcast by RunSafe Security. 

[Paul] Welcome back, everybody, to Exploited: The Cyber Truth. I am Paul Ducklin, joined by Joe Saunders, CEO and founder of RunSafe Security. 

[Paul] Hello, Joe. Welcome back.

[Joe] Hey, Paul. Thank you. 

[Paul] And our very special guest in this episode, Patrick Garrity of VulnCheck. Hello, Patrick. 

[Patrick] Hey. Thanks for having me. 

[Paul] You’re waving on the video with one hand. You damaged the other one in a skateboarding incident. Very exciting. Now our topic this week is weaponization before disclosure, which I guess is a fancy, almost militaristic way of talking about the problem of zero day security holes.

[Paul] And in particular, how zero-days can affect embedded systems. To set the scene on all of that, Patrick, why don’t you tell our listeners how you got into cybersecurity research as well as what you do at VulnCheck and, for that matter, why it is important? 

[Patrick] I’ve been in cybersecurity helping build companies over the last twelve years. I originally started out working at a managed service provider. I think it was 2021 when we started to see Log4j and some of the Exchange vulnerabilities.

[Paul] Joe, it’s come up again. It’s the one that nobody in cybersecurity can ever forget. It was just so troublesome because nobody knew where it was in anyone else’s organization, let alone their own. 

[Patrick] Yeah. And I think with the Exchange vulnerabilities in 2021, I was working more on the detection and response side. And, naturally, you saw people moving from credential compromise, which is still going on, to exploitation, which has obviously risen significantly over the last couple of years and become a top tool in the attacker’s tool belt. 

[Patrick] Yeah. I saw an opportunity to work in the VM space, and everyone laughed at me at the time because they’re like, who cares about vulnerability management? Now we’re here several years later just generally studying and understanding, like, what vulnerabilities are out there, why they’re being exploited, and which technologies are targeted and used. So I spent a lot of time researching and studying those and then working on different data feeds and data sources to understand their credibility.

[Patrick] Are they reliable? How can we deliver that information to our customers in a timely way? 

[Paul] For the most part, when we get that list of however many vulnerabilities were patched in the latest Apple update or the latest Microsoft Patch Tuesday, we kinda get the idea, oh, good. The patches are out. As long as I deploy them, I’ll be ahead of the crooks, or the state-sponsored actors or whatever they are.

[Paul] But the whole problem with a so-called zero-day attack is that it gets its name because there were zero days during which even the most proactive sysadmin could have patched ahead of time. So these are cases where the bad guys found and exploited a security hole first and deliberately kept it quiet until someone just happened to notice. 

[Patrick] And attackers are moving fast. They’re proactive. And often, we see disclosure of exploitation in the security advisory itself.

[Patrick] So, obviously, something’s going on before the patch is issued if there’s already exploitation evidence. 

[Paul] You know, if there is such a thing as a more commonplace zero-day attack, a weaponization before disclosure, what sort of time frames are we looking at? Because it’s often not just a few days, is it? It might be weeks or even months.

[Patrick] Yeah. 

[Paul] That the bad guys are able to exploit a hole before anyone even realizes that something untoward is going on.

[Patrick] Well, yeah. I mean, the first timeline I would look at: if we look at all the vulnerabilities that are disclosed with exploitation evidence this year, 2025, we’re almost up to 300. I think we’re maybe at, like, 280 to 290 vulnerabilities that have a CVE with exploitation evidence. About a third of those vulnerabilities have exploitation evidence on the day the CVE was published or before. So what that tells me is attackers are moving fast.

[Patrick] They’re very opportunistic, discovering vulnerabilities in a multitude of different products and acting quickly. You know, a lot of times it could take days, months, even years before someone discovers a vulnerability exists or is being exploited, and we still see even older vulnerabilities being exploited for the first time. So there’s a factor of detection that goes on there as well. Does the organization or the product have the capability to detect exploitation? Most don’t.

[Patrick] If a third of the ones that are determined to be exploited in the wild are in fact known to be exploited on the day the vulnerability is issued, there’s probably a lot more that we just don’t know about. That’s the reality. 

[Paul] Joe, maybe I can ask you at this point, thinking back to Patrick mentioning Log4j, when you get some notification in the world that XYZ product or XYZ code or XYZ device has some kind of security hole in it, that information is potentially useful, but only if you know whether you have one of those anywhere in your organization. So how do you go about finding out what you’ve got and where, particularly in today’s environments where you’ve got a mixture of IT systems that are interconnected to your embedded devices and your OT systems, all in this giant interacting network? Where do you start? 

[Joe] Well, as you say, certainly, there could be vulnerabilities found in, let’s say, some open source software, and it ends up being the responsibility of the vendor who supplies the overall product to acknowledge that that component is in their software in order to say that software has that vulnerability contained within it.

[Joe] And so if it’s not the vendor issuing the disclosure, and, as in Log4j, the flaw was found in an open source component, some folks, the end users, the asset owners, the people who manage the technology that gets deployed in infrastructure, won’t necessarily know if they have that underlying component in the software to begin with, as you suggest. And so I think that’s part of where the trouble comes with communication between those who are deploying technology and those who are producing technology. And, of course, that’s where a Software Bill of Materials can come into play. That’s where having good disclosure systems and processes by the vendors who produce that technology comes into play. Because you can imagine there’s a lot of communication that needs to go back and forth between organizations, and there’s a lot of discovery to find out if in fact you do have that underlying component.

[Joe] And so from a Software Bill of Materials perspective, there’s a couple different challenges. Some folks in the embedded space will generate an SBOM from a binary, which may not get to all the underlying transitive dependencies in the software. 

[Paul] Yes. Because they’re just looking for strings that might be copyright messages that are thought to be unique to a particular application. But it won’t necessarily tell you about the 10 or 30 or 500 other software components that got sucked in at the time that the product was actually built.

[Joe] Right. So you could rely on heuristics to assume what packages and libraries are in the software. But for embedded systems, knowing exactly what goes into the binary, as you say, at build time gives you a complete view of the dependency tree and the transitive dependencies. It’s a tricky problem. That’s why we pay a lot of money for security researchers to look for things, and we hope they can find them.

[Joe] But there’s other things I think organizations are starting to do, and, certainly, an SBOM can help, when it is in fact produced, to provide that complete and accurate record of what goes into the software to begin with. 
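To make the binary-scanning approach Paul and Joe describe concrete, here is a hedged sketch of how a string-heuristic scan over a compiled image roughly works. It is an illustrative approximation, not any particular tool’s implementation, and the signature patterns are hypothetical placeholders rather than a real signature database.

```python
import re
import sys

# Hypothetical signature patterns; real binary-SBOM tools ship large signature databases.
SIGNATURES = {
    "openssl": re.compile(rb"OpenSSL (\d+\.\d+\.\d+[a-z]?)"),
    "zlib":    re.compile(rb"zlib version (\d+\.\d+\.\d+)"),
    "curl":    re.compile(rb"libcurl/(\d+\.\d+\.\d+)"),
}

def guess_components(path):
    """Best-effort guess at embedded components by scanning for telltale strings."""
    data = open(path, "rb").read()
    found = {}
    for name, pattern in SIGNATURES.items():
        match = pattern.search(data)
        if match:
            found[name] = match.group(1).decode()
    return found

if __name__ == "__main__":
    for component, version in guess_components(sys.argv[1]).items():
        print(f"{component} {version}")
    # Anything statically linked without a recognizable string, or any transitive
    # dependency that leaves no marker, is simply invisible to this approach.
```

The final comment is the point of the whole exchange: if a component leaves no recognizable string behind in the binary, a post-build scan will never list it, which is why capturing the SBOM at build time gives a more complete picture.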

[Paul] So, Joe, just to explain for our listeners the idea of a transitive dependency, don’t panic if you can’t remember transitivity, etcetera, from your high school mathematics. Loosely speaking, the idea is that if A depends on B and it just so happens that B depends on C, then, transitively, A depends on C whether you like it or not. And so if you only know about the first step in that chain, then you may not realize, as you say, that the product you’re using actually depends on a whole web of other stuff in a giant source code tree beneath it. As the user of the product, as the purchaser of, say, a medical device, you might be able to get some idea of what’s in a product, but the duty of care really lies with the person who creates the device in the first place, doesn’t it?

[Joe] Yeah. A hundred percent. And if you gather the information at build time, then you have some visibility into those third party libraries and other things that get pulled in that ultimately comprise those binaries. 
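Paul’s A-depends-on-B-depends-on-C explanation maps directly onto a graph walk. As a hedged sketch (the package names are invented for illustration), this is how a record of direct dependencies captured at build time can be expanded into the full transitive set:

```python
# Direct dependencies as recorded at build time (hypothetical package names).
DIRECT_DEPS = {
    "device-firmware": ["http-server", "sensor-driver"],
    "http-server":     ["tls-lib", "logging-lib"],
    "tls-lib":         ["crypto-core"],
    "sensor-driver":   [],
    "logging-lib":     [],
    "crypto-core":     [],
}

def transitive_deps(root, graph):
    """Return every component the root ultimately depends on (depth-first walk)."""
    seen = set()
    stack = list(graph.get(root, []))
    while stack:
        dep = stack.pop()
        if dep not in seen:
            seen.add(dep)
            stack.extend(graph.get(dep, []))
    return seen

if __name__ == "__main__":
    # "device-firmware" never names crypto-core directly, yet it still ships it,
    # so a crypto-core CVE is the firmware vendor's problem too.
    print(sorted(transitive_deps("device-firmware", DIRECT_DEPS)))
```

The walk is the whole trick: a complete SBOM has to include everything reachable from the root, not just the components the vendor chose on purpose.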

[Paul] And as you’ve said in previous podcasts, the closer to the build time that you measure the bill of materials, the more likely it is to be accurate. So if you just take a list of all the software components that you have in the cupboard at home, then you may overestimate what’s in there and cause people to waste time looking for things that aren’t there.

[Paul] But if you just rely on tasting the cake after it’s been baked and going, yeah, it’s probably got cumin in there, you might miss some of the more subtle ingredients. 

[Joe] And guess what, though, Paul? As we talk about some of those subtle ingredients, going back to Patrick’s point, those who are administering exploits and doing the research are well funded, and they might be finding those weaknesses even when the people who have deployed software don’t know they’re there.

[Paul] And if they’re using cumin, they might be using it to hide the taste of arsenic so that you don’t realize that that’s there as well. 

[Patrick] On the software development side, you know, I have a lot of friends in the medical device space and in critical infrastructure.

[Patrick] And the reality is you have amazing engineers building a product to do a thing. They do the thing that needs to get done, whether it’s detecting some variants of cancer or whether it’s an operating machine or whether it’s making sure your lighting system is functional. The challenge I generally see is that most of the people who build these things have no experience in security. They’re literally just assembling whatever packages they can so the thing achieves its intended outcome. And then you literally end up going, oh, well, we need to update this thing, so we need to connect it to the Internet.

[Patrick] Like, that’s the next logical thing in doing development, or we wanna collect analytics. And, unfortunately, you end up with these devices on the market for long periods of time that end up being Internet connected and that have no security developed from the start, because the intention was to get out a product that solves the problem, not to make it secure. And so, you know, I think that’s a lot of the reason why we’re generally in the place that we’re in as it relates to embedded systems and the challenges with open source software dependencies and all the vulnerabilities that are associated with that. 

[Paul] Now, Patrick, I don’t want to point fingers at any particular company here, but inadvertently, I’m going to do it. I spy with my little eye something beginning with I.

[Paul] They’re not alone in this. 

[Patrick] Yeah. 

[Paul] Lots of mainstream security device vendors in the past few years have been caught out. I won’t list them all, but think of a big vendor. They’ve all had vulnerabilities, not just in their regular products, but ironically, in the very products that are supposed to help you protect your network by providing some kind of perimeter defense.

[Paul] So you can take those potentially insecure embedded devices, put them behind a firewall, or put them inside, as Joe likes to say, the medieval castle, bring down the portcullis and everything’s safe, except that the portcullis turns out to have a bug in it. 

[Patrick] Yeah. 

[Paul] That allows people to wander in. The company beginning with I, I’m talking about Ivanti, obviously, they’ve had some quite spectacular security device bugs, haven’t they, that have been zero days? 

[Patrick] Yeah. It’s a real challenge.

[Patrick] Yeah. You look at some of these product lines. And I do think in that example, you essentially have what I would consider to be a private equity or holding company going and buying up products that have a tremendous amount of deployment, very pervasive across many different industries. And, unfortunately, these products have changed hands multiple times.

[Patrick] I think a lot of times we see that the engineers who originally built these products are probably no longer around. You might have turned over engineering teams multiple times. Ultimately, those products being decades old, they unfortunately probably haven’t been maintained to the extent that they should have been. So, ultimately, the edge has become, especially in the last couple of years, one of the primary targets for getting into an environment, SSL VPNs being a favorite, but other products as well. 

[Paul] And the irony is that that is the very product that is supposed to secure you, to let only the good guys in and keep the bad guys out.

[Paul] It’s probably got more insight into your super secret confidential traffic than any other part of your network. 

[Patrick] Yep. Ivanti got caught by surprise, which shouldn’t have been a surprise, but it seemed like they didn’t even know their own code base in the devices. And I still think that there are systemic problems in some of these very, very legacy products. 

[Paul] So, Joe, in this case, it sounds like a very special example of not knowing your own Software Bill of Materials.

[Paul] If the response is, oh, that was some open source thing we had that we didn’t even realize was in the product, well, if you’re selling it, you really ought to know that, shouldn’t you? If you care about what we’ve referred to in the past as Secure by Design, you aim to make it secure rather than wait for problems to be found and then try and paper over the cracks later. 

[Joe] Yeah. Absolutely. I’ll point out there’s other dimensions. Patrick talked about detection on devices and looking, you know, even for anomalous behavior. And some of those things can be software crashes that are unintended and look like bugs, but in fact might be malicious activity causing a crash on a device. And so I do think a form of monitoring and even monitoring for software crashes is a good practice for folks that are managing these devices.

[Paul] Yes. 

[Joe] And also building in security or adding security onto those devices, especially when nation states are involved. And if there was a country that begins with “C” involved in an advanced exploitation, you can assume that they’re well funded and looking to collect information to advance intelligence objectives and other things. There’s a lot here, and there’s a lot of well-funded adversaries and nefarious actors. What it means is there’s a whole slew of approaches and techniques to increase the resilience of your devices.
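As a hedged illustration of the crash monitoring Joe mentions, assuming a Linux-based device where the kernel logs user-space segfaults to the journal, a simple watcher could count crashes per binary and flag repeats. The log source, regex, and threshold here are illustrative assumptions, not a recommendation for any particular product:

```python
import re
import subprocess
from collections import Counter

# Kernel log lines for user-space crashes typically look like:
#   myproc[1234]: segfault at 0 ip 00007f... sp 00007f... error 4 in myproc[...]
SEGFAULT = re.compile(r"(\S+)\[\d+\]: segfault at")

ALERT_THRESHOLD = 5  # illustrative: repeated crashes of one binary are suspicious

def crash_counts():
    """Count segfaults per binary from today's kernel log (Linux, journalctl)."""
    out = subprocess.run(
        ["journalctl", "-k", "--since", "today", "-o", "cat"],
        capture_output=True, text=True, check=False,
    ).stdout
    return Counter(m.group(1) for m in SEGFAULT.finditer(out))

if __name__ == "__main__":
    for binary, count in crash_counts().items():
        if count >= ALERT_THRESHOLD:
            # A burst of crashes in one process can be a failed exploit attempt,
            # not just a bug; either way it deserves a closer look.
            print(f"ALERT: {binary} crashed {count} times today")
```

The design point is the one Joe makes: a crash is telemetry, and on devices with no endpoint agent it may be the only exploitation signal you get.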

[Paul] So, Patrick, what can software and hardware companies that aren’t really innovators themselves, but are, as you say, holding companies that buy up popular products and build them into a suite, do to make sure that they are not just living with the sins of the past? 

[Patrick] I think a lot more due diligence is one of them. And this applies to the OT space, which is very common as well. I think there’s generally a lack of due diligence on the product security side to understand if there is good discipline, what does the code base look like, do you have SBOMs, what vulnerabilities exist, do you have good discipline around vulnerability disclosure, are there CVEs assigned to these products? That to me is a sign of maturity.

[Paul] Right. Because vulnerability disclosure isn’t just about not being the kind of company that sweeps software bugs under the carpet and pretends they aren’t there. 

[Paul] Yep. A problem with vulnerability disclosure could indicate that you’re just not doing the security research thoroughly enough, so that you just don’t realize that there’s anything that needs sweeping in the first place. That’s a very bad position to be in, isn’t it? Almost worse. 

[Patrick] If you’re on the investment side, I went through an acquisition with Duo. We got acquired by Cisco, and the amount of due diligence they did on the security side was pretty significant. A big part of the process in that acquisition was ensuring they were acquiring something that was gonna be secure for their customers. And I think people are recognizing that probably more, but people get super excited about monetization as it relates to products and acquiring new lines and acquiring existing revenue streams, and they overlook the security components very frequently.

[Patrick] And so I think that’s a pretty big opportunity. Similarly, like, if you’re a consumer buying a product, you should probably think the same way. What practices does this organization have in place? Do they have a vulnerability disclosure process? What does that process look like?

[Patrick] Are there CVEs assigned to the vulnerabilities? Naturally, that’s gonna show you an aspect of maturity. Just because a CVE exists or is being exploited doesn’t mean the software is necessarily bad. I think that can also be a signal that there’s some level of maturity from a product perspective. Generally speaking, if you’re gonna be spending money with a product company, you can often ask for broader visibility in negotiation. When you’re buying a product is the best time to negotiate the transparency and the information you’re gonna receive from that vendor.

[Paul] So it’s not all about the amazing medical imaging and the fantastic super low power valve control. It’s also about how safe is this going to be in a week, a month, a year, a decade, perhaps even longer. 

[Patrick] Yeah. 

[Paul] Joe, there’s no reason why a well-designed embedded device that is not a general-purpose computer shouldn’t be able to be capitalized over decades, is there?

[Joe] Yeah. And you want that device to remain safe over that period of time and last a long time so that those that are buying those components and those gadgets and those devices know that they’re gonna be around for a long time because it is what makes infrastructure operate. With that, I do think, as Patrick says, asking those questions upfront, what are your practices? How do you respond to incidents? How do you patch software?

[Joe] And what other technology or what other security you build in to ensure that you can prevent exploitation in the first place. And so I think all of those things are fair questions. And as we’ve talked about and what Patrick’s getting at is the notion of the discipline of Secure by Demand. 

[Paul] So that means asking the right questions, not because you’re trying to be difficult, not because you just want the price to drop. You actually want to buy a product that is safe now, will probably be promptly fixed if there are problems in the future, and that will last you, particularly in an embedded device, not just for the three years of a typical laptop, but maybe for ten, fifteen, twenty, twenty-five years, like a pump room control system.

[Joe] Exactly right. 

[Paul] Lots of people go, well, have you got all the check boxes? And, Joe, you have some very strong feelings about compliance, don’t you? That, obviously, there are places like the automotive industry where it’s absolutely required that you have to meet the relevant standards or else you can’t sell your product. But getting that compliance is not an end in itself.

[Paul] It should be actually part of your culture, almost like the cybersecurity spirit in your organization, that you achieve compliance because you are capable of doing so, not simply because you have to. 

[Joe] Yeah. 100% right. And I think the kinds of questions and the discipline that Patrick was suggesting organizations could have really represents what the overall maturity of their software development process is, not just their security program. And those two things go hand in hand.

[Joe] And when it comes to safety in these industries, let’s say, autonomous driving or safety of flight or even industrial automation facilities, there is a significant consequence. And so there is always a risk that something could go wrong. And with that, if you’re simply trying to check the boxes for compliance, you might be missing an opportunity. And the opportunity is to embrace more mature software development practices, incorporate security into it, and have a more complete process so that your software has higher quality, fewer bugs, and most likely then a lower risk of exploitation. Or if there are vulnerabilities discovered, a faster time in mitigating or resolving those vulnerabilities.

[Paul] So this is a whole different way of looking at that famous, what is it, embrace, extend, extinguish metaphor. Embrace cybersecurity, extend it through your entire software development life cycle, and extinguish bugs before they get into the software or into the field, rather than rely almost entirely on bodging over them afterwards when somebody who could be a state-sponsored actor finds them and you find out about them in the wrong sort of way. 

[Patrick] There is one other thing I think that’s a big deal that’s coming with the CRA and NIS 2 in Europe. The reality is this is not gonna be optional anymore for organizations doing business in Europe. 

[Paul] Just to be clear, that’s the Cyber Resilience Act in the European Union.

[Patrick] Yes. 

[Paul] It’s not optional and with good reason. 

[Patrick] Yeah. Most products and companies that are substantial are doing business in Europe. So from my standpoint, these requirements are mandated, and it’s required to do disclosure.

[Patrick] It’s required to disclose exploitation evidence. You have to do it in very quick time frames to provide visibility downstream to consumers that products are being exploited. People need to know that they need to get mitigating controls or other things in place. And so I think, naturally, with some of the things we’re talking about, we’re just gonna see faster time to disclosure, more exploitation evidence, more transparency, generally speaking. You should be really mindful.

[Patrick] If you are already doing business through Europe or plan to go to Europe, you’re gonna have to take this stuff a lot more seriously than historically you have been. 

[Paul] And this is not just the governments of Europe being bossy or wielding their power for the sake of it. As Joe has put it before, it’s kind of like the stick for those companies with whom the carrot does not work. 

[Patrick] Yeah. 

[Paul] You owe it to your customers to know what ingredients go into the recipe.

[Paul] And if you won’t do it yourself, then you’re going to have to come to the party anyway. 

[Patrick] And you have to continue maintaining the products with free security updates for five years from the point when you end-of-life them. So there are just a lot of good things that the consumer should have been getting for years now. It’s probably gonna be a little bit of a mess, but I’m looking forward to it. 

[Paul] But, Joe, as you’ve said in the past about the CRA, it is perfectly reasonable for somebody who’s going to invest in a product that they would like to use for, say, n years to be told unequivocally upfront that it will be supported for those n years.

[Paul] And it should never become a sales gimmick for the vendor to say, oh, well, to get the patches, we’ve got a much more secure product, but you’ll have to buy all over again. Not only is that an unethical way to do sales, in the embedded device market it’s kind of an impossible way to run things, isn’t it? Because you can’t just go and change everybody’s pacemaker overnight. 

[Joe] Yeah. And especially in the medical device area where we do see a high percentage of buyers asking about security, one, because the consequence is, you know, significant in that industry, but two, they are making those capital expenses as we talked about.

[Joe] And so I think the concern over liability, the concern over safety, the concern over having products that stay relevant without getting egg on your face, these are all motivators to help boost your overall software processes. And I think if we elevate our software processes and make them more mature, the discipline that comes with that will improve your software from a security perspective as well. And that ends up being a differentiator. It could be a differentiator in the short term. It’ll certainly be table stakes for doing business going forward.

[Joe] But in the end, it’s important for the end users to know that their devices will be relevant and not compromised. 

[Paul] So to finish up, gentlemen, maybe I can ask either or both of you one last question. And that is, if you had to give our listeners one key piece of advice about how they could adapt or rethink their approach to intelligence about vulnerabilities and what to do about them, what would you say? 

[Patrick] Driving vendor accountability, I think, is a big one. If you’re buying and consuming a product from someone, you have some ability to be a part of security.

[Patrick] Now the bigger the company, the harder that becomes depending on your spend. But I think that’s one takeaway that can drive a lot of proper behavior for the greater good. And then there’s segmenting things off the Internet. Like, generally speaking, not everything needs to be Internet connected. Think about your strategy as far as how you’re segmenting things from access to the Internet, because all too often we see embedded devices and OT systems online. And naturally, those are the easiest ones to own.

[Patrick] And so that’s probably the best advice I have. 

[Paul] And if you are going to do that segmentation, the firewall slash secure router slash segmentation product that you choose, you really need to apply Secure by Demand to that so you don’t get into the Ivanti situation where the product that’s supposed to keep you secure is actually the thing that’s introducing an exploitable vulnerability. 

[Patrick] I think on that note, too, I generally advise people: you probably don’t wanna buy the security add-on or security product that does that from the OT vendor. That’s not their core business.

[Patrick] Do your due diligence. You’d wanna get a device in between you and the Internet that’s really gonna hold up. 

[Paul] Joe, what say you?

[Joe] Well, I like the idea of embracing disclosures. And when you do that, you have to answer a bunch of other questions.

[Joe] When you do disclose something, how are you gonna resolve it? How are you gonna share it? How are you gonna communicate it? And so I do like the idea that if you’re committed to disclosing your vulnerabilities, your entire organization’s gonna be committed to all the repercussions when something does get disclosed. And when you do that, you actually get ahead of the game instead of becoming reactionary.

[Paul] Exactly. Exactly. 

[Joe] Yep. 

[Paul] It’s not a sign of weakness to confront your own weaknesses and repair them proactively for the future, is it? 

[Joe] And I do think building security into your products matters; naturally, you can have other defenses around it as well.

[Joe] But I do think if you can build security into the products and eliminate exploitation of an entire class of vulnerabilities, then certainly that can extend the life of the products and maintain resilience. But it also can have operational impact. I’m delighted that RunSafe is a partner with VulnCheck, and Patrick’s on the front lines with people out in the industry talking about vulnerabilities and all of that. And, VulnCheck’s a great company, and we love working with VulnCheck here at RunSafe. I just wanna put that plug in.

[Patrick] Thanks. It’s awesome working with RunSafe as well. 

[Paul] Thank you, Joe. Thank you, Patrick. Thank you so much.

[Paul] Very deep thoughts about all of this stuff. But I think the key takeaway is don’t delay. Get started today. And if you confront your weaknesses, it is not a sign of weakness, it is a sign of strength. So that is a wrap for this episode of Exploited: The Cyber Truth.

[Paul] If you find this podcast informative, please don’t forget to subscribe so you can keep up with each week’s episode. Please also share with your colleagues and your friends. And remember everybody, stay ahead of the threat. See you next time.