Secure by Design: Why It’s More Than Another Buzzword

April 24, 2025

As cyber attackers become more advanced, “Secure by Design” has taken center stage as a key strategy for building resilient systems. In this episode of Exploited: The Cyber Truth, host Paul Ducklin welcomes back Joe Saunders, CEO of RunSafe Security, to explore why this philosophy matters—especially for critical infrastructure and national defense.

You’ll learn how Secure by Design is influencing software supply chain standards, what it means to transition to memory-safe languages like Rust, and how developers can protect existing systems with modern exploit mitigation techniques.

Whether you’re a security leader, developer, or policymaker, this conversation breaks down what you need to know to stay ahead of the next threat.

Speakers: 

Paul Ducklin: Paul Ducklin is a computer scientist who has been in cybersecurity since the early days of computer viruses, always at the pointy end, variously working as a specialist programmer, malware reverse-engineer, threat researcher, public speaker, and community educator.

His special skill is explaining even the most complex technical matters in plain English, blasting through the smoke-and-mirror hype that often surrounds cybersecurity topics, and helping all of us to raise the bar collectively against cyberattackers.


Joe Saunders: Joe Saunders is the founder and CEO of RunSafe Security, a pioneer in cyberhardening technology for embedded systems and industrial control systems, currently leading a team of former U.S. government cybersecurity specialists with deep knowledge of how attackers operate. With 25 years of experience in national security and cybersecurity, Joe aims to transform the field by challenging outdated assumptions and disrupting hacker economics. He has built and scaled technology for both private and public sector security needs. Joe has advised and supported multiple security companies, including Kaprica Security, Sovereign Intelligence, Distil Networks, and Analyze Corp. He founded Children’s Voice International, a non-profit aiding displaced, abandoned, and trafficked children.


Key topics discussed: 

  • Why “Secure by Design” is the future of software development
  • What the CISA pledge means—and why RunSafe joined it
  • How transitioning to Rust addresses deep-rooted memory safety issues
  • Practical ways to harden existing systems without starting from scratch
  • How Secure by Design supports national security and critical infrastructure protection

Episode Transcript

Exploited: The Cyber Truth, a podcast by RunSafe Security. 

[Paul] Welcome back to Exploited: The Cyber Truth. I am Paul Ducklin, and today, I’m joined by Joe Saunders, CEO and Founder of RunSafe Security. 

[Paul] Hello there, Joe. 

[Joe] Hey, Paul. It’s great to be here as always. 

[Paul] This week, Joe, our topic is Secure by Design: Why It’s More Than Another Buzzword. Now, it’s one of those things that’s easy to say and easy to take for granted. What do you mean? I bought software.

[Paul] You mean it’s not secure? You mean they didn’t care? And in fact, if you think about having something called Secure by Design, it’s almost as though the opposite is one of two things. One is that software that appears to be secure may be secure only by good luck, and it’d be much better if that were not the case, if it were actually planned. But the less attractive side of it is that perhaps we need Secure by Design because some software is, or was, insecure by intent.

[Paul] In other words, hey, it’s cheaper not to bother, so let’s see if we can get away with it. Secure by design sounds as though it’s necessary for a strange mixture of incompetence and intransigence both. 

[Paul] What do you say to that? 

[Joe] Well, I think there is a reality that humans are not perfect. But we learned a long time ago, if you look at quality programs, that there is a need to elevate your software development process to a sufficient level of quality and maturity to make sure you don’t have bugs or vulnerabilities in your systems.

[Joe] And, unfortunately, a lot of software inherently makes it very easy to overlook things in testing that do become vulnerabilities when the software is deployed out in the field. Ultimately, what we are trying to do with Secure by Design is to boost code quality and reduce the likelihood of bugs. And if you think about the best software companies in the world, I’m gonna venture to say that Microsoft has really good software development processes. Google has really good software development processes. Lockheed Martin has really good software development processes and tons of software engineers.

[Joe] They all produce software that has bugs in it, so even the best software companies in the world produce bugs. And if you go to smaller organizations that are less disciplined, then, given the variety of talent levels and process maturity levels, you can almost guarantee there will be bugs in the software code. Secure by Design is a way to say everyone can benefit from boosting their software development maturity and best practices. 

[Paul] Yes, indeed. In last week’s podcast, there was a bit that really struck me.

[Paul] You said, and I’m paraphrasing slightly here, that what an organization needs to do is to have systemic improvement that elevates and educates its software development process. It’s not enough just to go on the training course, and it’s not enough just to go, oh, I’ll try a little bit harder. You need both of those things in equal measure to make even a good thing better. 

[Joe] Exactly right. But the end goal is not to be secure per se.

[Joe] The end goal is to produce high-value software. And with that said, bugs and vulnerabilities prevent you from achieving that goal. They do go hand in hand. But if you were pressed on, should I release this product? In the age of the Internet, there’s one mindset that says, push the code out there; if it’s broken, we’ll fix it real quick.

[Joe] Now, that was the web development era, and we’ve moved way beyond that. We’ve gotten much better with automated tools, automated testing, and automated deployment. And so there are safeguards now in the process. There is an economic gain to investing in improving your software process. And that’s where education comes in, and the sharing of knowledge and everything else.

[Paul] I understand why, in the early days of Facebook, people jumped on that bandwagon of, hey, move fast and break things. The idea was, why wait? We can push out fixes quickly because of the Internet. But that’s not a good reason to push out bad software just because you can fix it later. As consumers, we’ve grown accustomed to the idea that if there’s an application update on our phones, it may take just a couple of seconds to reset that application, and then you’re off to the races. You can work, or play your favorite game, or do your productivity effort, or what have you. This is not true in all industries, though. 

[Joe] One of the issues I have around code quality is that the window of opportunity for attackers is actually much greater in other industries. If you think about industrial use cases, let’s say an energy grid, it might be very hard to push out updates around the world to those systems if they’re not Internet facing, if there’s not a clear line to upgrade, if there’s no automated over-the-air update that can be pushed. And so in other industries, those bugs and vulnerabilities stay out there much longer.

[Joe] You can imagine that an industrial control system, or a PLC, as they say, a programmable logic controller, might last inside the infrastructure with its software on it for thirty years, whereas a web-based application might get updated five times a day. That PLC does the same thing over and over and over again and is capitalized as an expense to the data center over a thirty-year period. So we have different economic scenarios to match, and that’s why Secure by Design is not one-size-fits-all across industries. You do have to tailor it to the process in your organization and the economics that drive your software development process. 

[Paul] And just because you can push out an update really, really quickly in an emergency in some scenarios, doesn’t mean that it’s necessarily the right way to solve the problem.

[Paul] I mean, we saw within very recent memory, I won’t mention the name, everyone will know who it is, a mainstream EDR software vendor push out not a product update, just a data definition update, that blue-screened Windows machines around the world. Oh, we desperately need to fix this possible attack vector that crooks might be using, and in the end, the cure was worse than the disease. And also, there are industry sectors, understandably, where a reluctance to push out updates exists for things like regulatory or even medical safety reasons. You wouldn’t like your CAT scanner suddenly to go into the middle of an update while it’s putting ionizing radiation through your head, would you?

[Paul] Like you say, there is a question of different strokes for different folks.

[Joe] Yeah. And also, you’ve heard me say this before, but there’s a reason it takes three to five years for a new model year of a car to be introduced and developed. You wouldn’t want it to be released in thirty days or fifteen days. Could you imagine the safety concerns people would have? Now, with that said, the lesson, and I’ll use the name, CrowdStrike... 

[Paul] You’re allowed to, Joe.

[Joe] The lesson in it is one of testing software before you release it.

[Paul] Amen, brother. Absolutely. Exactly. That gets to Secure by Design and the principles behind it.

[Paul] It’s interesting that you mentioned the automotive industry, because I can’t think of a better example of where the motto “move fast and break things” is ultra counterproductive.

[Joe] Exactly right. And I wanna be fair to CrowdStrike. They owned the problem. They stepped up to it.

[Joe] They led without shame. They led with, we acknowledge responsibility, and we’re gonna work with people to get this right. I think the world of their leadership for the way they handled the situation. They’re a vital company to the cybersecurity community and to business operations in general, and you know they took it seriously and responded with all the appropriate urgency.

[Paul] I agree.

[Paul] I’m sure they wish it hadn’t happened. But in the end, they managed to make it much less of an issue than the naysayers were screaming about when it first happened. So, Joe, CISA in the United States, the Cybersecurity and Infrastructure Security Agency, actually has a Secure by Design pledge that software companies can sign up for, and its website currently says there are 300 companies. One of the initial adopters was RunSafe Security. So what’s the pledge all about, and why did you sign it?

[Joe] Well, certainly, the pledge is a commitment to elevate your software development practices and build code quality and security into your software development process in general. We want fewer bugs and higher code quality, because bugs and vulnerabilities lead to compromise out in the field, especially when critical infrastructure is at stake. So it’s a national security issue. It’s an issue of protecting critical infrastructure, but it’s also about protecting consumers and users of mobile devices and consumer applications. And Secure by Design, ultimately, was not a mandate with a stick behind it.

[Joe] If you don’t do this, you’ll get punished. It was a mandate with best practices, combined with an education program and a carrot that says you’ll be much better off if you follow these principles. That’s the motivation behind Secure by Design, ultimately. And with that said, the pledge was part of that carrot. And for RunSafe, it was very, very important, although I had an issue with the first version when it was first released, which I’m happy to talk about.

[Joe] But when we signed, it was very, very important for us to demonstrate our leadership and our commitment to code quality, because as a startup company with obviously great backers behind us, our customers expect high quality, and we don’t wanna be the weak link in their software supply chain ourselves. So we wanted to lead by example and adopt Secure by Design. And I have to give credit to my CTO, Shane Fry, for embracing it, and my cofounder, Doug Britton, for staying on top of Shane as they talked about what we should do when Secure by Design came out and what we were doing to promote best practices ourselves. It was a team effort that made it very obvious that we should sign the pledge, and, ultimately, it was to better serve our customers and give us a good common framework and road map to deliver on that promise. 

[Paul] Well, Joe, you just said something a few moments ago that I do want to take you up on and form into, if you like, a sort of double-sided question.

[Paul] What did you have an issue with at the beginning that you had to work your way through? And having decided that this was something important to get behind and be part of, what have you and RunSafe Security actually learned? How have you benefited from signing up to something which sounds like quite a serious commitment? In other words, companies might be afraid of doing it because they think, oh, it’ll just be cost and no benefit. 

[Joe] Well, I couldn’t have asked for a better question, and I’m glad it wasn’t scripted this way.

[Joe] But my initial issue and our lesson go hand in hand. What CISA originally expected, and I presented this to them and talked to them about it, and I talked to the ONCD a bit about it, and I talked to industry players about it as well, is that organizations would simply rewrite all their software in a memory-safe language, eliminating the vulnerabilities that would otherwise be introduced; that was the initial guidance for achieving memory safety in software. So that means rewriting all your products in Rust. It just took a little bit of time. The folks at CISA were very open to engaging and working through some slight modifications and things like that.

[Joe] There are 800 billion lines of existing software code in critical infrastructure. And you can’t tell me it makes any economic sense to rewrite all that software in a memory-safe language at the get-go. It’s simply impossible. Think about a weapons program: they would never rewrite their software written in C or C++ in Rust, nor would Schneider Electric, nor would Duke Energy want Schneider Electric simply to replace devices on which they have a ten-, twenty-, thirty-year capital expense to amortize. And so that was the issue I had up front: trying to help industry and policymakers and CISA understand the implications of expecting industrial software developers and embedded software developers to rewrite their software in Rust.

[Joe] Once that was changed, we were fully on board, and we wanted to lead from the front. 

[Paul] Because, Joe, if you think about just ripping and replacing, it sounds like a great idea, but (a), as you said, there are 800 billion lines of code, and (b), particularly for older devices, there may not be mature development tools that even let you do it. There is a sense that in some cases we do have to live with the past, but we don’t necessarily have to live with the sins of the past. We can take something that was imperfect and actually improve it anyway, without ripping it out and replacing it entirely. 

[Joe] Yes. Along those lines, there have been some lessons that we’ve learned in going through it ourselves, which I also wanna highlight: RunSafe rewriting its own software in Rust. One of the principles in Secure by Design is to rewrite your software in memory-safe languages. There were a number of benefits, and there were some shortcomings that we discovered in the process. We initially had one engineer rewrite about 30,000 lines of code, converting it from C++ to Rust.

[Joe] Guess what? That first-time effort to refactor and rewrite the software took about three months to get everything ironed out, integrated, and tested. You know, it’s not a straightforward, easy push-button process where you can just convert the software. With that said, we saw a reduction in file sizes.

[Joe] We saw some performance improvements over C++. We also had some other good surprises, improving the structure of our code. But then there were still two other problems we saw. There were some compatibility issues with other components we had, so not all of our software, even in scope, could be rewritten in Rust. And finally, maybe the biggest lesson for us: at least today, software written in Rust cannot be certified for safety of flight and other safety compliance programs. There are some limitations there.

[Joe] We were able to share all of this. Secure by Design has a program. My colleague Shane Fry, our CTO, testified before Congress and the subcommittee for critical infrastructure protection last December on our journey to Secure by Design and the benefits of it. And we also speak to peer groups organized by CISA about the lessons. The C-to-Rust conversion in our process really puts some benchmarks out there for other people to aspire to in terms of how they might convert their code to a memory-safe language.
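
To make the memory-safety payoff concrete, here is a minimal Rust sketch, illustrative only and not RunSafe’s code, of the classic out-of-bounds write. The equivalent C++, writing one element past a 16-byte array, compiles cleanly and silently corrupts neighboring memory; in safe Rust the same mistake is either a guaranteed panic or an explicit, handled failure:

    // Illustrative sketch, not RunSafe's code: an off-by-one write.
    // In C++, buf[16] = 1 on a 16-byte array compiles and scribbles
    // over whatever happens to live next to the array at run time.
    fn main() {
        let mut buf = [0u8; 16];
        let i = 16; // one past the end
        // buf[i] = 1; // in Rust this is a guaranteed panic at run time,
        //             // never silent corruption, so the bug cannot hide
        match buf.get_mut(i) {
            Some(slot) => *slot = 1, // idiomatic: handle the bounds check
            None => eprintln!("index {i} is out of bounds; nothing written"),
        }
    }

This is exactly the class of vulnerability, buffer overflows and their relatives, that a rewrite like the one Joe describes aims to eliminate wholesale rather than patch one CVE at a time.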

[Paul] I think there are quite a few people who aren’t programmers themselves, or who don’t run a technology company, who have a quick look around on the Internet and see, hey, there are tools that can do that. I mean, C is just a programming language. Rust is kind of the same thing, but with a different compiler that tries to be more careful. Why don’t you just take the C code, stick it into this magic machine, and out comes the Rust code, and you’re done? And I was intrigued to go looking at some of these tools.

[Paul] Perhaps the best known one is c2rust. And what’s great is that they don’t overpromise. They have a little diagram on their GitHub page, and it shows stage zero, insecure C code. Then there are some gear wheels that you see the code going through, and it says stage one, Rust code (insecure). Then there’s a little box with a human sitting in it, and there’s a loop in there that goes back through that human several times.

[Paul] So even if you can translate the code automatically, or transpile it, meaning convert it from one language to another, that doesn’t magically make it correct. After all, if the whole purpose is to fix the fact that it wasn’t correct before, why would you expect it magically to come out correct on the other side? 
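
To illustrate Paul’s point about stage one, here is a hand-written sketch, not actual c2rust output, of the gap between mechanically translated Rust and the idiomatic rewrite that the human in the loop eventually produces:

    // Hand-written sketch, not actual c2rust output: summing a buffer.

    // Stage one, "Rust code (insecure)": the C pointer arithmetic
    // survives verbatim inside `unsafe`, and so do its out-of-bounds
    // risks. The compiler cannot check any of this for us.
    unsafe fn sum_transpiled(buf: *const u8, len: usize) -> u32 {
        let mut total = 0u32;
        let mut i = 0;
        while i < len {
            total += u32::from(unsafe { *buf.add(i) }); // raw pointer read
            i += 1;
        }
        total
    }

    // After the human pass: a bounds-checked slice, no `unsafe`, and
    // the compiler now enforces memory safety on every access.
    fn sum_idiomatic(buf: &[u8]) -> u32 {
        buf.iter().map(|&b| u32::from(b)).sum()
    }

    fn main() {
        let data = [1u8, 2, 3, 4];
        let raw = unsafe { sum_transpiled(data.as_ptr(), data.len()) };
        assert_eq!(raw, sum_idiomatic(&data));
        println!("both versions agree: {raw}");
    }

Both functions compute the same result, but only the second lets the compiler prove there is no out-of-bounds access, which is the whole point of the exercise.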

[Joe] 100%. You know, I think there’s an acknowledgment here that no one will even attempt to rewrite all software. We all know that legacy code sits out there for a long time.

[Joe] I simply can’t imagine it; you would have to double the defense budget, I think, to rewrite all weapons programs in a new programming language, for these reasons. It’s not an automated process. There’s not a large language model that’s just gonna do it magically in a couple of hours or a couple of days or even a couple of weeks. That just doesn’t exist. And, of course, that’s the goal, and we all wanna get to that state.

[Joe] And there’s a lot of innovation happening in generative AI to improve software development processes, for sure. And I know many organizations that incorporate GenAI into their software development life cycle, but they all have that human in the loop, as you described with c2rust. And that was part of the education that we at RunSafe tried to bring to the industry and to CISA about the reality of this. And that’s why, if I go back to that NSA guidance from 2022, it said to rewrite your software in a memory-safe language or implement other forms of mitigation. And I think what CISA missed originally was the or.

[Joe] And so we got that included. 

[Paul] It sounds as though they thought it was an exclusive or. You pick one or the other, and they went, okay, let’s go for the bigger, better-sounding one. But you’re right, there’s a mixture of both.

[Paul] As indeed you explained you found: after you’d converted all this stuff to Rust, there were some system components, either because of the nature of the hardware itself or presumably because of other people’s code that you had to rely on, that needed essentially unsafe Rust, carefully curated by a human, to interact with them. So it’s impossible just to rewrite everything. There’s a balance, even if you want to try and change as much as you can. 
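
For readers who haven’t met “unsafe Rust”, this is roughly what that carefully curated boundary looks like: a thin unsafe shim around one legacy C function, kept as small as possible so everything else stays in safe Rust. This is a generic sketch; legacy_checksum is a hypothetical name standing in for an existing C component, and the stub below stands in for the real C object file so the example runs on its own:

    use std::os::raw::{c_uchar, c_uint};

    mod ffi {
        use std::os::raw::{c_uchar, c_uint};
        extern "C" {
            // Hand-declared (or bindgen-generated) to match the C header.
            pub fn legacy_checksum(data: *const c_uchar, len: c_uint) -> c_uint;
        }
    }

    // Stand-in for the legacy C implementation so this sketch links and
    // runs by itself; in a real build this symbol comes from the C code.
    #[no_mangle]
    extern "C" fn legacy_checksum(data: *const c_uchar, len: c_uint) -> c_uint {
        let mut sum: c_uint = 0;
        for i in 0..len as usize {
            sum = sum.wrapping_add(c_uint::from(unsafe { *data.add(i) }));
        }
        sum
    }

    // The safe wrapper: all the unsafety is confined to this one small
    // function, which upholds the C contract (valid pointer, true length).
    fn checksum(data: &[u8]) -> u32 {
        unsafe { ffi::legacy_checksum(data.as_ptr(), data.len() as c_uint) }
    }

    fn main() {
        println!("checksum = {}", checksum(b"hello"));
    }

The design point is the one Paul makes: the unsafe code doesn’t disappear, but it shrinks to a reviewable handful of lines that a human curates, while the rest of the program gets the compiler’s full safety guarantees.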

[Joe] Yep. And it ends up being like any other software development project.

[Joe] So how much money are you willing to spend? How much time are you willing to take? And what is the scope of software that you wanna rewrite, and for what reason? So another lesson we had was to really prioritize the sections of your code that you may wanna rewrite in a memory-safe language. But you have to factor in the technical feasibility of compatibility with other components.

[Joe] It’s not as straightforward as just pushing the magic button. But with that said, the intent is there to elevate everyone’s security in their code by developing this North Star, Secure by Design, and having principles and having people pledge. And in that regard, I think CISA has done industry a great service without mandating it. 

[Paul] Now, Joe, there are two sides to the coin of the pledge. They’re not opposing sides, they’re complementary, but they are quite different.

[Paul] One is that there are a few things that you jolly well expect people to stop doing. The very basics: if you still have default passwords in your product, then please, folks, we’ve known since 1943 that you shouldn’t do that. Let’s stop doing that. But the big part of it is not about what you stop doing. It’s about what many companies need to start doing that maybe they haven’t got to yet.

[Paul] And that is learning how to live in a world where you don’t get to brush security problems under the carpet. So I’m talking about vulnerability disclosure, vulnerability reporting programs, the ability for customers to report things without facing a lawyer instead of a techie. Can you just expand on that a little bit, the stuff about vulnerability disclosure and how it can make you a richer company through what you might even call your social contract? 

[Joe] Absolutely. If I go back ten or fifteen years, we would see organizations like Uber and others.

[Joe] They might not disclose a major security vulnerability. Society has come a long way. CISOs are champions of trying to disclose sooner, and corporate boards have learned. You know, it may sound like I’m picking on Uber. I’m not.

[Joe] If you don’t have the right communication policy and crisis management in place, those decisions become very hard in the moment, trying to figure out what is right. But Secure by Design is meant to address that. You’ve got your development practices improved, but you also have your vulnerability disclosure policy in place. You are issuing security updates. You are managing CVEs, working with them, and disclosing.

[Joe] The key is disclosure. If you can get your practices in place where you’re confident to disclose, then everybody wins. Your customers win. You win. These all tie together.

[Joe] That’s why I think what CISA did is so good: reducing entire classes of vulnerabilities. If you’re able to eliminate memory-based vulnerabilities, then imagine what that means if you have a future disclosure requirement. If you’re protected from your software’s memory being exploited and you have a memory vulnerability, you should be confident to disclose it. And so what we wanna do at RunSafe is help people write those disclosures as part of our process. That’s how we’re going to engage with people.

[Joe] Part of the benefit of adding in security exploit prevention is to be more confident to disclose. And so security patches, reducing entire classes of vulnerabilities, and vulnerability disclosure all go hand in hand. And using that as a form of achieving transparency means that you’re engaging with your customers. So I think they all tie together. It’s almost like if you get the foundation right, then these other things make sense, because they all start working in collaboration with each other.

[Paul] It’s hard to see how you are not in a better state morally and economically than if you try to pretend it never happened. 

[Joe] Yeah, 100%. And that’s why I think the timing has been right for Secure by Design these past couple of years. Memory-based vulnerabilities have existed for thirty years, and if there’s no clear solution other than sweeping them under the rug or chasing, fixing, and patching, that’s a losing proposition.

[Joe] Right? It slows everyone down. We wanted to change the game, and I say the NSA did everyone a service by putting that guidance out there. 

[Paul] So, Joe, fascinating to hear what you as a company have learned, and why you think it’s made not just you and the company better, but actively benefited your customers and the whole community. It’s great that you’ve acted because you thought it was a direction you ought to move in, as I said, both morally and economically.

[Paul] Maybe we can wrap up by talking about something else that CISA has done, which is Secure by Demand. It’s not a pledge, but a whole load of guidelines that they created specifically to be a counterpart to Secure by Design. Why don’t you tell our listeners what Secure by Demand is all about, and how, if they’re not programmers but they are consumers of products, they can get stuck into it? 

[Joe] Well, I think it is a natural complement to Secure by Design. If you are a recipient of software from someone, you may want to understand what their software practices are. And there are a lot of different things that organizations could do, and there are many standards out there that might work.

[Joe] One of them is to teach people the important things to ask your suppliers about their software development processes and about the software that they ship. And so you can imagine that asking for an SBOM enhances the ability of the software supplier to be transparent about the components in their software. 

[Paul] Joe, just for our listeners: SBOM, “software bomb”, sounds like a bit of a disaster, doesn’t it? But it’s a software bill of materials. So it’s kind of like when you buy foodstuffs these days and you take a jar off the shelf: you expect to be able to turn it round and see not what’s supposed to be in it or what it tastes like, but what is actually in it, so you can make a much more informed decision.

[Paul] And again, nothing swept under the carpet. It also means if you need to fix something, like if suddenly an additive to foodstuffs is deemed unsafe, the supplier knows which products they have to change. Because if they don’t know, what on earth are they going to do about it? 

[Joe] And imagine... so part of this comes from the whole Log4j scenario. 

[Paul] Oh, one of the many Christmas presents that the cybersecurity industry got dumped with.

[Joe] And it wreaked havoc on people’s time. 

[Paul]  For months and months and months. 

[Joe] And people didn’t know if they had the Log4j component in the software that they received from their suppliers. So guess what? There were a gazillion phone calls going back and forth.

[Joe] There were questions. Even the suppliers didn’t know. I don’t even know if that’s in my software. I have to go analyze it. I have to figure it out.

[Joe] I have to ask my supply chain. I have to ask my third-party providers. I have to look in my open source software. And so if you can standardize that process... all of this comes back to Secure by Demand. If you’re a customer receiving software, you should want a software bill of materials, an SBOM, to come with it.

[Joe] And that way, you know what’s in it. You can communicate vulnerabilities, so you get to the disclosure side of it. You can communicate CVEs and fixes: what are the CVEs associated with the components in the software you’ve just received? Secure by Demand is a way for you to ask good questions of your suppliers, and it naturally puts pressure on the suppliers to adopt Secure by Design principles. When companies are demanding this, suppliers know what they need to deliver on.
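
To give a feel for what Joe is describing, here is a minimal illustrative SBOM fragment in the CycloneDX JSON format, one of the widely used SBOM standards (SPDX is another), listing the vulnerable Log4j component from the story above. A customer can search a file like this for an affected name and version in seconds instead of phoning the supplier:

    {
      "bomFormat": "CycloneDX",
      "specVersion": "1.5",
      "components": [
        {
          "type": "library",
          "name": "log4j-core",
          "version": "2.14.1",
          "purl": "pkg:maven/org.apache.logging.log4j/log4j-core@2.14.1"
        }
      ]
    }

The purl (package URL) pins down exactly which artifact is inside, so matching it against a CVE advisory can be automated rather than argued over by phone.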

[Joe] The two are meant to reinforce each other, but Secure by Demand is also meant to elevate transparency about what’s in the software, for the benefit of boosting the security posture not only of the software that’s being shipped, but also of the asset owner or the enterprise that’s receiving the software, who is managing their software infrastructure and doesn’t want it to go down. 

[Paul] And it’s not just about teaching customers to be difficult and learn how to point fingers and negotiate prices down, is it? There is an element, to borrow your words, that helps purchasers, software users, elevate and educate not only their procurement process, but also the processes that they have in their business for watching out for, reporting, and dealing with bugs if they find them. So it also makes you a better corporate citizen. Just sitting on your hands and waiting for the vendor to fix it is one way to deal with it.

[Paul] But it’s much better if even though, essentially, you’re the victim, if you can actually come out and help the community fix the problem faster, better, and more permanently. 

[Joe] Absolutely. And I think, within that, if you get that standard way of operating within your supply chain, or your software supply chain, it lends itself to even more automation. So if there are expectations on delivery of fixes and updates, well, there are two sides to that coin. Someone’s gonna ship a fix or a patch.

[Joe] Someone has to apply the patch. If the tools used are more automated, more consistent, and more predictable, then everybody wins and we’re more efficient. So, ultimately, we can be more efficient in our security practice by producing better-quality code and having more complete integration across the supply chain, or the software supply chain, to deliver bug-free software. 

[Paul] So, Joe, would you agree that this is one clear example where even fierce competitors in an economic space can actually compete better if they cooperate on the things that matter, in particular standing up against people who wish us harm, who want to steal our money, our intellectual property, our civil rights, or whatever it might be?

[Joe] I do.

[Joe] And what I would say is, if you’re in a highly competitive industry, then security does become a differentiator of sorts. What is not defensible, what will not earn you premiums on the products you sell, is to say, oh, we’re just working harder and putting more people on our security research team. That’s not enough. So elevating your end-to-end software development process, integrating tools that elevate your security posture, adding in security at build time, ensuring that you have less buggy code, a lower rate of bugs in your code: all of that adds up. It’s not defensible to say we’ll just add more people at the end to improve our patching process.

[Joe] Managing just one metric, avoiding downtime or mean time to resolution, is not defensible either; in fact, that’s exactly what we wanna get away from. We want complete improvement across the board, from the supply chain through to software runtime, so that everybody’s security posture is elevated. 

[Paul] Absolutely.

[Paul] You mentioned Log4j, which was a bug in a Java logging library that could actually be exploited from afar. And you might not even use Java in your product, but if you pass your data on to the next guy, they could get toppled. I think that was a great example of why it’s not acceptable, if you’re in a competitive environment, just to go, you know what? I’m not affected by this. I don’t use Java. I don’t use Log4j. I’m just gonna stand aside and let the bowling ball knock down the skittles that belong to the other guys. 

[Joe] Yes. And so, ultimately, I think elevating our software development practices, elevating our transparency, elevating visibility, and automating processes: these are the right ways to invest in your software development process, so that everybody gains and everyone’s security posture is raised.

[Paul] Joe, I think that’s a fantastic way to finish up, and I think it really reinforces our title, if I can go back to that: Secure by Design, Why It’s More Than Another Buzzword. It’s almost a way of life that, if you adopt it, will improve you and your colleagues and your company and your customers and your competitors and society and everybody. So thank you so much for your time and passion again, Joe. Thanks to everybody who tuned in and listened.

[Paul] That’s a wrap for this episode of Exploited: The Cyber Truth. If you found this insightful, please be sure to subscribe to the podcast and share it with everyone else on your team. Remember: stay ahead of the threat. See you next time.
