Security Without Code Changes: A Path Forward for FDA Compliance

June 05, 2025
 

In this episode of Exploited: The Cyber Truth, host Paul Ducklin dives into the urgent challenge of securing legacy medical devices in today’s increasingly regulated cybersecurity environment.

Guests Phil Englert, Vice President of Medical Device Security at Health-ISAC, and Joe Saunders, Founder and CEO of RunSafe Security, explore what’s realistically possible for healthcare organizations and device manufacturers facing FDA expectations—when rewriting code just isn’t on the table.

From Software Bill of Materials (SBOMs) and shared accountability to the tension between patching and continuity of care, this conversation offers a clear-eyed look at how to secure critical systems with limited options.

Whether you’re a security leader, compliance officer, or device manufacturer, this episode offers practical insights for reducing risk while keeping devices online and patients safe.

Speakers: 

Paul Ducklin: Paul Ducklin is a computer scientist who has been in cybersecurity since the early days of computer viruses, always at the pointy end, variously working as a specialist programmer, malware reverse-engineer, threat researcher, public speaker, and community educator.

His special skill is explaining even the most complex technical matters in plain English, blasting through the smoke-and-mirror hype that often surrounds cybersecurity topics, and helping all of us to raise the bar collectively against cyberattackers.

LinkedIn 


Joe Saunders:
Joe Saunders is the founder and CEO of RunSafe Security, a pioneer in cyberhardening technology for embedded systems and industrial control systems, currently leading a team of former U.S. government cybersecurity specialists with deep knowledge of how attackers operate. With 25 years of experience in national security and cybersecurity, Joe aims to transform the field by challenging outdated assumptions and disrupting hacker economics. He has built and scaled technology for both private and public sector security needs. Joe has advised and supported multiple security companies, including Kaprica Security, Sovereign Intelligence, Distil Networks, and Analyze Corp. He founded Children’s Voice International, a non-profit aiding displaced, abandoned, and trafficked children.

LinkedIn

Guest Speaker – Phil Englert, VP of Medical Device Security at Health-ISAC

Phil works with Medical Device Manufacturers (MDMs) to help improve privacy and security while coordinating with Health Delivery Organizations (HDOs) to ensure implementations are practical and achievable. Phil is a subject matter expert and contributor to Health-ISAC’s Medical Device Security Information Sharing Council (MDSISC). Phil is also active in the cybersecurity community and contributes to regulatory and standards efforts including HSCC, MITA, CISA, AAMI, MDIC, & MITRE. He has over 30 years of technical and operational leadership experience in healthcare and life sciences. Previous positions include Chief Product Officer at MedSec, Global Leader for Medical Device Technology at Deloitte, Vice President of Operations at MDISS, and National Director of Technology Operations at Catholic Health Initiatives.

LinkedIn

 

Key topics discussed: 

  • What “security without code changes” really means in healthcare
  • Why legacy medical devices are difficult to secure—and what still can be done
  • The importance of SBOMs for transparency and visibility
  • How evolving FDA expectations are influencing both manufacturers and providers
  • Why cybersecurity and patient safety must now go hand in hand

Episode Transcript

Exploited: The Cyber Truth, a podcast by RunSafe Security. 

[Paul] Hello, everybody, and welcome back to Exploited: The Cyber Truth. And our topic this week is security without code changes, a path forward for FDA compliance. I am Paul Ducklin, joined by Joe Saunders, CEO and founder of RunSafe Security. 

[Paul] Hello, Joe. Welcome back. 

[Joe] Welcome. Thank you. 

[Paul] And we have Phil Englert, who is VP of Medical Device Security at Health-ISAC. 

[Paul] Hello, Phil. Great to have you. 

[Phil] Thanks for having me on, Paul. Good to meet you, Joe. 

[Paul] Now for those who don’t know what ISAC means, it’s short for Information Sharing and Analysis Center. So, Phil, you’re part of an industry body, a global industry body for the health sector, and you are all about sharing and community action to make cybersecurity a better thing for everybody, aren’t you?

[Phil] Yeah. So our whole purpose is to create mechanisms for our members, and there are 900 member organizations. We like to say you have 12,000 analysts working for you, but they share infosec information across the health care sector, whether that’s manufacturers, big pharma, health care delivery organizations, HIEs, the health information exchanges, or the software manufacturers, the EHR companies. If we can share information often and early, we believe we can help our members defend, detect, and respond in a much more rapid manner. And, indeed, things that happen in one part of the health care industry can have traumatic and devastating effects on other parts, even in ways that the average person probably doesn’t think of.

[Paul] And an example that we were speaking about before the podcast was, I believe it was a blood bank in New York that was unable to process blood to give to people having surgery or accident victims because the labeling company had been hit by a cyberattack. 

[Phil] Yeah. So it was the labeling machines in the labs. They couldn’t ship any of their blood products. I don’t think the labeling machines were targeted as such.

[Phil] I think they were caught up because they were vulnerable because they’re IoT devices that didn’t have the security controls that you would hope to be present on them. 

[Paul] Yes. Because you think, oh, it’s a labeling machine. Like, how hard can it be? I’ve got a label printer at home.

[Phil] And who would want to hack it? 

[Paul] But, you know, if you’re about to receive blood from somebody else, it’s rather important that there’s no mix up. 

[Paul] Phil, perhaps you can say something about how the issues concerning the security of medical devices, and in particular compliance with FDA regulations for medical devices, have evolved and changed over your extensive experience in the industry. 

[Phil] When I was servicing med devices, cybersecurity was not a concern. As a matter of fact, something like a cardiac PACS was a server underneath the cardiologist’s desk.

[Phil] Physiological monitoring systems, there was one for each department, and it was not connected beyond the equipment closet at the end of that department. So we’ve connected these devices for a number of reasons: for building efficiency, operating efficiency within health care, for collecting the information and being able to show accountable care improvement. And if you think about a medium-sized hospital, let’s say 300 beds, in the United States today, it will produce about 1.37 terabytes of data a day. 

[Paul] A day? 

[Phil] It’s crazy.

[Phil] And a lot of it comes off of the extraordinary diagnostic imaging that we can do today. 

[Paul] Right. 

[Phil] We’ve painted a large target on our back because there’s patient data in there. And in the United States, we have the HIPAA regulations. When they first came out, they were meant to foster data exchange between entities.

[Phil] If you leave your job and go to another job, if you leave your care provider and go to another care provider, that information could be shifted. That was the impetus for HIPAA. We realized that in sharing that data, we created a lot of security risks around protecting that data as well. And a few years later, we came out with the HITECH rules, which were very similar from a technology perspective to PCI. We’re exchanging data.

[Phil] We just need to protect the transactions and the data en route and on the endpoints. So when a health care organization gets breached, their first mission is to restore services. 

[Paul] There are some very specific regulations in health care from the FDA, aren’t there, that mean that the attitude we take towards updating things like mobile phones or, as Joe and I like to talk about, web apps, which you can literally do between one web visitor and the next. That kind of “let’s just patch it and see” does not exist in the medical profession, does it? 

[Phil] So it doesn’t broadly exist.

[Phil] I won’t say that it doesn’t exist at all. I think we’re beginning to develop these techniques that allow us to update on the fly, and update at a much more rapid rate. But for a lot of legacy devices and devices that were not designed to be updated, that is a challenge. 

[Paul] Joe, this is reminiscent of a talk we had in an earlier podcast, isn’t it? Where you were talking about a very, very simple kind of device, such as an actuator that operates a valve for flood defenses or something like that, where the device has exactly one function and it must perform it exactly correctly every time within a certain period, with no “it’s going to take a little bit longer this time.”

[Paul] It’s the same sort of problem, isn’t it? That changes might feel necessary, but at the same time, they bring a risk. 

[Joe] Absolutely. And we obviously realize that medical devices need to be reliable. They need to be rugged to withstand the conditions.

[Joe] And, also, they have to last for a long time because health care systems and providers are making investments in the technology, and they need to use that technology over a good extended period of time. There are hundreds of millions of outpatient visits. There are hundreds of millions, if not billions, of physician office visits. There are billions of prescriptions. And, you know, let’s face it, the health care industry in general is one of the major sectors in our economy and is very vital to the well-being of our society.

[Joe] And so if you think about the importance of making sure medical devices operate normally and reliably, given all the use and given all the conditions in which they operate, safety is, you know, obviously paramount for such devices and working in ways where they’re reliable from the start. So there’s good reason to have organizations like FDA and others making sure that medical devices meet certain standards and regulations. 

[Paul] So in a world of that sort, if you need to retrofit additional security and safety to devices that aren’t likely to get changed out for another five, ten, maybe even fifteen years, how can you remain compliant with strict regulations about correctness without rewriting the code entirely or without starting afresh? 

[Joe] I mean, from my perspective, it’s building security into the products before, say, a product hits the US market, and having a process to do that. And certainly, then as a result of that, you wanna have ways to make sure those devices remain resilient and have security built in.

[Joe] And so I do think, having a forward thinking software development process that not only meets the standards of the known vulnerabilities, but having security built in so that it withstands any future disclosures of vulnerabilities on such devices. So there’s certainly ways to do that in the software development process, and certainly identifying all the components that go into a medical device are important. But I’m sure we’ll get into many more of those details as we keep talking. 

[Paul] When we use a term like security without code changes, what mitigations can you apply?

[Phil] So there’s a couple of things. Right? One, in today’s litigious world, health care providers, device owners are not in a position to reengineer a device. But as a health care provider, where I go is compensating controls. If this control is not available, what can I do in my environment to compensate for that?

[Phil] And for the most part, for medical devices, you know, it comes down to two things. Can I increase my monitoring, and can I isolate the device? Can I do segmentation? So if it were to be hacked, I’d reduce the blast radius. But those are challenging given the diverse kinds of equipment.

[Phil] Everything from implantable pacemakers, even pressure gauges that are the size of a paper clip perhaps or or the size of a key fob, to the big iron devices, you know, the MRIs that are several tons, helium filled. It takes a forklift to do any kinds of changes on them. So the scope of understanding where all my risks lie is overwhelming. Automation, the ability to monitor network traffic, identify normal communication protocols, and be able to have real time reporting is a very tough thing to do. It requires a lot of strategic planning.

[Phil] You’ve gotta have expertise. You’ve gotta have advanced analytics on the monitoring side. 
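Phil’s compensating-controls point, isolating a device and reducing the blast radius, can be sketched as a default-deny allowlist of permitted network flows. This is a toy illustration only, not any vendor’s tooling; every device name, host, and port below is hypothetical:

```python
# Hypothetical allowlist of flows permitted out of an isolated medical-device
# segment. Everything not explicitly listed is denied, which shrinks the blast
# radius if a device on that segment is compromised.
ALLOWED_FLOWS = {
    # (source device, destination host, destination port)
    ("infusion-pump-07", "ehr-gateway.internal", 443),
    ("patient-monitor-12", "central-station.internal", 2575),
}

def is_permitted(src: str, dst: str, port: int) -> bool:
    """Default-deny check: only explicitly allowlisted flows pass."""
    return (src, dst, port) in ALLOWED_FLOWS

print(is_permitted("infusion-pump-07", "ehr-gateway.internal", 443))  # True
print(is_permitted("infusion-pump-07", "updates.example.com", 80))    # False
```

Real segmentation lives in firewalls, VLANs, and NAC policy rather than application code, but the default-deny logic is the same: enumerate the few flows a device legitimately needs and drop everything else.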

[Paul] Joe, you’ve spoken in previous podcasts about how even very simple devices might have hundreds or even thousands of distinct software modules in their supply chain that could affect their security. How do you go about extending that Software Bill of Materials management model to something as complex and as real time as, say, a hospital? 

[Joe] If you can generate a Software Bill of Materials as you’re producing the devices, the closer you produce the SBOM to the point you produce the software that goes on the medical device, the greater the accuracy of what the SBOM says is in that device compared with what is actually on that device.

[Joe] And so, trying to keep that as close as possible, and being able to communicate that for transparency reasons, means downstream users know what’s on those devices, and it gives people a chance to look at what the proper security measures or controls are. 
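That downstream visibility is what lets an SBOM answer “is component X on my device?” in minutes. A minimal sketch, assuming a CycloneDX-style JSON SBOM; the embedded document and component versions here are made-up example data, not a real device inventory:

```python
import json

# Hypothetical CycloneDX-style SBOM document; in practice this JSON would
# come from the device manufacturer alongside the product.
sbom_json = """
{
  "bomFormat": "CycloneDX",
  "components": [
    {"name": "log4j-core", "version": "2.14.1"},
    {"name": "openssl", "version": "3.0.13"}
  ]
}
"""

def find_component(sbom: dict, name: str) -> list:
    """Return every component entry in the SBOM whose name matches."""
    return [c for c in sbom.get("components", []) if c.get("name") == name]

sbom = json.loads(sbom_json)
for c in find_component(sbom, "log4j-core"):
    # A hit here is what you'd triage against an advisory's affected versions.
    print(f"Found {c['name']} {c['version']}")
```

With SBOMs on file for a fleet, the same lookup runs across every device’s inventory, which is exactly the “we knew it was everywhere, but not where” problem Phil raises next with Log4j.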

[Phil] And, you know, from a detection and response standpoint, if you don’t mind me jumping in here, think about Log4j. 

[Paul] I shouldn’t laugh. Joe mentioned that a couple of podcasts ago because it’s, if you like, the gift that keeps on taking. 

[Phil] So Log4j was a software component that had very broad applicability beyond medical devices.

[Phil] I mean, it was everywhere, but we didn’t know where. So go back to 300 manufacturers, and that’s just for medical devices. There’s probably another couple hundred when we come to the other IoT or OT that’s in a health care system, let alone all of the clinical applications and the RIS, the radiology information systems, the lab information systems, the health care information systems. And it’s not just SBOMs, I think. We have to think about the life cycle of these devices.

[Phil] Things like the manufacturer’s disclosure statement for medical device security, the MDS2, which is, I call it a preflight checklist, but it’s a checklist of controls that are available on medical devices. So there you can take the software, you can take the controls, and with the controls, you can understand what kind of hardening may be present. And given the kind of hardening, you can better assess: can I absorb this risk into my environment safely? And it’s interesting, because we’ve seen threat actors change their tactics in response to the hardening that health care has put in. So those are all important factors, and it’s gonna take cooperation between the manufacturers and the health care delivery organizations to really have those meaningful conversations.

[Phil] And, unfortunately, it has to take place technology by technology. Physiological monitoring is vastly different than clinical diagnostics. The big chemistry analyzers are vastly different than infusion pump technologies, and you can’t apply things broadly across it all.

[Joe] And I think that the point there is the information sharing: the dialogue, the conversation. It’s the sharing of information between the manufacturer and the user of the technology, ultimately.

[Joe] And that is part of the point, Paul: enabling that transparency, knowing what’s on my device, and having that conversation so people are informed. And simply having the conversation helps to elevate what the security posture ultimately is. 

[Paul] Each side can very definitely help the other, can’t they? It’s not just a question of the purchasers of the devices will insist that some steps are followed and the manufacturers will follow. There’s quite a bit of give and take from both sides.

[Phil] I was gonna say, one of the practices we had when I was in industry is, when we found devices that didn’t have the security controls that we would like to have, we would negotiate with the manufacturer. We would say, you know, we would love to adopt this technology. We think it has great clinical benefits for our patients and for the population as a whole, but we cannot put it everywhere until you have these controls, let’s say access control or good authentication or maybe MFA. It helped manufacturers understand what we were doing.

[Phil] We weren’t just saying no, but we weren’t giving them the whole enchilada until they had proven their commitment to helping us protect our own infrastructure by providing equipment that was more resilient. 

[Paul] Because there is a point at which you can cut off your nose to spite your face, isn’t there? If you suddenly say, right. There will be no more CT scans and no more MRIs in American hospitals until everything’s rewritten in Rust or all the devices are replaced. Firstly, you can’t afford to replace those devices.

[Paul] Even if you have the funds, you won’t have the forklift to move them in place. And secondly, they do perform a vital function that you can’t really do without because they’re such a useful part of modern diagnostics. 

[Phil] Correct. Our ability to provide unique care to each individual is becoming more and more of a reality. And with that, we’ll have more and more technology, and a lot of that technology is embedded.

[Phil] And embedded equipment like infusion pumps, patient monitors, EKG carts, electrocardiograms, they’re not traditional Windows boxes. They have very limited resourcing, not much memory. An implanted cardiac pacemaker has a seven-year life. They’re not taking it out to change the batteries. There’s no way to plug it in to change the batteries.

[Phil] They have to put a power pack in there that will last seven years. And so the resources available for those components have to be very small. And for that, you’ve got to be forward thinking.

[Paul] So my understanding is that there are continually developing changes in the FDA’s regulations surrounding medical device cybersecurity. Of those, what do you think are the most promising changes that we’ve seen for both manufacturers and for users of the devices and, of course, for the patients?

[Phil] When Congress granted the FDA authority over cybersecurity, taking on parts of the PATCH Act back in December of 2022, going into effect in March of 2023, that was important, because prior to that, the FDA didn’t have statutory authority over cyber. They knew that cyber impacted device safety, but it was often a gray space and many times not a direct line, so it was a hard place for them to play. 

[Paul] So they were sort of still stuck in the past where they deal with, is this drug safe? Will this device put too many X-rays through your body? And not so much about what happens if anyone from a script kiddie to a Volt Typhoon wanders in and decides to do something nasty with it.

[Phil] Correct. And the direct line between a device failing or being unavailable and patient harm, it’s not an obvious thing. I’ve talked to a lot of doctors and technicians and nurses, and I asked them the same question. If a device were to fail in use, would you think about that cyber could be the cause? And every single one of them, to my surprise, has answered, I really don’t have training in that area.

[Phil] I wouldn’t know how to analyze it. If I were to take a failed device and hand it to a biomed, they would have, by that time, unplugged it, maybe plugged it back in. The whole application gets rebooted at startup because it boots right into operating mode, and the failure that happened may not even be detectable after the fact. Those are the design challenges we run up against. I think with the changes that the FDA has made, they’ve said you must use a secure software development life cycle and think about the entire life cycle.

[Phil] You must also consider the environment in which your device will operate. So not just the box that you sell; you have to think, this box is gonna interface with my EHR or my RIS or my PACS. And so what kinds of threats or vulnerabilities could those systems render to this? And do I have defenses against that? 

[Joe] These devices aren’t sitting there in a vacuum disconnected from everything else.

[Joe] These are highly interconnected devices. And so looking at the secure product development framework and using that to your advantage to understand what do I need from an authentication perspective, what do I need from making sure I deploy or my devices haven’t been tampered with? 

[Joe] What can I do to ensure that I can prevent exploitation of memory vulnerabilities on these devices? The fact that these devices are interconnected draws you back, as a product manufacturer, a medical device maker, to having a robust software development process and a secure product development framework. And with that, you can systematically approach potential risk, do threat modeling, and do other things that help you to really prioritize where you need to spend your efforts.

[Paul] Joe, this reminded me very much of some of your comments when we spoke about security issues in the automotive industry. The first name that comes to somebody’s mind would be Ralph Nader, safety belts, airbags, crumple zones. It would not be what happens if someone hacks into my car while I’m driving along. 

[Joe] Well, I think in general, we’ve come so far in the industry. I go back to, say, 2015, when the FDA got really serious about cybersecurity.

[Joe] And fast forward, look at what we’ve done in the past ten plus years or so, developing regulations, developing standards, developing expectations. What do you need to do before you release a product? As a result of that, I think the industry has really come a long way in those ten years. It’s a good thing because the point behind what you’re saying ultimately is safety and patient safety in a safe environment is ultimately the goal. And so cybersecurity is an enabler in that sense.

[Paul] And we can’t go back to the days of yore that we’ve spoken about in things like pump room controls, where we just assumed that, hey, the device is going to be connected to a valve. It’s gonna be locked up in a pump room. That pump room was built in 1890 by Victorian engineers out of sandstone. No one’s getting in there. So the device is going to be perfectly safe against hacking.

[Paul] That’s absolutely not the case for almost any device, but notably, as Phil has explained, in the medical profession, where the average hospital is collecting more than one terabyte of data per day. I should have expected it, but it was still quite a confronting number when you mentioned it to start with. 

[Joe] Even a medieval castle when connected to the Internet is not safe. 

[Paul] Yes.

[Joe] But in this case, interconnected medical devices with all that data, with all the volume, there’s good reason why we wanna have cybersecurity on those devices.

[Paul] And because so much of our ability to deliver health care much better than we ever have in the past is thanks to that connectivity where information can be shared and can be put into the cloud for processing in ways that the server under the desk in the cardiac unit never could have. Right. 

[Phil] I think the other element, and I see manufacturers beginning to do this, is to separate the interoperability piece from the clinical functionality so that these devices can be, a, more resilient, and, b, faster to respond to. So if I can fix a communications vulnerability without touching the clinical side, that’s much better. And medical devices are very expensive.

[Phil] A lot of them, they’re twelve, fifteen years old. They’re still performing the clinical function that they were originally purchased for. 

[Paul] Even with apparently infinite money, we still wouldn’t have infinite time, particularly somewhere like a trauma center where they can’t control the number of admissions that they might have. They can’t predict that there might be a terrible accident that means they’ll need to give people urgent and immediate and careful attention. All of those things have to be balanced, don’t they?

[Phil] They do. They do. And one of the things we haven’t talked about is the impact on care. There have been studies, I think the first one was at Vanderbilt University, but several have been done, that show that clinical outcomes decline for a period of eighteen months after a breach at an acute care center. The long tail, the distraction of staff, the extra processes to be put in place, the diversion of resources that would be applied to patient care that get redirected to securing the fortress, if you will.

[Phil] But the other one, and this was a study done at UC San Diego by Dr. Christian Dameff, who’s an ER doc but also a hacker. 

[Paul] That sounds like a dangerous mix. But I guess if he can use that for good, then hats off to him. 

[Phil] Yes. And white hats off to him all the way.

[Paul] Yes. 

[Phil] He’s an amazing gentleman. He did a study that showed, when a hospital is hacked, it has an impact on every surrounding hospital in that community, because they have to deal with patient overflows that they’re not prepared for, they’re not staffed for, and they don’t have the supplies necessary for. And so there’s an impact on care outcomes even in the surroundings. So the blast radius is not just your organization. The blast radius encompasses your community organizations as well. 

[Paul] And it’s my understanding that the F in FDA stands for food, and the food industry for very many years has been regulated to tell us what ingredients actually go into the stuff they’re selling us. So it seems quite obvious that this idea of a Software Bill of Materials that goes into the stuff that we use for delivering health care is equally or perhaps even more important. 

[Phil] Absolutely. Absolutely.

[Phil] And I think it was 1969 that the FDA came out with its food labeling requirements. And so we suddenly knew what was in the cans and in the boxes, and they did this in response to a health-driven issue. 

[Joe] And I thought you both were going to suggest that we needed to add cybersecurity controls to the food, you know, and protect the next breakfast scramble. But I guess you weren’t gonna go there. When they start cooking themselves, we will.

[Paul] Perhaps I can finish up by putting a question to both of you and give each of you a chance to answer. For health care security professionals who’ve been listening to this episode, what do you think is the single most productive actionable step that they could take right now to improve cybersecurity amongst their medical devices and their medical networks? 

[Joe] I would like to suggest that cybersecurity professionals working in health care systems engage the medical device makers and really ask about some of the controls and the requirements that are there, and leverage Software Bills of Materials and what they can receive from product manufacturers. I think that collaboration is ultimately the key. And if you use a form of threat modeling to understand where are my biggest risk factors, what can go wrong, and have those dialogues in those areas, I think everyone’s security posture will improve.

[Joe] And so naturally, I think Software Bills of Materials play a role in that, but I think the broader picture is transparency, conversation, dialogue between supplier and health care provider. And when you do that, then everyone’s security posture will elevate. 

[Paul] So there has to be a strong sense of community engagement between and amongst competitors because no one of us can solve this problem on our own, can we? 

[Joe] And I think the visibility around the issues and the dialogue around the issues elevates people’s preparedness to address some of them. 

[Phil] Transparency is the key and understanding what you have.

[Phil] So there are a few elements that I would add to that. Some of them came out in the proposed rule changes to the HIPAA security rule, and the first was: have an inventory. If you don’t have an inventory, then get one. And it’s gonna require tooling. There are a lot of good tools out there that can help you understand what those are.

[Phil] So that’s number one. Two, that recommendation also included mapping your PHI end to end through your systems. Now remember, a hospital operates more like a mall of specialty stores than a single homogeneous unit. You start at admitting and you go to the lab or you go to the emergency room, and then you go to the operating room, and you go to pre op and post op, and then you go to therapy. And these are all independent specialties that use different technologies, have different specialists.

[Phil] And so while they appear to be linked by a chain, they operate very independently, although in the same family, if you will. So knowing where your data is moving, even beyond your organization, through and to all of the external partners that you use, is key. I think the other piece, beyond the transparency of knowing what you have, is negotiating what you can do and what you need help with. Another thing to consider is that for 60% of the beds in the United States, the biomed devices supporting those beds are supported by third-party organizations. So, again, having a good framework to negotiate that support with your partners, whether they’re the manufacturers, whether they’re an independent service organization, or whether it’s internally within your own organization, is essential to weaving a very tight tapestry of cybersecurity across this spectrum of tech.

[Paul] So if you’ll forgive me concluding with a cliche slash truism, and as we’ve said in the podcast before, the thing about truisms, they become truisms because they are true. It sounds as though that saying that cybersecurity is best played as a team sport applies as much or more to the health care industry than anywhere else. 

[Phil] I don’t know about anywhere else, but it certainly applies very strongly. The one last thought I would leave you with is: 80% of anything is better than 100% of nothing. So get started where you can with the resources you have.

[Phil] Start with something small, and let it teach you and your organization how to work through the rest of it. 

[Paul] Well said. Don’t delay. Do it today. 

[Phil] Very good.

[Paul] Phil and Joe, thank you so much for your passionate and thoughtful content. It’s really great to see this kind of community spirit in the cybersecurity world. Thanks to everybody who tuned in and listened. That is a wrap for this episode of Exploited: The Cyber Truth. If you find this podcast informative, please subscribe so you can tune in every week, and don’t forget to share it with everyone in your team.

[Paul] Stay ahead of the threat. See you next time.
