Navigating the AI Regulatory Maze in Medical Device Manufacturing

Analog Devices and Hologic experts explain how to inventory AI safety components, validate based on risk context, and align three regulatory frameworks in medical device manufacturing.

Claire Wallace, Senior Writer, Informa Markets – Engineering, March 31, 2026 –

Three frameworks. Multiple AI use cases. One compliance challenge: How do medical device manufacturers align the EU AI Act, MDR, and ISO/IEC 42001 when each evaluates AI through a different lens?

Attrayee Chakraborty, Quality Systems Engineer at Analog Devices, and Geethapriya Setty, Manager, Global Regulatory Affairs at Hologic Inc., connected with MD+DI to discuss where to start, how to inventory AI as “safety components” across your product lifecycle, and why validation strategies must shift based on risk context, not just regulatory requirements.

The duo will expand on these subjects and more at MD&M South in Charlotte, North Carolina, on April 23.

The EU AI Act, MDR, and ISO/IEC 42001 each bring their own requirements. Where should manufacturers start when trying to align all three?

Setty: AI is not a finished device; it’s a safety component inside the manufacturing ecosystem. From that perspective, under the AI Act it is a safety component. If the failure of that component could endanger patient or product safety, then it is treated as a high-risk AI system even if it never appears on the product label. And that’s where I think ISO 42001 comes into play, because it gives you a structured way to inventory your AI use cases across a product’s life cycle, define when an AI is a safety-relevant component or an embedded feature, and determine when you have to add AI-specific controls to your existing quality management and supply management systems. What that means is you need a clear intended-use statement for each of those use cases and for your algorithms, along with risk assessment documents that cover both product safety and patient safety. I think that is where manufacturers need to focus their attention.


Chakraborty: Building off of that, if you look at ISO 42001 and the EU AI Act, conceptually they are similar. They share the same concepts of governance, data systems, human oversight, and so on; they are just different ways of approaching the same problem. We have a paper where we explain in detail how manufacturers can meet both sets of requirements. Typically, manufacturers can first map the different requirements against their current QMS maturity, and then adapt to fill in the gaps.
