Mónica Fernández Peñalver · May 14, 2025 · 5 min read

AI Medical Software Compliance: Navigating EU Regulations


Artificial intelligence is no longer just a buzzword in healthcare; it's becoming a core part of how medical software is developed, used, and regulated. From supporting diagnoses to guiding treatment decisions, AI is increasingly embedded in software governed by the EU Medical Device Regulation (MDR) and the In Vitro Diagnostic Regulation (IVDR).

Now, there's a new layer to the story: the EU AI Act (Regulation (EU) 2024/1689). For companies developing AI-driven medical software solutions, it's essential to understand how these regulations overlap, where they differ, and what that means for bringing your product to market smoothly and confidently.

 

The EU AI Act: What's New and What Matters for Medical Software


The EU AI Act is the European Union's comprehensive framework for regulating artificial intelligence, applying across industries. It categorizes AI systems by risk level, with "high-risk" systems (a category that includes many medical software applications) subject to the strictest requirements.

Key definitions from Article 3 of the AI Act include:

AI system: A machine-based system designed to operate with varying levels of autonomy. It may exhibit adaptiveness after deployment and, for explicit or implicit objectives, infers from the input it receives how to generate outputs such as predictions, content, recommendations, or decisions that can influence physical or virtual environments.

General-purpose AI model: An AI model trained using large datasets, often with self-supervision at scale. It demonstrates significant generality and can perform a broad range of distinct tasks. General-purpose AI models can be integrated into various downstream systems or applications, except when used solely for research, development, or prototyping before being placed on the market.

Understanding whether your AI-driven medical software fits one of these definitions is the first step in figuring out what rules apply. The European Commission's AI regulatory framework provides additional guidance on implementation timelines and compliance requirements for developers.
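As a purely illustrative aid for that first step, the short Python sketch below turns the Article 3 "AI system" criteria into a yes/no screening checklist. The criteria labels and the looks_like_ai_system helper are our own simplification for this article; they are a scoping aid, not an official or legal test.

```python
# Illustrative scoping checklist based on the Article 3 definition of an "AI system".
# The criteria names and the helper below are a simplification for this article, not legal criteria.

ARTICLE_3_CRITERIA = {
    "machine_based": "Is the system machine-based?",
    "autonomy": "Does it operate with some level of autonomy?",
    "inference": "Does it infer from the input it receives how to generate outputs "
                 "(predictions, content, recommendations, decisions)?",
    "influence": "Can those outputs influence physical or virtual environments?",
}

def looks_like_ai_system(answers: dict[str, bool]) -> bool:
    """First-pass screening: if every criterion is answered 'yes', the product
    likely meets the AI Act's definition of an AI system and warrants a proper
    regulatory assessment. (Adaptiveness after deployment is mentioned in the
    definition as possible but not required, so it is not part of this check.)"""
    return all(answers.get(key, False) for key in ARTICLE_3_CRITERIA)

# Example: a diagnostic-support tool that produces risk scores from imaging data
answers = {"machine_based": True, "autonomy": True, "inference": True, "influence": True}
print(looks_like_ai_system(answers))  # True -> check MDR/IVDR class and AI Act risk level next
```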

 

MDR and IVDR: Still Fundamental for Medical Software Compliance

Of course, MDR and IVDR haven't gone anywhere. These medical device regulations still apply to software that diagnoses, monitors, or supports treatment, with a focus on clinical safety, performance, and lifecycle management.

What's new is that the AI Act doesn't replace these rules; it stacks on top of them. Understanding the global landscape of AI regulations can help manufacturers of AI-driven medical software navigate these overlapping requirements more effectively.

 

Medical Device Compliance ≠ AI Act Compliance

It's a common misconception: if your AI medical software is MDR-compliant, you're done. Not quite. The EU AI Act adds its own set of obligations, especially if your product is considered high-risk.

Here's what the AI Act expects from high-risk AI systems in medical applications:

  • A risk management system covering the entire lifecycle, not just clinical risks
  • Clear and consistent data governance, including high-quality, unbiased training data
  • Robust technical documentation that tells the full story of how the system works
  • Transparent user instructions to help people understand and trust the output
  • Effective human oversight, meaning humans can monitor, understand, and, when necessary, intervene in the operation of the system
  • Proven accuracy, resilience, and security under realistic conditions
  • A post-market monitoring plan focused on AI-related performance
  • Registration in the EU's official AI database

These requirements are specific to AI and go well beyond the traditional scope of MDR or IVDR. The World Health Organization's guidance on AI ethics in healthcare emphasizes similar principles for responsible deployment of AI-driven medical software in clinical contexts.

How Your MDR Risk Class Affects AI Act Obligations for Medical Software

Your medical software's MDR classification plays a big role in determining how the AI Act applies:

  • If your AI-driven medical software is Class I, it's generally not considered high-risk under the AI Act. That said, there are exceptions, so it's worth checking regardless of class.
  • If it falls into Class IIa, IIb, or III, it is highly likely to be considered high-risk under the AI Act. That means full compliance with AI-specific regulatory requirements is expected.

In other words, your MDR class is more than just a medical risk label; it's also a major clue about your AI regulatory compliance obligations.
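To make that mapping concrete, here is a minimal, purely illustrative Python sketch of the first-pass rule described above. The enum values and function name are our own for illustration; the output is a planning signal, not a legal determination, since exceptions exist and a full assessment is still required.

```python
from enum import Enum

class MDRClass(Enum):
    I = "Class I"
    IIA = "Class IIa"
    IIB = "Class IIb"
    III = "Class III"

def likely_high_risk_under_ai_act(mdr_class: MDRClass) -> bool:
    """Rough first-pass heuristic: devices in Class IIa and above are very
    likely high-risk under the AI Act, while Class I generally is not.
    Exceptions exist, so this is a planning signal, not a legal assessment."""
    return mdr_class in {MDRClass.IIA, MDRClass.IIB, MDRClass.III}

# Example: a Class IIa diagnostic-support tool
print(likely_high_risk_under_ai_act(MDRClass.IIA))  # True -> plan for full AI Act compliance
```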

 

What to Do Next: Use Risk Classification as a Starting Point for Medical Software Compliance

So where do you begin? With risk categorization. It's the smartest way to understand your regulatory exposure and decide what comes next for your AI-driven medical software.

That's where Nemko Digital comes in. We offer a structured risk categorization service that helps manufacturers of medical software with AI capabilities:

  • Determine whether the AI Act applies—and if so, how
  • Understand the compliance path based on the device's risk level
  • Make informed decisions about timelines, documentation, and investment

Depending on the outcome:

  • If your system is likely high-risk, your next step is preparing for AI Act compliance, including technical documentation and conformity assessment.
  • If it's not high-risk, you can decide how much of your available resources to invest in governance maturity, product improvements, or certification.

And here's the best part: in both cases, you're already on the right track to pursue an AI Trust Mark. Risk categorization is step one in that process, so you've already done part of the work. A trust mark shows your commitment to responsible, transparent, and ethical AI-driven medical software - something customers, investors, and regulators increasingly expect.

 

Our Medical Software Compliance Services at a Glance:

  • Clear scoping of which regulations apply (AI Act, MDR, IVDR) to your AI-driven medical software
  • Risk classification support and documentation tools for medical software regulations
  • Guidance for CE marking, EU AI system registration, and audits
  • Strategic roadmaps aligned with your business, compliance, and product goals

 

 

Beyond Compliance: Building Trust in AI-Driven Medical Solutions


 

Bringing AI into medical software opens incredible opportunities, but it also raises the bar for compliance. MDR alone isn't enough anymore. If your medical software uses AI in any meaningful way, especially in a clinical context, the EU AI Act likely applies.

Starting with a smart risk assessment gives you direction. Whether your goal is AI regulatory compliance, certification, or building long-term trust in your product, Nemko Digital can help you navigate the process with confidence.

The FDA's approach to AI/ML-based medical devices provides another important perspective for manufacturers of AI-driven medical software targeting global markets, particularly those looking to enter both EU and US markets simultaneously.


Mónica Fernández Peñalver

Mónica has actively been involved in projects that advocate for and advance Responsible AI through research, education, and policy. Before joining Nemko, she dedicated herself to exploring the ethical, legal, and social challenges of AI fairness for the detection and mitigation of bias. She holds a master’s degree in Artificial Intelligence from Radboud University and a bachelor’s degree in Neuroscience from the University of Edinburgh.
