How does the AI Act affect the Medical Device industry?

In the previous blog post we assessed the developments around the upcoming AI Act. Since the agreement between the European Parliament and the European Council, the final text has become available, as it was leaked online.

In this blog post we examine the potential implications for the medical device industry, specifically which medical devices may be impacted, what the requirements entail, and what they mean in terms of conformity assessment. Take a look at our recent video on the AI Act and its impact on Medical Devices.

AI-Enabled medical devices

We are constantly confronted with developments around Artificial Intelligence (AI) in the news. New applications now seem to come out almost daily (e.g. the recent introduction of Gemini by Google). AI applications, however, have already been around in the medical field for quite a while. According to the FDA's overview, for example, the first AI-enabled device was cleared as early as 1995. Several Matrix Requirements customers, including Cognoa, harness AI to enable earlier diagnosis of childhood behavioral conditions. The number of clearances in the same listing started to increase as of 2016 due to technological advancements. These advancements led to a large number of companies focusing on image analysis, not surprisingly mainly in the field of radiology, followed by cardiovascular applications and hematology. Today this is still where AI is used most in the medical field.

Examples of such radiology devices are those intended to triage patients or to detect, measure, classify and/or segment abnormalities on CT scans, X-ray images, MRI scans and mammograms. Applications range from the detection of bone fractures to the potential detection of cancer.

It is expected that, with the recent advances in AI, different types of applications will find their way onto the medical market, in particular applications (refer to image 1) that are capable of interpreting and generating text, such as Large Language Models (commonly referred to as ‘LLMs’). LLMs with a medical intended purpose already need clearance from the appropriate regulatory authorities today (e.g. FDA clearance or CE marking against the MDR 2017/745).

Image 1. Overview of potential LLM medical use-cases, referenced from Article

The AI Act for Medical Devices

Any product today that is intended for use as a medical device per the Medical Device Regulation (Article 2(1)) is already required to obtain CE marking, either through self-certification (most class I devices) or through third-party conformity assessment (some class I and all class IIa, IIb and III devices). The AI Act may add further CE marking requirements to be addressed during the conformity assessment procedures, which we investigate below.

Definition of AI

A medical device that incorporates Artificial Intelligence (e.g. a CT scanner that optimizes images with the use of AI) will additionally have to comply with the future AI Act. The basis of the definition of AI was agreed between the European Council and the European Parliament, yet the final version in the text is slightly different:

“An AI system is a machine-based system designed to operate with varying levels of autonomy and that may exhibit adaptiveness after deployment and that, for explicit or implicit objectives, infers, from the input it receives, how to generate outputs such as predictions, content, recommendations, or decisions that can influence physical or virtual environments”

According to the European Council, this definition should ‘provide sufficiently clear criteria for distinguishing AI from simpler software systems’. Whether it actually does remains to be seen in practice.

Conformity assessment

If you come to the conclusion that your device is indeed enabled by a form of AI that is governed by the AI Act, it is essential to assess whether your device is, under the Act, a) prohibited, b) High-Risk, c) Limited Risk, or d) Minimal Risk.

High-Risk AI

Specifically, devices covered under the MDR 2017/745 or IVDR 2017/746 (listed in Annex II, Section A of the AI Act) that undergo third-party conformity assessment (Notified Body approval) will by default be considered High-Risk under the AI Act, and all High-Risk requirements will be applicable. In our example of the CT scanner, it would be considered High-Risk since it is already required to obtain CE marking under the MDR through a Notified Body.

In the agreement between the European Council and the European Parliament, a transition period of 3 years was settled for all industries noted under Annex II. Annex II, Section A includes Medical Devices, which provides some breathing room, yet very little, as this coincides with the overall MDR and IVDR transition timelines.

All Notified Bodies that assess medical devices (and IVDs) will need to comply with the AI Act (e.g. have the relevant expertise available). If your Notified Body decides that the AI Act is not relevant to its business, or is unable to source the relevant expertise, you as a medical device manufacturer may find yourself in a difficult situation if you stay with that Notified Body.

Limited or minimal Risk AI

When your device does not undergo Notified Body approval, it may in specific circumstances still be considered High-Risk under Annex III, or it could be considered a General Purpose AI (GPAI) (high-impact or not), which may impose requirements on the device. In addition, the transparency obligations outlined in Article 52 may apply. A simplified sketch of this classification flow follows below.
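
To make this classification flow more concrete, here is a minimal Python sketch of the logic described above. It is purely illustrative: the class names, fields and function are our own invention (the Act does not prescribe any such structure), and the real assessment involves nuances (the Annex III use cases, the GPAI rules) that a few booleans cannot capture.

```python
from enum import Enum
from dataclasses import dataclass


class AIActRiskClass(Enum):
    PROHIBITED = "prohibited"
    HIGH_RISK = "high-risk"
    LIMITED_RISK = "limited risk (transparency obligations)"
    MINIMAL_RISK = "minimal risk"


@dataclass
class DeviceProfile:
    uses_prohibited_practice: bool        # practices banned outright under the Act
    covered_by_mdr_or_ivdr: bool          # legislation listed in Annex II, Section A
    requires_notified_body: bool          # third-party conformity assessment
    listed_in_annex_iii: bool             # stand-alone high-risk use cases
    interacts_with_natural_persons: bool  # may trigger Article 52 transparency


def classify(device: DeviceProfile) -> AIActRiskClass:
    """Simplified illustration of the risk classification flow described in this post."""
    if device.uses_prohibited_practice:
        return AIActRiskClass.PROHIBITED
    # MDR/IVDR devices that need Notified Body approval are High-Risk by default
    if device.covered_by_mdr_or_ivdr and device.requires_notified_body:
        return AIActRiskClass.HIGH_RISK
    # A device outside Notified Body approval may still be High-Risk via Annex III
    if device.listed_in_annex_iii:
        return AIActRiskClass.HIGH_RISK
    if device.interacts_with_natural_persons:
        return AIActRiskClass.LIMITED_RISK
    return AIActRiskClass.MINIMAL_RISK


# Example: an AI-based CT image-enhancement function CE marked via a Notified Body
ct_ai = DeviceProfile(False, True, True, False, False)
print(classify(ct_ai))  # AIActRiskClass.HIGH_RISK
```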

What will be required for High-Risk AI?

Most medical devices that make use of AI are likely to fall into the category of High-Risk, as with our earlier example of a CT scanner using an AI algorithm to improve image quality.

What will this mean for you if your device falls into this category? The AI Act is divided into several ‘Titles’ and ‘Chapters’. Title III of the Act is specific to High-Risk AI Systems and contains the following five chapters:

  • Chapter 1: Classification of Systems as High-Risk

  • Chapter 2: Requirements for High-Risk AI Systems

  • Chapter 3: Obligations of Users of High-Risk AI Systems and other Parties

  • Chapter 4: Notifying authorities and Notified Bodies

  • Chapter 5: Standards, Conformity Assessment, Certificates, Registration

As a manufacturer of (High-Risk) Medical Devices that incorporate AI, your main focus should be on chapters 2 & 3, where the obligations for you as manufacturer are defined. As a deployer (e.g. healthcare organization) or user (e.g. a physician) of a medical device that incorporates AI you may also want to pay attention to Chapter 3. This chapter introduces specific requirements for deployers and users of High-Risk AI systems, in addition to those for manufacturers.

When analyzing Chapters 2 & 3 you will find many resemblances to (or rather ‘duplications’ of) requirements that already exist for medical devices, for example (a simple mapping sketch follows after the list):

  • Article 9: Execution of Risk Management;

  • Article 11: Having Technical Documentation;

  • Article 13: Having an Instructions for Use;

  • Article 15: Disclosure of the Accuracy in the IFU;

  • Article 16: Having a Product label and being registered;

  • Article 17: Having a Quality Management System (including PMS*);

  • Article 18: Retention of Documentation; 

*= Referred to as Post Market Monitoring
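
For teams that maintain a compliance or traceability matrix, this overlap can be captured as a simple mapping. The sketch below pairs the AI Act articles listed above with the medical device requirements or standards they most closely resemble; the pairings reflect our own reading of the overlap and are not an official cross-reference.

```python
# Illustrative traceability mapping between AI Act articles (High-Risk AI)
# and the medical device requirements they most closely resemble.
# The pairings are our own interpretation, not an official concordance.
AI_ACT_TO_MD_REQUIREMENTS = {
    "Article 9 - Risk management": "ISO 14971 risk management process",
    "Article 11 - Technical documentation": "MDR Annex II / III technical documentation",
    "Article 13 - Instructions for use": "MDR Annex I information supplied with the device",
    "Article 15 - Accuracy disclosure": "Performance characteristics stated in the IFU",
    "Article 16 - Labelling and registration": "MDR labelling and registration obligations",
    "Article 17 - Quality management system": "ISO 13485 QMS, including post-market surveillance",
    "Article 18 - Documentation retention": "MDR document retention obligations",
}

for ai_article, md_counterpart in AI_ACT_TO_MD_REQUIREMENTS.items():
    print(f"{ai_article:45s} <-> {md_counterpart}")
```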

An interesting note specifically in relation to Article 11 is that the role of the PRRC defined in the MDR and IVDR (Article 15) will most likely automatically extend to the AI Act. Per those regulations, the role of the PRRC is to ensure that the Technical Documentation and the EU Declaration of Conformity are drawn up and kept up-to-date, without specifically referring to the MDR or IVDR, so documentation required by the AI Act would fall within that responsibility as well.

AI Compatibility with Medical Device requirements

It has been clarified on multiple occasions that the requirements for medical devices using AI should be aligned as much as possible with existing medical device requirements to avoid conflicts in implementation. For quality management, Article 17.2(a) notes that:

“For providers of high-risk AI systems that are subject to obligations regarding quality management systems or their equivalent function under relevant sectorial Union law, the aspects described in paragraph 1 may be part of the quality management systems pursuant to that law.”

And in the draft standardization request M/593 to CEN/CENELEC, the European Commission notes:

“Specifications shall be drafted such that the quality management system aspects related to the AI system may be integrated into the overall management system of the provider, in particular with existing quality management systems established to meet requirements towards quality management systems contained in Union Harmonisation legislation listed in Annex II, Section A of the Artificial Intelligence Act proposal.”

The Act itself makes a similar point:

“To ensure consistency and avoid unnecessary administrative burden or costs, providers of a product that contains one or more high-risk artificial intelligence system, to which the requirements of this Regulation as well as requirements of the Union harmonization legislation listed in Annex II, Section A apply, should have a flexibility on operational decisions on how to ensure compliance of a product that contains one or more artificial intelligence systems with all applicable requirements of the Union harmonized legislation in a best way.”

Note that for both risk management and quality management systems, additional requirements specified in the AI Act will need to be met. In a future blog post, we will explain in more detail what these additional requirements look like and how you can address them. Check out our latest video on the AI Act and Medical Devices.

New requirements in the AI Act

Each of the articles specified in Chapters 2 & 3 will introduce new and additional requirements for medical devices using AI. Some entirely new requirements include (see the logging sketch after this list):

  • Article 10: Requiring Data and data governance

  • Article 12: Requiring Record-keeping

  • Article 14: Having appropriate Human oversight

  • Article 20: Generation of Automated logs

  • Article 29: Obligations of users of High-Risk AI systems
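
Articles 12 and 20 (record-keeping and the generation of automated logs) lend themselves particularly well to an engineering illustration. The sketch below shows one hypothetical way an AI-enabled device could automatically record each inference with the traceability details a reviewer is likely to ask for; the field names and structure are our own and are not prescribed by the Act.

```python
import json
import logging
from datetime import datetime, timezone

# Minimal, hypothetical example of automated event logging for an AI-enabled
# device; field names are illustrative, not prescribed by the AI Act.
logger = logging.getLogger("ai_device.audit")
logging.basicConfig(level=logging.INFO)


def log_inference_event(model_version: str, input_reference: str,
                        output_summary: str, operator_id: str) -> None:
    """Record one automatically generated log entry per inference."""
    event = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "model_version": model_version,      # which algorithm version produced the output
        "input_reference": input_reference,  # pointer to the input data (e.g. a study ID)
        "output_summary": output_summary,    # what the system concluded
        "operator_id": operator_id,          # supports human oversight (Article 14)
    }
    logger.info(json.dumps(event))


# Example usage for a hypothetical CT image-analysis device
log_inference_event(
    model_version="2.3.1",
    input_reference="study-0001",
    output_summary="no acute finding flagged",
    operator_id="radiologist-42",
)
```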

Where relevant, we will also address these new requirements in future blog posts to support you with their implementation.

AI Implications within Medical Devices

As can be understood from the previous sections, medical device manufacturers need to be aware that the AI Act is coming soon (mid 2024) and will need to start preparations, mainly to ensure that they have the right expertise to assess the requirements and implement them into their existing systems.

The transition period of 3 years might seem long; however, in contrast to the MDR and IVDR, this regulation is entirely new. Consequently, there are no Notified Bodies yet with experience in certification against the AI Act. At the same time, many requirements of the AI Act seem to require harmonized standards to support implementation (e.g. through the draft request the EU Commission has asked CEN/CENELEC to develop a total of 10 standards to support the Act, and more standards will be added in the final request). With an average of 5 years needed to develop a harmonized standard, it is doubtful that all will be available by the time manufacturers are required to prepare and submit their technical documentation (somewhere within the next 1 to 2 years). Effectively, that means manufacturers would already need to start implementing those harmonized standards in their organizations today.

With the implementation timelines of the MDR and IVDR coinciding with the AI Act, and a shortage of combined regulatory affairs and artificial intelligence expertise in the field (among Notified Bodies, manufacturers, consultants and other industries), it can be expected that this transition will also be a complicated one.

What should you do today to prepare for AI?

While awaiting the publication of the AI Act in the Official Journal, and while your Notified Body organizes itself and works through its own process of becoming compliant, you should not sit still. The following tips will help you to comply with the AI Act in time.

  1. Verify whether your device makes use of AI per the given definition, whether it is High-Risk, and whether it may be considered a prohibited AI system (if so, it would be prohibited within 6 months after the Act comes into force);

  2. Inform and train your Management Teams on these developments and the potential impact on your organization; the AI Act will require changes to Quality Management Systems, Technical Documentation and Risk Management procedures. Start planning the needed changes and your route to compliance and certification;

  3. Reach out to your Notified Body to confirm they intend to comply with the AI Act; if they do not, consider alternative strategies immediately;

  4. If you haven’t done so already, join industry organizations that can keep you up-to-date on developments around the AI Act, for example Medtech Europe or COCIR;

  5. Become a member of your local standardization organization; it will provide you access to all standards that are currently being developed (in JTC21) to address the requirements of the AI Act, and you can help shape standards that fit within our medical device industry.

Medical Device companies looking to accelerate development of their innovative medical device technologies come to Matrix Requirements. Our platform is an easy-to-use, flexible, all-in-one software solution that facilitates employee collaboration on design control and quality management to streamline medical device design, establish lean quality management, accelerate product certification and go-to-market, and maintain regulatory compliance.

To learn more about our author, Leon Doorn, connect on LinkedIn.

About the Author
Leon Doorn
Independent Consultant