
Implications of EU AI law for medical technology companies | Hogan Lovells


General information about the Artificial Intelligence Act

Proposed by the European Commission in April 2021, the AI Act was approved by the European Parliament on 13 March 2024 and by the Council of the European Union on 21 May 2024, after several rounds of intensive interinstitutional negotiations. On 12 July 2024, the European Union published the final text of the AI Act in the Official Journal of the EU (OJ EU). The AI Act enters into force 20 days after its publication in the OJ EU. It will become fully applicable two years later, on 2 August 2026, with the exception of certain requirements that are subject to longer transitional periods.


Scope of the Artificial Intelligence Act

The AI Act is designed to be industry-neutral, applying to a broad range of sectors, including healthcare, medical technology, financial services and consumer products. It applies not only to entities based in the EU, but also has extraterritorial reach, affecting non-EU entities that market, deploy or use AI systems or products incorporating AI in the EU. It also applies to a wide range of economic operators in the AI supply chain, including providers, importers, distributors and deployers of AI systems, as well as manufacturers of AI products.

Each of these economic operators in the supply chain will have obligations under the AI Act. Manufacturers of medical devices can become providers of AI systems under the AI Act. If a medical device is subject to the AI Act, all partners in its supply chain will have to comply with the new requirements of the AI Act.


Risk-Based Approach and Timeline

The AI Act adopts a risk-based approach to AI regulation, categorizing systems as posing unacceptable, high, limited or minimal risk. Systems posing unacceptable risk are prohibited, while those in the high, limited and minimal risk categories are subject to varying degrees of regulation, proportionate to the risk they pose. The obligations of economic operators vary depending on the level of risk of the AI system, aiming to balance the need for innovation with the need to protect users from potential harm associated with AI.

Like many other products, medical technologies (including medical device software) may be subject to the provisions of the Artificial Intelligence Act.

AI-enabled medical devices may fall within the definition of a high-risk AI system if the AI system is intended to be used as a safety component of a product, or the AI system is itself a product, covered by the Union harmonisation legislation listed in Annex I of the AI Act (such as the Medical Devices Regulation (MDR) and the In Vitro Diagnostic Medical Devices Regulation (IVDR)), and is subject to a third-party conformity assessment under that Union harmonisation legislation.

In such a case, the AI-enabled medical device will be subject to both the AI Act and the MDR or IVDR.


Schedule for the application of the Artificial Intelligence Act to medical devices: 2 August 2027

As mentioned above, the AI Act was published in the Official Journal of the European Union on 12 July 2024, entered into force on 1 August 2024 and will be fully applicable from 2 August 2026. The prohibitions on the AI systems/practices specified in Article 5 will apply from 2 February 2025, meaning such systems must be withdrawn from the market by that date. The AI Office is required to have codes of practice ready no later than 2 May 2025 to enable AI system providers to demonstrate compliance with the applicable requirements set out in the AI Act.

Medical devices that qualify as high-risk AI systems will have an additional year (until 2 August 2027) to comply with the requirements for high-risk systems.

Given this timeline, manufacturers' preparation will be crucial. Medical device manufacturers should assess today whether the AI Act will apply to them tomorrow. Although 2 August 2027 seems far away, experience with the MDR and IVDR shows that preparing for new regulations takes time and resources (e.g. changing the quality management system, changing the risk management system, training, internal audits, adding new staff, generating new data…). As an example, Article 10 of the AI Act requires that high-risk AI systems be developed using high-quality data sets for training, validation and testing. Such a requirement should be considered today to avoid a situation where manufacturers' notified bodies identify a serious non-conformity in about three years.


Conformity assessment

To avoid duplication, the AI Act allows for a single conformity assessment covering both the MDR or IVDR and the AI Act. This is good news: it means that Medtech companies will be able to have a single assessment of their technical documentation and quality management system by their Notified Body under both the AI Act and the MDR or IVDR.

In practice, however, such a combined conformity assessment is not always possible:

  • Designation under the Artificial Intelligence Act: In order to provide this combined conformity assessment, MDR/IVDR Notified Bodies will also have to be designated under the AI Act (after assessment by the designating authority in the relevant EU Member State). Not all Notified Bodies may be willing to apply for designation under the AI Act, and even if they do, the process may not be as fast as expected. In practice, this may mean that some manufacturers will have to work with one Notified Body under the MDR/IVDR and another under the AI Act. This would mean two conformity assessments by two different Notified Bodies, with a likely multiplication of audits for manufacturers.
  • AI staff: Article 43(3) of the AI Act allows MDR/IVDR notified bodies to carry out AI conformity assessments if they meet certain requirements, such as independence and professional integrity. These requirements have already been assessed for their MDR/IVDR designation and should therefore be easy for them to meet. However, they will also need administrative, technical, legal and scientific staff with experience and knowledge of the relevant types of AI systems in order to conduct conformity assessments. Given that everyone will be looking for AI specialists at the same time (competent authorities, notified bodies, manufacturers), recruitment may prove complicated. Again, if notified bodies are unable to recruit suitable staff quickly, their AI designation may be delayed.
  • Duration of conformity assessment: Currently, assessing the conformity of medical devices under the MDR or IVDR can be a long (and expensive) process (on average 18 months or more). While industry is looking to EU regulators to find solutions to make the current CE marking process more efficient, there are some concerns that the application of the AI Act could add additional burden to the review by notified bodies and negatively impact the timeline required to affix the CE mark to medical devices in the EU. Given the current workload of some MDR/IVDR notified bodies, this potential risk cannot be ignored.

Our recommendations

To prepare for the entry into force of the Artificial Intelligence Act, we recommend that medical device manufacturers consider taking the following steps:

  • Determine the applicability of the Artificial Intelligence Act: Medtech companies should determine whether the AI Act applies to their medical devices (e.g., whether the product falls within the definition of high-risk AI systems or of prohibited AI practices). Companies should also assess the regulatory role they will play under the AI Act (provider, deployer, or distributor/importer) and the related obligations that apply to them.

  • Conduct a gap assessment: Many of the obligations under the AI Act already apply to Medtech companies (e.g., the need to have a quality management system, a risk management system and technical documentation). However, some requirements under the AI Act are completely new to the Medtech sector (data governance, human oversight, accessibility requirements). A detailed comparison of the requirements of the AI Act with the MDR/IVDR requirements is necessary to identify new requirements and potential gaps to be addressed.

  • Update internal procedures and technical documentation: As needed, medical technology companies should review and update their quality management system, technical documentation and post-market surveillance procedures to ensure compliance with the requirements of the AI Act.

  • Make sure your organization has the right staff: If not, consider targeted recruitment of people with AI experience. Staff training will also be important.

  • Access and use reliable data sets in accordance with the Artificial Intelligence Act: Access to these data sets may be required by Notified Bodies when assessing the conformity of AI-enabled medical devices. It is therefore important that Medtech companies planning to train, validate and test AI-enabled medical devices do so now with the data governance requirements of the AI Act in mind.

  • Monitor new developments: Medtech companies should monitor developments at the European Commission and the new EU AI Office for guidance on aligning compliance pathways between the MDR/IVDR and the AI Act.

Bibliography

1 A safety component is defined as any safety-related part of a product or system, the failure of which could pose a risk to health or safety.