Technical Documentation under the AI Act: Must-read guide
In this blogpost we aim to outline the requirements of the AI Act with regard to Technical Documentation, and to investigate the relationship between the AI Act and other legislation, such as that outlined in Annex II Section A.
With the leaking of the final text, an interesting change to the Technical Documentation requirements revealed itself. During the trilogue sessions there was emphasis on the implications of the regulation for smaller organisations (SMEs and micro enterprises), noting that the European Commission had concluded in earlier research (2014) that ‘rigid regulation can hamper innovation’. Consequently, one of the outcomes of the trilogue is the option for smaller organisations to prepare a simplified version of the Technical Documentation. How this will work out in practice remains to be seen: for example, how the Technical Documentation is practically simplified (e.g. how a risk analysis is embedded within a simplified form), and how it will interact with the Technical Documentation requirements set out in other regulations, such as those in Annex II Section A. Learn more about the AI Act and medical devices in our video guide.
What does Technical Documentation mean?
In 2008 the European Union adopted the New Legislative Framework or ‘NLF’ (Decision 768/2008), which provides a toolbox of measures that the European Union can use in product legislation. One of those measures is the requirement for organisations to prepare ‘Technical Documentation’. The New Legislative Framework further defines ‘modules’ that can be applied for conformity assessment, and describes how these modules embed Technical Documentation as part of conformity assessment. The current leaked version of the AI Act allows organisations developing High-Risk AI Systems to apply the modules explained in Article 43 of the AI Act (for High-Risk systems per Annex III and Article 6), or requires them to follow the modules set out in the applicable NLF legislation (per Annex II Section A).
At its core, Technical Documentation provides the information that allows third parties (regulators and economic operators such as importers, distributors and authorised representatives) to assess a product’s compliance with legislative requirements. What information is required may differ from one NLF regulation to another.
Within the AI Act, the European institutions have decided that Technical Documentation needs to be available for all ‘High-Risk’ AI Systems.
Article 11 of the AI Act
As clarified in the previous section, the AI Act requires organisations to demonstrate compliance for High-Risk AI systems by drawing up Technical Documentation per Annex IV. This Technical Documentation will need to be retained for at least ten years after the AI system has been placed on the market (consider this to run from the date the last AI system was made available on the market).
Organisations whose AI systems are governed by existing NLF legislation (Annex II Section A) should take note of recital 42 of the AI Act. This recital clarifies that these organisations ‘will have the flexibility on operational decisions as to how to ensure compliance of a product that contains one or more AI systems with the applicable requirements in a best way’. More simply put, they will be able to add the relevant extra Technical Documentation requirements to their existing product Technical Documentation.
For SMEs and micro enterprises, the implications of the Simplified Technical Documentation introduced in the final text of the AI Act remain to be seen. The AI Act requires the European Commission to develop a specific form that will need to be used for the ‘Simplified Technical Documentation’. This is a development to monitor if your organisation qualifies as an SME or micro enterprise.
A final interesting note regarding Article 11 is paragraph 3, which gives the European Commission the freedom to adopt delegated acts adding Technical Documentation requirements for AI systems as deemed necessary. As such, additional Technical Documentation requirements may be added over time.
Annex IV Technical Documentation
Annex IV of the AI Act provides the full specification of the Technical Documentation required to demonstrate compliance with the AI Act. Annex IV is divided into nine sections, an overview of which is provided in the sections below.
This set of Technical Documentation is further divided into subsections (a, b, c, etc.); for example, the General Description requires documentation of the intended use, interactions with other systems, versioning, associated hardware, the user interface and the instructions for use. For devices already covered by the NLF legislation under Annex II Section A, the information for the General Description is most likely already available.
Detailed Description of Technical Documentation under the AI Act
The Detailed Description goes into far greater detail on the internal functioning of the AI system and how it was developed. For example, it requires organisations to detail the design specifications of the AI system, including the general logic of the algorithm and the documented rationales and assumptions made during design (e.g. the main classification choices, which parameters the system is optimised for, descriptions of outputs, etc.). Organisations can decide to document such considerations as Software System Requirements specific to the AI system, e.g. using a Requirements Management System.
These requirements mean that organisations need to document decisions during the design and development process, not in retrospect. Organisations should make sure to define the design and development process as part of the Quality Management System, aligned with the AI system lifecycle. For example, when training an AI model to research its effectiveness, before a product is even developed around it, implicit decisions are already being made regarding classification choices and parameter optimisation. It can be difficult to reconstruct which considerations applied if they were not appropriately documented at the time; the responsible data scientists may, for example, no longer be with the company.
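To make this concrete, below is a minimal sketch in Python of what a machine-readable design-decision record could look like. The `DesignDecision` structure and its field names are our own illustration, not something prescribed by Annex IV; the point is simply that rationales and assumptions are captured at the time the decision is made.

```python
from dataclasses import dataclass, field
from datetime import date

# Illustrative record for a single design decision; the field names
# are assumptions on our part, not Annex IV wording.
@dataclass
class DesignDecision:
    decision_id: str        # traceable identifier, e.g. linked to a requirement
    summary: str            # what was decided (e.g. a classification choice)
    rationale: str          # why it was decided
    assumptions: list[str]  # assumptions that held at the time
    decided_by: str         # accountable person or role
    decided_on: date = field(default_factory=date.today)

# Example: a modelling choice captured during early research,
# before any product is built around the model.
decision = DesignDecision(
    decision_id="DD-0042",
    summary="Binary classifier optimised for recall over precision",
    rationale="Missing a positive case is costlier than a false alarm",
    assumptions=["Class prevalence in training data matches deployment"],
    decided_by="Lead data scientist",
)
```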
Similarly, organisations will need to document information regarding the data used for the development of the AI system. The documentation should include the outputs referred to in our earlier blogpost on Data Governance, and as required by the organisation’s Data Management Procedures.
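Purely by way of illustration, a dataset record capturing typical data governance outputs might look like the sketch below; the `DatasetRecord` fields are assumptions on our part and would in practice follow from the organisation’s own Data Management Procedures.

```python
from dataclasses import dataclass

# Hypothetical dataset record; all fields are illustrative assumptions.
@dataclass
class DatasetRecord:
    name: str               # dataset identifier
    version: str            # immutable version used during development
    origin: str             # provenance / collection method
    labelling_method: str   # how ground truth was established
    known_limitations: str  # gaps, biases, exclusions
    used_for: str           # training, validation or testing

training_data = DatasetRecord(
    name="example-imaging-set",
    version="2023.4",
    origin="Retrospective collection from two sites",
    labelling_method="Dual independent annotation with adjudication",
    known_limitations="Under-represents patients under 18",
    used_for="training",
)
```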
Another item worth mentioning as part of the Detailed Description is the set of pre-determined changes to the AI system and its performance. Under the AI Act, organisations will be able to implement changes to the system’s performance in the post-market setting where such changes were approved during the pre-market review.
Pre-Market Monitoring under the AI Act
Pre-market monitoring refers to the information to be documented about the AI system that supports the monitoring, functioning (including capabilities and limitations) and control of the AI system, specifically with regard to the degrees of accuracy for the specific persons or groups of persons on which the system is intended to be used.
During the pre-market stages, organisations will further need to detail how human oversight can be established in accordance with Article 14 of the AI Act. The AI Act also requires organisations to explain the metrics used to measure the performance of the AI system and why those metrics are considered acceptable.
As with the previous section (Detailed Description), it is important to document the appropriateness of the metrics early in the development process (e.g. as input requirements), since the metrics need to align with the intended purpose for which the AI system is being developed, and you will want to ensure that the AI system can be tested against the chosen metrics (e.g. in validation tests).
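As an illustration, the sketch below evaluates a pre-defined accuracy threshold per subgroup, in the spirit of the ‘degrees of accuracy for specific persons or groups of persons’ that Annex IV asks for. The threshold, group labels and data are invented for the example; real acceptance criteria would come from the input requirements discussed above.

```python
import numpy as np
from sklearn.metrics import accuracy_score

# Acceptance threshold defined up front as an input requirement
# (the value is an illustrative assumption).
ACCEPTANCE_THRESHOLD = 0.90

# Toy validation results: labels, predictions and a subgroup marker.
y_true = np.array([1, 0, 1, 1, 0, 1, 0, 0])
y_pred = np.array([1, 0, 1, 0, 0, 1, 1, 0])
groups = np.array(["A", "A", "A", "B", "B", "B", "B", "A"])

# Report the chosen metric separately for each group of persons.
for group in np.unique(groups):
    mask = groups == group
    acc = accuracy_score(y_true[mask], y_pred[mask])
    verdict = "meets" if acc >= ACCEPTANCE_THRESHOLD else "fails"
    print(f"Group {group}: accuracy {acc:.2f} ({verdict} threshold)")
```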
Risk Management under the AI Act
We have extensively discussed the implications of the AI Act in our earlier blogpost on risk management; refer back to that post for details. Organisations will need to document their risk management activities, including the planning of the risk management process (e.g. in the form of a risk management plan) and its execution (in the form of an extensive assessment, e.g. using a requirements management system that includes a risk management module), and should draw conclusions regarding the acceptability of risks, including considerations of potential adverse impacts on persons under the age of 18 and, as appropriate, other vulnerable groups.
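Purely as a sketch, a simple risk-acceptability rule could be encoded as below. The severity and probability scales and the acceptance limit are illustrative assumptions of our own; in practice they would be defined in the organisation’s risk management plan.

```python
# Illustrative ordinal scales; a real risk management plan defines its own.
SEVERITY = {"negligible": 1, "minor": 2, "serious": 3, "critical": 4}
PROBABILITY = {"rare": 1, "occasional": 2, "frequent": 3}

def risk_acceptable(severity: str, probability: str, limit: int = 6) -> bool:
    """Accept a risk only if severity x probability stays below the limit."""
    return SEVERITY[severity] * PROBABILITY[probability] < limit

print(risk_acceptable("minor", "occasional"))  # True  (score 4)
print(risk_acceptable("serious", "frequent"))  # False (score 9)
```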
Changes to the AI System within the AI Act
As noted before, organisations will need to document changes made to the AI system throughout its defined lifecycle; as with requirements, the documentation of changes can be supported by software tools.
Organisations should further document procedures as part of the Quality Management System to assess changes, document them and determine whether they need to be notified to regulatory authorities in line with the AI Act (e.g. per Annex VII item 3.4 or 4.7, or in line with the conformity assessment procedures executed under the NLF legislation noted in Annex II Section A).
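The sketch below illustrates one possible shape for such a change assessment. The `ChangeRecord` structure and the notification rule are simplifications of our own: any change falling outside the pre-determined changes approved at conformity assessment is flagged for regulatory review.

```python
from dataclasses import dataclass

# Change types approved as pre-determined changes during the
# pre-market review (names are hypothetical).
PREDETERMINED_CHANGES = {
    "retraining-on-approved-pipeline",
    "threshold-tuning-within-bounds",
}

@dataclass
class ChangeRecord:
    change_id: str
    description: str
    change_type: str

    def requires_notification(self) -> bool:
        # Simplified rule: anything outside the approved envelope
        # is treated as potentially substantial.
        return self.change_type not in PREDETERMINED_CHANGES

change = ChangeRecord(
    change_id="CR-017",
    description="New input modality added to the model",
    change_type="new-input-modality",
)
print(change.requires_notification())  # True: outside the approved envelope
```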
Harmonised Standards supporting the AI Act
As noted in earlier blog posts, as soon as the final text has been adopted, the European Commission will issue a final Standardisation Request to the standardisation bodies, such as CEN/CENELEC (find the already issued draft here). The request will outline the standards that should support the AI Act.
It is important to monitor these activities, since the harmonised standards will need to be complied with and referenced from the Technical Documentation, which must explain whether the standards were applied in full or in part, or alternatively why they were not applied.
Declaration of Conformity
As for any other European product legislation under the NLF, organisations responsible for developing a product and bringing it to the market will need to draw up a Declaration of Conformity, attesting that their products have been developed in line with the legislation.
The Declaration of Conformity may be combined with those required under other NLF legislation, avoiding several separate declarations. A newly introduced requirement, however, is that the AI Act also requires organisations whose AI systems process personal data to declare conformance with Regulation 2016/679 (the GDPR), Regulation 2018/1725 and, potentially, Directive 2016/680 (regarding criminal offences).
Post Market Monitoring under the AI Act
Last but not least, organisations should define their systems for monitoring the AI system’s performance in the post-market phase. The Technical Documentation demands the availability of a Post-Market Monitoring Plan which defines (per Article 61) how the organisation actively and systematically collects, documents and analyses relevant data throughout the lifetime of the AI system. Again, this may be incorporated into existing Post-Market Monitoring Plans required under the NLF legislation covered by Annex II Section A (e.g. a Post-Market Surveillance plan for medical devices and in-vitro diagnostic medical devices as required by 2017/745 and 2017/746).
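As a minimal sketch of one post-market check, the code below compares accuracy over a sliding window of recent outcomes against the threshold documented pre-market. The window size, threshold and function names are our own assumptions; a real plan per Article 61 would cover far more than a single metric.

```python
from collections import deque

ACCEPTANCE_THRESHOLD = 0.90   # threshold documented pre-market (illustrative)
window = deque(maxlen=500)    # most recent (prediction == ground truth) results

def record_outcome(prediction: int, ground_truth: int) -> None:
    """Log whether a deployed prediction matched the eventual ground truth."""
    window.append(prediction == ground_truth)

def accuracy_in_window() -> float:
    return sum(window) / len(window) if window else 1.0

def needs_escalation() -> bool:
    """Flag for investigation once the window is full and accuracy drops."""
    return len(window) == window.maxlen and accuracy_in_window() < ACCEPTANCE_THRESHOLD
```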
Medical Devices (per 2017/745) and In-Vitro Diagnostic Medical Devices (per 2017/746)
For manufacturers of medical devices and in-vitro diagnostic medical devices, the AI Act clearly specifies that its Technical Documentation can be combined with the existing Technical Documentation for the medical device. There is no need to prepare a separate set of Technical Documentation; however, combining the sets is not mandatory either.
An interesting note is that manufacturers of those devices must appoint a Person Responsible for Regulatory Compliance under the respective medical device regulations. Under Article 15(3)(b), this person is responsible for ensuring that the Technical Documentation and the Declaration of Conformity are drawn up and kept up to date. As such, with the introduction of the AI Act, their responsibility is implicitly extended to cover the items required by the AI Act.