
EU AI Act Conformity Assessment: A Practical Checklist


High-risk AI systems must pass a conformity assessment before deployment. The August 2026 deadline is approaching. This checklist walks through every requirement so you know exactly what to prepare.

EuroComply Editorial Team · Published 2026-04-14
Reviewed by EU regulatory specialists against official EUR-Lex texts

If you are a provider of a high-risk AI system, you cannot deploy it until it has passed a conformity assessment. This is not a post-deployment audit; it is a prerequisite for placing the system on the market. The deadline for high-risk AI systems listed in Annex III is August 2, 2026.

This guide explains who must complete a conformity assessment, what it involves, and what to prepare.

Who Needs a Conformity Assessment

The obligation falls on providers: companies that develop high-risk AI systems and place them on the EU market or put them into service. Providers include:

  • Companies that build high-risk AI systems for their own deployment
  • Companies that sell or license high-risk AI systems to other organisations
  • Importers and distributors who place a non-EU provider's high-risk AI system on the EU market

Deployers (organisations that use a high-risk AI system built by someone else) generally do not conduct conformity assessments. Their obligations are different: registering the system, conducting fundamental rights impact assessments where required, and implementing human oversight. But if you are buying a high-risk AI system from a provider, verify that the provider has completed a conformity assessment before you deploy it.

A system is high-risk if it is listed in Annex III of the AI Act. The eight sectors are: biometrics, critical infrastructure, education, employment, access to essential services, law enforcement, migration and border management, and administration of justice and democratic processes.

Self-Assessment vs Third-Party Assessment

The route depends on the product category:

Self-assessment (internal conformity check): available for most Annex III high-risk AI systems. The provider conducts the assessment internally against the requirements and issues an EU Declaration of Conformity.

Third-party notified body assessment: required for AI systems embedded in or constituting a safety component of products already subject to the EU harmonisation legislation listed in Annex I, including medical devices, machinery, radio equipment, civil aviation, and motor vehicles. These products require assessment by an accredited notified body. Within Annex III itself, biometric systems (point 1) also require a notified body unless the provider has applied harmonised standards in full.

If your AI system is a standalone software application (HR screening tool, fraud detection model, credit scoring engine), self-assessment is likely the correct route. If it is embedded in regulated hardware, verify whether a notified body is required.

The 7 Requirements (Articles 9–15)

A conformity assessment verifies that the system meets all of these:

1. Risk management system (Article 9). A documented risk management process that runs throughout the system's lifecycle: identification of known and foreseeable risks, risk estimation and evaluation, and risk mitigation. The process must be iterative and updated as the system evolves.

2. Data and data governance (Article 10). Training, validation, and testing datasets must be subject to data governance practices: relevance, representativeness, freedom from errors, completeness. Biases must be identified and addressed. Data used for training must be documented.

3. Technical documentation (Article 11 and Annex IV). Comprehensive documentation that allows competent authorities to assess compliance. See the detailed checklist below.

4. Record-keeping and automatic logging (Article 12). The system must automatically log events throughout its operation to the extent technically feasible. Logs must enable post-market monitoring and investigation of incidents.

5. Transparency and information to deployers (Article 13). Providers must supply deployers with an "instructions for use" document covering: the system's identity and purpose; the level of accuracy and performance; limitations; the types of input data; any known biases; expected lifetime and maintenance needs; and human oversight measures.

6. Human oversight (Article 14). The system must be designed to allow humans to monitor, understand, and intervene. This includes: the ability to interpret outputs; the ability to override or stop the system; awareness of automation bias risk; and appropriate human oversight measures built into the system design.

7. Accuracy, robustness, and cybersecurity (Article 15). The system must be designed and developed to achieve appropriate levels of accuracy and to behave consistently. Resilience against errors, faults, and adversarial attacks must be tested and documented. Cybersecurity measures must be proportionate to the risk.
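The record-keeping requirement (Article 12) is the one most naturally expressed in code. Below is a minimal sketch of structured, timestamped event logging in JSON Lines format. The event schema and field names are assumptions for illustration; the Act does not prescribe a log format, only that events be automatically recorded and usable for post-market monitoring and incident investigation.

```python
# Minimal sketch of Article 12-style automatic event logging.
# The record schema is an assumption, not a prescribed format.
import json
from datetime import datetime, timezone

def log_event(log_file, event_type: str, details: dict) -> dict:
    """Append one timestamped, structured event record (JSON Lines)."""
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "event_type": event_type,   # e.g. "inference", "override", "fault"
        "details": details,
    }
    log_file.write(json.dumps(record) + "\n")
    return record

# Usage: log each inference so incidents can be reconstructed later.
with open("ai_system_events.jsonl", "a") as f:
    log_event(f, "inference", {"model_version": "1.4.2", "decision": "flagged"})
```

An append-only, one-record-per-line format like this keeps logs machine-parseable for audits without coupling the system to any particular monitoring stack.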

Technical Documentation Checklist (Annex IV)

Annex IV lists 14 elements that technical documentation must cover. Work through each before your assessment:

| # | Element |
|---|---------|
| 1 | General description: intended purpose, interactions with hardware/software, versions |
| 2 | Detailed description of components: algorithms, logic, key design choices, limitations |
| 3 | Description of the monitoring, functioning, and control of the system |
| 4 | Description of the human oversight measures |
| 5 | Technical specifications: computing power, key performance metrics |
| 6 | Training methodology: approaches, techniques, tools, data used |
| 7 | Validation and testing procedures: protocols, methodologies, results |
| 8 | Cybersecurity measures |
| 9 | Description of relevant changes made during lifecycle |
| 10 | List of harmonised standards applied; where not applied, description of solutions |
| 11 | Copy of EU Declaration of Conformity |
| 12 | Post-market monitoring plan |
| 13 | Summary of the risk management system |
| 14 | Human oversight measures and their technical implementation |
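Teams that track documentation in code can represent the checklist as data and surface gaps automatically. The sketch below is a hypothetical helper, with element titles paraphrased from the table above; it is not an official artefact of the assessment process.

```python
# Hypothetical checklist tracker for the Annex IV documentation elements
# listed in the table above (titles paraphrased).
ANNEX_IV_ELEMENTS = [
    "General description", "Components and design choices",
    "Monitoring, functioning, and control", "Human oversight measures",
    "Technical specifications", "Training methodology",
    "Validation and testing procedures", "Cybersecurity measures",
    "Lifecycle changes", "Harmonised standards applied",
    "EU Declaration of Conformity copy", "Post-market monitoring plan",
    "Risk management summary", "Oversight technical implementation",
]

def missing_elements(completed: set[str]) -> list[str]:
    """Return the Annex IV elements not yet documented."""
    return [e for e in ANNEX_IV_ELEMENTS if e not in completed]

done = {"General description", "Cybersecurity measures"}
print(f"{len(missing_elements(done))} of {len(ANNEX_IV_ELEMENTS)} elements outstanding")
```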

EU Declaration of Conformity (Article 47)

After completing the conformity assessment, the provider must draw up an EU Declaration of Conformity. This document must state:

  • The name and address of the provider
  • The AI system's identification (name, type, version number)
  • That the AI system complies with the AI Act and any other applicable EU law
  • The conformity assessment procedure applied (Annex VI for self-assessment or Annex VII for notified body)
  • The notified body's name, identification number, and certificate number (if applicable)
  • Place and date of issue, name and signature of the authorised person

The Declaration of Conformity must be kept for 10 years after the system is placed on the market or put into service.
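One way to keep the mandatory declaration fields from being omitted is to model them as a typed record. The sketch below is an assumption-laden illustration (the field names are invented, not taken from the Act); it simply enforces that notified body details accompany the Annex VII route.

```python
# Hypothetical model of the Article 47 declaration fields.
# Field names are assumptions made for this sketch.
from dataclasses import dataclass
from typing import Optional

@dataclass
class DeclarationOfConformity:
    provider_name: str
    provider_address: str
    system_name: str
    system_version: str
    procedure: str                  # "Annex VI" (self) or "Annex VII" (notified body)
    place_of_issue: str
    date_of_issue: str
    signatory: str
    notified_body_id: Optional[str] = None  # required only for Annex VII

    def __post_init__(self):
        if self.procedure == "Annex VII" and self.notified_body_id is None:
            raise ValueError("notified body details required for Annex VII route")
```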

CE Marking

For AI systems that are safety components of Annex I products (medical devices, machinery, etc.), the CE mark must be affixed after a successful conformity assessment. CE marking signals conformity with applicable EU law. For standalone software AI systems not embedded in Annex I products, CE marking is not required.

Registration (Article 49)

Before deploying a high-risk AI system, providers must register it in the EU database for high-risk AI systems maintained by the European Commission. Registration is required before placing on the market or putting into service. The database is publicly accessible.

Registration requires: provider identity, system name and version, intended purpose, countries of deployment, status (on market / withdrawn / recalled), Declaration of Conformity reference, instructions for use, contact for post-market monitoring.
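A pre-submission check that every required registration field is present can be sketched as a simple set difference. The key names below are assumptions; the EU database defines its own schema.

```python
# Illustrative pre-submission check for the registration fields listed
# above. Key names are assumptions, not the EU database schema.
REQUIRED_FIELDS = {
    "provider_identity", "system_name", "system_version", "intended_purpose",
    "deployment_countries", "status", "doc_reference",
    "instructions_for_use", "monitoring_contact",
}

def validate_registration(payload: dict) -> list[str]:
    """Return the names of any missing required registration fields."""
    return sorted(REQUIRED_FIELDS - payload.keys())
```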

Timeline and Practical Preparation

The high-risk AI compliance deadline is August 2, 2026. The assessment must be completed before deployment, not merely by that date: August 2, 2026 is when affected systems must already be compliant.

10-Item Preparation Checklist (start 6 months before deadline)

  1. Confirm your system falls under Annex III and identify the specific sector and use case.
  2. Determine whether self-assessment or a notified body is required.
  3. Appoint a compliance lead responsible for the assessment process.
  4. Conduct a gap analysis against Articles 9–15, documenting the current state for each requirement.
  5. Build or update the risk management system and document the process.
  6. Audit training and validation data for representativeness, bias, and completeness.
  7. Prepare all 14 elements of Annex IV technical documentation.
  8. Implement and test human oversight mechanisms in the system design.
  9. Run accuracy, robustness, and adversarial testing; document results.
  10. Draw up the EU Declaration of Conformity and register in the EU database.
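The gap analysis in step 4 can be tracked with a simple status map over the seven requirements. This is a hypothetical sketch: the status values and output format are assumptions for illustration.

```python
# Hypothetical gap-analysis tracker for step 4 of the checklist above:
# one status entry per requirement (Articles 9-15), surfacing open gaps.
REQUIREMENTS = {
    9: "Risk management system",
    10: "Data and data governance",
    11: "Technical documentation",
    12: "Record-keeping and logging",
    13: "Transparency to deployers",
    14: "Human oversight",
    15: "Accuracy, robustness, cybersecurity",
}

def open_gaps(status: dict[int, str]) -> list[str]:
    """List requirements not yet marked 'compliant'."""
    return [
        f"Article {art}: {name}"
        for art, name in REQUIREMENTS.items()
        if status.get(art) != "compliant"
    ]

status = {9: "compliant", 10: "in_progress", 11: "compliant"}
for gap in open_gaps(status):
    print(gap)
```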

Last updated: April 2026. For informational purposes only; not legal advice.


EuroComply Editorial Team

EU regulatory compliance specialists covering the AI Act, GDPR, NIS2, and related legislation. Content reviewed against official EU regulation texts and enforcement guidance.
