EU AI Act compliance for SMEs
Plain-English EU AI Act compliance guide for SMEs: scope, deadlines, evidence, Article 4 training, high-risk AI checks, and a 30/60/90-day plan.
Direct answer
SMEs must comply with the EU AI Act when they provide or deploy AI systems affecting people in the EU. Start by inventorying AI use, classifying risk, training staff, collecting vendor evidence, and preparing high-risk controls before the August 2, 2026 deadline.
- Inventory each AI system and assign an owner.
- Classify risk under Article 5, Article 6 and Annex III.
- Train relevant staff and retain Article 4 evidence.
- Collect vendor documentation and provider instructions.
- Prepare high-risk controls before August 2, 2026.
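The inventory step above can be sketched as a minimal register. This is an illustrative sketch only: the field names mirror the checklist (owner, vendor, purpose, data categories, risk tier) and are not a schema mandated by the AI Act.

```python
from dataclasses import dataclass, field

# Minimal AI register entry. Field names mirror the checklist above and
# are illustrative, not a schema required by the AI Act.
@dataclass
class AISystemEntry:
    name: str
    owner: str                      # accountable human owner
    vendor: str
    purpose: str                    # intended use, in plain language
    data_categories: list = field(default_factory=list)
    risk_tier: str = "unclassified" # prohibited / high / limited / minimal

register = [
    AISystemEntry(
        name="CV screening tool",
        owner="HR lead",
        vendor="ExampleVendor",     # hypothetical vendor name
        purpose="Shortlist job applicants",
        data_categories=["CVs", "contact details"],
    ),
]

# Flag entries still awaiting an Article 6 / Annex III assessment.
unclassified = [e.name for e in register if e.risk_tier == "unclassified"]
print(unclassified)
```

Even a spreadsheet works for this step; the point is that every system has a named owner and a recorded risk-tier decision.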
| Key fact | Detail |
|---|---|
| AI literacy | In force since 2025-02-02 |
| High-risk deadline | 2026-08-02 |
| Maximum AI Act fine | EUR 35 million or 7% of worldwide annual turnover, whichever is higher, for prohibited practices |
The EU AI Act applies to SMEs that provide or deploy AI systems affecting people in the EU. Most SMEs start as deployers: they must inventory AI use, train staff, classify risk, keep evidence, and meet high-risk obligations where Annex III applies.
AI Act SME compliance checklist
| Step | Action | Articles |
|---|---|---|
| Inventory | List every internal and customer-facing AI tool, with owner, vendor, purpose, data categories, user group and deployment status. | Articles 3, 4, 26 |
| Classify risk | Separate prohibited, high-risk, limited-risk and minimal-risk use. Pay special attention to Annex III areas such as employment, education, credit, health and essential services. | Articles 5, 6, 50 and Annex III |
| Govern deployment | Assign a human owner, define intended use, keep logs where available, follow provider instructions and record monitoring decisions. | Article 26 |
| Train staff | Provide AI literacy training to staff who procure, use, supervise or govern AI tools. Retain completion records and training content. | Article 4 |
| Collect vendor evidence | Collect provider instructions, risk classification, data information, transparency notices, security controls and incident-handling commitments. | Articles 13, 15, 16, 26 |
| Prepare high-risk controls | For Annex III systems, document human oversight, accuracy monitoring, data governance, incident escalation and fundamental-rights impact assessment triggers. | Articles 9-15, 26, 27, 73 |
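The risk-classification step can be approximated as a first-pass triage. This is a simplified sketch: the Annex III area list below is an illustrative subset, not the legal text, and a real classification still needs an Article 5 and Article 6 review with legal sign-off.

```python
# Simplified first-pass triage of a deployer's AI use case.
# The Annex III area set is an illustrative subset of the regulation's
# categories, not the full legal text; treat the output as a starting
# point for legal review, never as the final classification.
ANNEX_III_AREAS = {"employment", "education", "credit", "health", "essential services"}

def triage_risk_tier(area: str, prohibited_practice: bool, user_facing: bool) -> str:
    if prohibited_practice:       # Article 5 screen comes first
        return "prohibited"
    if area in ANNEX_III_AREAS:   # Article 6 / Annex III check
        return "high"
    if user_facing:               # Article 50 transparency duties may apply
        return "limited"
    return "minimal"

print(triage_risk_tier("employment", False, True))  # CV screening example
print(triage_risk_tier("marketing", False, True))
```

Running the Article 5 screen before the Annex III check mirrors the order in the checklist: a prohibited practice must be stopped, not documented.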
EU AI Act compliance for SMEs: who must comply?
SMEs comply based on their AI role, not company size. A business that only buys AI software can still be a deployer; a business that ships AI features under its own name can become a provider.
| SME role | Trigger | Evidence to keep | Article |
|---|---|---|---|
| Deployer | Your SME uses an AI system under its own authority for staff, customers or operations. | AI inventory entry, intended-use note, owner, provider instructions, oversight owner and monitoring record. | Articles 3, 4, 26 |
| Provider | Your SME develops an AI system or places it on the EU market under its own name or trademark. | Risk-management file, technical documentation, conformity route, instructions for use and post-market monitoring. | Articles 3, 9-18, 72 |
| Importer or distributor | Your SME makes a third-country AI system available in the EU or distributes a provider's system. | Provider identity checks, documentation checks, conformity marking evidence and escalation records. | Articles 23, 24 |
AI inventory
Shows what AI exists, who owns it and whether each system is used within its intended scope.
Evidence: System name, vendor, owner, purpose, users, data categories and risk-tier decision.
Article 4 AI literacy record
AI literacy obligations have applied since February 2, 2025.
Evidence: Training content, attendee list, completion date and role-based training notes.
Risk classification rationale
Separates prohibited, high-risk, limited-risk and minimal-risk systems.
Evidence: Article 5 screen, Article 6 and Annex III assessment, and reviewer sign-off.
Vendor documentation pack
Most SMEs rely on external AI providers and need provider instructions.
Evidence: Instructions for use, security notes, transparency notices, data terms and support contacts.
High-risk control file
Annex III systems need controls before the August 2, 2026 deadline.
Evidence: Human oversight, logs, accuracy monitoring, data governance, incident route and FRIA decision.
Key deadlines
| Date | Obligation | Article |
|---|---|---|
| 2025-02-02 | AI literacy and prohibited AI practices: Article 4 AI literacy obligations and Article 5 prohibited-practice bans are already in force. | Articles 4, 5 and 113 |
| 2025-08-02 | General-purpose AI model obligations: GPAI provider documentation, policy and transparency obligations started applying. | Articles 53 and 113 |
| 2026-08-02 | High-risk AI obligations: most Annex III high-risk obligations apply, including documentation, oversight, logs and risk management. | Articles 6, 9-15, 26 and 113 |
| 2027-08-02 | Certain legacy and product-safety AI obligations: additional obligations apply for some AI systems tied to Annex I product-safety regimes and for legacy systems. | Article 113 |
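The deadline table lends itself to a small planning helper. The dates below are taken directly from the table (the Article 113 timeline); the function simply reports which obligations already apply on a given day.

```python
from datetime import date

# Application dates from the deadline table (Article 113 timeline).
DEADLINES = {
    date(2025, 2, 2): "AI literacy and prohibited practices",
    date(2025, 8, 2): "General-purpose AI model obligations",
    date(2026, 8, 2): "High-risk AI obligations",
    date(2027, 8, 2): "Legacy and product-safety AI obligations",
}

def obligations_in_force(today: date) -> list:
    """Return, in date order, the obligations whose application date has passed."""
    return [label for d, label in sorted(DEADLINES.items()) if d <= today]

print(obligations_in_force(date(2026, 1, 1)))
```

On 2026-01-01, for example, the AI literacy and GPAI obligations already apply while the high-risk deadline is still ahead, which is exactly the window the 30/60/90-day plan below is meant to use.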
30/60/90-day action plan
First 30 days
Inventory AI use and assign owners
Evidence needed: AI register, owner list, vendor list, use-case descriptions
Articles 3, 4, 26
Days 31-60
Classify risk and collect vendor evidence
Evidence needed: Risk-tier decisions, Annex III notes, provider instructions, transparency notices
Articles 5, 6, 13, 16, 26, Annex III
Days 61-90
Close high-risk gaps and document controls
Evidence needed: Human oversight procedure, logs, staff training records, incident process, FRIA decision
Articles 4, 9-15, 26, 27, 73
SME AI Act questions answered
What does EU AI Act compliance for SMEs include?
EU AI Act compliance for SMEs includes an AI inventory, role classification, risk-tier assessment, Article 4 AI literacy evidence, vendor documentation, human oversight and high-risk controls where Annex III applies.
Does a small business using ChatGPT need AI Act compliance?
A small business using ChatGPT or another AI tool may be a deployer under the AI Act if the system is used under its authority. The first obligations are inventory, staff literacy, intended-use controls and evidence retention.
Turn this guide into a tracked action plan
Use AI X-Ray to classify real systems, then import the result into the dashboard as evidence-backed AI register tasks.
Informational only. This page is not legal advice and does not replace a qualified legal review of your AI systems.