
EU Compliance Checklist for Tech Startups in 2026


Launching or scaling in the EU? This checklist covers GDPR, EU AI Act, NIS2, CRA, and DSA obligations by company stage — so you know exactly what to prioritize and when.

Source: EuroComply Editorial (2026-04-14). Reviewed by the EuroComply Team, EU regulatory specialists. Content reviewed against official EUR-Lex texts.

EU regulatory obligations for tech companies are not one-size-fits-all. What applies to your startup depends on three things: what your product does, who you sell to, and how large your company is. Not all regulations apply from day one — and conflating them wastes compliance resources on non-issues while leaving real gaps unaddressed.

This checklist is structured by company stage. Work through the stage that matches where you are now, then read ahead to the next stage.


Pre-Launch / MVP Stage

Your first EU compliance obligation is not a lengthy certification process — it is understanding your personal data flows and putting basic GDPR foundations in place before you handle any EU user data.

Legal entity and representation

  • If you have no EU legal entity, appoint an EU representative under GDPR Art. 27 before you collect data from EU residents. An EU representative is a named legal or natural person in the EU who can be contacted by DPAs on your behalf.
  • If you process special category data (health, biometric, genetic data, religion, political opinions, sexual orientation) at scale, assess whether you need to appoint a Data Protection Officer (Art. 37). Most early-stage startups do not qualify for mandatory DPO appointment, but document the assessment.

Data governance foundations

  • Write a privacy policy that complies with GDPR Arts. 13–14: lawful basis for each processing activity, data subject rights, retention periods, third-party processors named
  • Write a cookie policy covering consent requirements under the ePrivacy Directive — analytics and advertising cookies require opt-in consent; strictly necessary cookies do not
  • Build a Records of Processing Activities (ROPA) document (Art. 30) — a register of what personal data you collect, why, the legal basis, who sees it, and how long you keep it. Start this now, even as a simple spreadsheet
  • Identify and document the lawful basis for each processing activity before you start processing
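A ROPA needs no special tooling at this stage. As a sketch of what "a simple spreadsheet" can look like in practice, the following keeps one row per processing activity and exports it as CSV. The field names and the example entry are our own illustration, not a layout prescribed by Art. 30:

```python
# Illustrative minimal ROPA register (GDPR Art. 30) as structured records.
# Field names are a suggested convention, not mandated by the regulation.
from dataclasses import dataclass, asdict
import csv

@dataclass
class ProcessingActivity:
    purpose: str           # why the data is processed
    data_categories: str   # e.g. "email, name, IP address"
    lawful_basis: str      # Art. 6 basis, e.g. "contract", "consent"
    recipients: str        # who sees the data (teams, processors)
    retention: str         # how long the data is kept

activities = [
    ProcessingActivity(
        purpose="Account creation",
        data_categories="email, name",
        lawful_basis="contract (Art. 6(1)(b))",
        recipients="engineering, hosting provider",
        retention="account lifetime + 30 days",
    ),
]

# Export as CSV — a spreadsheet is an acceptable starting point.
with open("ropa.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=list(asdict(activities[0]).keys()))
    writer.writeheader()
    for a in activities:
        writer.writerow(asdict(a))
```

Keeping the register machine-readable from day one makes it trivial to extend when auditors or a future DPO ask for it.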

AI features

  • If your MVP includes any AI feature, identify whether it falls under the EU AI Act's prohibited practices (already in force from August 2025). Social scoring, subliminal manipulation, and emotion recognition in workplaces are banned outright regardless of company size.
  • Implement AI literacy practices for any staff using AI tools (Art. 4, in force since February 2025) — even informally at this stage; document what AI tools you use and brief the team on their limitations.

Post-Launch / Growing Stage (Up to 50 Employees)

Once you have EU users and a functioning product, the compliance surface expands.

GDPR security and incident readiness

  • Complete a GDPR Art. 32 security review — document the technical and organisational measures protecting personal data (access controls, encryption at rest and in transit, backup and recovery, access logs)
  • Build a breach notification procedure: who is notified internally, who notifies the DPA within 72 hours, and who notifies affected data subjects if the breach creates high risk
  • Negotiate Data Processing Agreements (Art. 28) with all processors handling EU personal data on your behalf — this includes cloud providers, analytics tools, customer support platforms, and email service providers

EU AI Act (if you have AI features)

  • Assess each AI feature against Annex III of the EU AI Act to determine if any qualifies as high-risk. The list covers: CV screening, credit scoring, critical infrastructure components, admissions systems, law enforcement tools, migration risk assessment, and electoral influence tools.
  • If no features are high-risk, document this assessment — regulators will ask. Most early-stage B2B SaaS tools do not have high-risk AI systems, but the assessment must be made and recorded.
  • If you have high-risk AI features, begin compliance documentation now: risk management system, data governance for training data, technical documentation (Annex IV), human oversight design. The compliance deadline is August 2, 2026 — lead time is typically 9–18 months.
  • Implement AI literacy training (Art. 4) for all staff using AI systems — a structured briefing or e-learning module is sufficient; document completion.
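The Annex III screening above can be structured as a simple, recorded assessment. This sketch is illustrative only: the area list is abbreviated, the matching is deliberately crude, and the feature names are hypothetical — it organises the documented assessment, it does not replace reading Annex III:

```python
# Illustrative Annex III screening helper. The area list below is an
# abbreviated paraphrase of Annex III, not the legal text.
ANNEX_III_AREAS = {
    "employment": "CV screening, recruitment, worker management",
    "credit": "creditworthiness assessment and credit scoring",
    "infrastructure": "safety components of critical infrastructure",
    "education": "admissions and exam scoring",
    "law_enforcement": "law enforcement risk assessment tools",
    "migration": "migration, asylum and border control",
    "democracy": "tools intended to influence elections",
}

def screen_feature(name: str, areas: set[str]) -> dict:
    """Record whether a feature touches any Annex III area.

    Returns a structured record rather than a bare bool, so the
    assessment regulators will ask for is captured either way.
    """
    hits = sorted(areas & ANNEX_III_AREAS.keys())
    return {
        "feature": name,
        "annex_iii_areas": hits,
        "potentially_high_risk": bool(hits),
    }

# Negative assessments should be documented too.
result = screen_feature("invoice-autotagger", {"accounting"})
```

A feature flagged `potentially_high_risk` then goes to proper legal review; a clean result is filed as the documented "not high-risk" assessment.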

DSA (if you have a platform with user-generated content)

  • Review basic DSA obligations: terms of service compliant with Art. 14, single point of contact for authorities (Art. 11), transparency report if you moderate content (Art. 15)
  • If users can post content, implement a notice-and-action mechanism for illegal content (Art. 16)

Scale-Up Stage (50–250 Employees)

At this stage, you may be crossing thresholds that bring NIS2, more demanding AI Act obligations, or CRA into scope.

NIS2 (digital infrastructure and services)

  • Assess whether your company qualifies as a covered entity under NIS2. Check NIS2 Annexes I and II: managed service providers, cloud providers, data centres, CDN providers, DNS service providers, and domain registrars appear in Annex I; online marketplaces and search engines in Annex II. You are covered if you operate in one of these sectors and are medium-sized (≥50 employees or >€10M turnover) or large.
  • If NIS2 applies: register with your national NIS authority, implement the 10 Art. 21 security measure categories (risk analysis, incident handling, business continuity, supply chain security, access controls, MFA, cryptography, training, vulnerability handling, HR security), and establish the dual incident notification procedure (24h early warning; 72h incident notification to CSIRT).
  • Ensure management body accountability — NIS2 Art. 20 requires the management body to approve the security measures and oversee their implementation; its members can be held personally liable for infringements.
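The dual notification procedure is easiest to internalise as a clock that starts when you become aware of a significant incident: a 24-hour early warning, then a 72-hour incident notification, both to your CSIRT. A minimal sketch of computing those deadlines (timestamps are illustrative):

```python
# NIS2 Art. 23 dual-notification clock, counted from awareness of a
# significant incident. Deadline arithmetic only — the content of each
# notification is defined by the Directive, not modelled here.
from datetime import datetime, timedelta, timezone

def nis2_deadlines(became_aware: datetime) -> dict:
    return {
        "early_warning_due": became_aware + timedelta(hours=24),
        "incident_notification_due": became_aware + timedelta(hours=72),
    }

# Example: awareness at 09:00 UTC on 1 March 2026.
aware = datetime(2026, 3, 1, 9, 0, tzinfo=timezone.utc)
deadlines = nis2_deadlines(aware)
```

Wiring this into your incident-response tooling ensures the early warning is never an afterthought discovered on day three.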

EU AI Act (high-risk systems)

  • If you identified high-risk AI systems and have not yet completed compliance: the August 2, 2026 deadline is binding. Complete the conformity assessment, register the system in the EU database (Art. 49 — required for Annex III high-risk systems), and prepare the EU Declaration of Conformity.
  • If you are a GPAI model provider (you develop a general-purpose AI model): GPAI obligations have applied since August 2, 2025. These include technical documentation, copyright policy, and training data summaries. Models with systemic risk face adversarial testing and incident reporting requirements.

CRA (if you sell software or connected products in the EU)

  • Assess whether any product you sell constitutes a product with digital elements under the CRA. Software made available on the EU market — mobile apps, desktop applications, firmware, software components — is in scope. Pure SaaS is generally out of scope, but remote data processing that is integral to a product with digital elements is covered.
  • Classify each product (Default, Important Class I/II, Critical) using CRA Annex III and IV.
  • Begin Annex I gap analysis: audit each product against the essential cybersecurity requirements (secure by default, no known exploits at placement, minimal attack surface, access controls, integrity, logging).
  • Build vulnerability handling processes (coordinated disclosure policy, 24h exploited vulnerability reporting to ENISA/CSIRT) — required from September 2026.
  • Plan conformity assessment for Important Class I/II products — notified bodies have limited capacity; start early.

Priority Matrix: Which Regulation Applies When

| Regulation | Pre-Launch/MVP | Growing (≤50 employees) | Scale-Up (50–250 employees) |
|------------|---------------|--------------------------|------------------------------|
| GDPR | Applies (basics required before first EU user) | Applies (full compliance required) | Applies (full compliance required) |
| EU AI Act (prohibited practices) | Applies (from Aug 2025) | Applies | Applies |
| EU AI Act (high-risk systems) | Assess only | Assess + document | Comply by Aug 2026 |
| NIS2 | Not yet | Not yet (below threshold) | Applies if in covered sector |
| CRA | Not yet (pre-market) | Assess if selling products | Comply (reporting from Sep 2026; full from Dec 2027) |
| DSA | Basic tier if hosting user content | Basic/platform tier | Platform or VLOP tier depending on user numbers |


Common Mistakes

Assuming GDPR covers everything. GDPR governs personal data. It does not cover operational security of your networks (NIS2), the security properties of your products (CRA), or the safety of your AI systems (EU AI Act). These are separate legal frameworks with separate obligations.

Ignoring the EU AI Act because "we just use APIs." Using a third-party AI API does not make you a provider — but it does make you a deployer. Deployers have real obligations under the AI Act, including human oversight for high-risk systems and AI literacy for staff. The Act applies to how you integrate and use AI, not just to those who build the underlying model.

Missing CRA because software seems non-physical. The CRA explicitly covers software products. Firmware, mobile apps, desktop applications, and software components that process data locally are in scope if they meet the definition of a "product with digital elements." The misconception that CRA only covers hardware is widespread but incorrect.


Start Here: Three Immediate Actions for Any EU Tech Startup

Regardless of stage, these three actions should happen before anything else:

  1. Inventory your data flows — list every category of personal data you collect, map where it flows, identify third-party processors, and document your lawful basis for each processing purpose. This is the foundation for both GDPR compliance and any AI Act assessment.

  2. Document your AI systems — for every AI tool you use or build, record: what it does, what data it processes, who it makes decisions about, and whether any of those decisions produce significant effects on individuals. This inventory is the starting point for EU AI Act classification.

  3. Assign compliance ownership — someone in your organisation needs to own each regulation. For an early-stage startup, this is usually the CTO or a senior engineer. Compliance without an owner produces only paper without substance.
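Actions 1 and 2 can live in a single machine-readable inventory from day one. The structure and field names below are our own convention (and the example entries are hypothetical), but keeping both registers in one versioned file makes GDPR mapping and AI Act screening start from the same source of truth:

```python
# Illustrative combined inventory: personal-data flows (action 1) and
# AI systems (action 2). Structure is a suggested convention, not a
# format mandated by any regulation.
import json

inventory = {
    "data_flows": [
        {
            "data_category": "email address",
            "source": "signup form",
            "processors": ["mail provider"],  # hypothetical third party
            "lawful_basis": "contract (Art. 6(1)(b))",
        }
    ],
    "ai_systems": [
        {
            "name": "support-ticket triage",  # hypothetical feature
            "data_processed": "ticket text",
            "decisions_about": "customers",
            "significant_effects": False,     # feeds AI Act screening
        }
    ],
}

# Commit this file to version control alongside the code it describes.
with open("compliance_inventory.json", "w") as f:
    json.dump(inventory, f, indent=2)
```

Because it is plain JSON under version control, every new feature or processor shows up as a reviewable diff — which is exactly the audit trail the compliance owner from action 3 needs.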


Last updated: April 2026. For informational purposes only — not legal advice.
