GDPR, NIS2, the AI Act, MiCA: which EU rules apply to your software?
Four EU regimes can reach a software product, and most teams aren't sure which. This is the plain-English map — a "does this apply to me" test for each, with links to the official texts.
Ask a software team "which EU regulations apply to your product" and the honest answer is usually a shrug. Four regimes can reach a modern software product — the GDPR, NIS2, the EU AI Act and MiCA — and each has a different trigger, so the same company can be fully covered by one and entirely outside another. This guide is the map: a plain-English "does this apply to you" test for each, how they overlap, and a link to every official text so you can go to the source.
This is an engineering-side orientation map, not legal advice. The tests below are the plain-English version; the precise scope of each regime has edge cases, and a borderline classification has real legal consequences. Use this to know which regimes to take seriously, then confirm the specifics with a qualified lawyer.
GDPR — if you touch personal data
The General Data Protection Regulation is the oldest and broadest of the four. The trigger is simple: if you process the personal data of people in the EU, it applies — regardless of where your company is. "Personal data" is wide (a name, an email, an IP address, a user ID) and "process" is wider still (storing, displaying, analysing). For practical purposes, almost every software product with EU users is in scope.
What it asks for: a lawful basis for processing, transparency with users, honouring data-subject rights (access, deletion, portability), a data-processing agreement with anyone who processes data on your behalf, security appropriate to the risk, and a DPIA for higher-risk processing. The full text is the official source: the GDPR on EUR-Lex.
NIS2 — if you're an important or essential entity
NIS2 is the EU's cybersecurity directive, and unlike GDPR it does not catch everyone. It applies to organisations that are classified as "essential" or "important" entities — a status determined by your sector and your size. The sectors are listed (they include digital infrastructure, ICT service management, and a range of others), and the size thresholds generally bring in medium and large organisations rather than the smallest.
If you are in scope, NIS2 asks for cybersecurity risk-management measures, governance accountability, and incident reporting on a defined timeline. Because it's a directive, it takes effect through each member state's national transposition, so the exact local detail varies. Run the test honestly — sector plus size — and if you plausibly qualify, treat it as in scope. The official source: the NIS2 Directive on EUR-Lex.
The EU AI Act — if you build or deploy AI
The EU AI Act applies if you provide or deploy an AI system, and it is phasing in on a staggered timeline. It does not regulate "AI" as one thing — it sorts AI systems into risk tiers (prohibited, high-risk, limited-risk with transparency obligations, and minimal-risk) and scales the obligations accordingly. Most B2B software with an AI feature lands in the limited-risk tier, where the duty is disclosure: tell people they're interacting with an AI, and label AI-generated content.
Like the GDPR, it reaches non-EU companies whose AI output is used in the EU. We covered the classification process in depth in our EU AI Act practical guide; the official source is the AI Act on EUR-Lex.
MiCA — if you touch crypto-assets
MiCA — the Markets in Crypto-Assets Regulation — is the narrowest of the four by trigger: it applies if your product issues, trades, or provides services around crypto-assets. If you have nothing to do with crypto-assets, MiCA does not touch you. If you do, the treatment depends on what the asset actually is — a payment-like token, an asset-referenced token, and other categories are handled differently, and a token that functions purely as a voucher or a loyalty point may fall outside the crypto-asset definition entirely.
This is the regime where the "confirm with counsel" advice is sharpest: the classification of a specific token is a legal determination with significant consequences. If you're building anything token-shaped, get the classification confirmed before you design around an assumption. The official source: MiCA on EUR-Lex.
How the regimes overlap
These are not mutually exclusive — they stack. A fintech SaaS that handles customer data (GDPR), is large enough in a covered sector (NIS2), ships an AI-powered feature (AI Act), and offers a crypto-asset product (MiCA) is genuinely subject to all four at once. More common: a B2B SaaS is firmly under GDPR, plausibly under NIS2 once it grows, under the AI Act's transparency tier the moment it adds an AI feature, and entirely outside MiCA. The point is that "are we compliant?" is not one question — it is four questions, and they have four different answers.
What to do: a triage
Don't try to "become compliant" as one project. Triage instead — run each test, per product, and write the answer down.
GDPR · do you process personal data of people in the EU? Almost certainly yes — treat it as in scope and move on to the specifics.
NIS2 · are you in a covered sector, above the size thresholds? If plausibly yes, treat it as in scope until confirmed otherwise.
AI Act · do you provide or deploy an AI system? If yes, classify it by use-case risk tier — most software is limited-risk.
MiCA · do you issue, trade or service crypto-assets? If no, you're out. If yes, get the asset's classification confirmed with counsel before designing.
Write it down · a short, dated note per product recording each answer and the reasoning. That document is what an enterprise customer's procurement team, an investor's due diligence, or a regulator will ask for first.
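The triage above is mechanical enough to sketch in code. As an illustration only (the structure, field names, and example answers below are ours, not anything the regulations prescribe), a per-product triage record that produces the short, dated note might look like this:

```python
from dataclasses import dataclass, field
from datetime import date


# One answer per regime: in scope or not, plus the reasoning.
# Illustrative structure only; not prescribed by any regulation.
@dataclass
class RegimeAnswer:
    in_scope: bool
    reasoning: str


@dataclass
class ComplianceTriage:
    product: str
    recorded_on: date = field(default_factory=date.today)
    answers: dict = field(default_factory=dict)  # regime name -> RegimeAnswer

    def record(self, regime: str, in_scope: bool, reasoning: str) -> None:
        self.answers[regime] = RegimeAnswer(in_scope, reasoning)

    def note(self) -> str:
        """Render the short, dated note the triage asks you to write down."""
        lines = [f"Compliance triage: {self.product} ({self.recorded_on.isoformat()})"]
        for regime, ans in self.answers.items():
            status = "IN SCOPE" if ans.in_scope else "out of scope"
            lines.append(f"- {regime}: {status} ({ans.reasoning})")
        return "\n".join(lines)


# Hypothetical answers for a typical B2B SaaS, per the tests above.
triage = ComplianceTriage("Example B2B SaaS")
triage.record("GDPR", True, "processes personal data of EU users")
triage.record("NIS2", False, "below size threshold; revisit as the company grows")
triage.record("AI Act", True, "AI feature in scope; limited-risk transparency tier")
triage.record("MiCA", False, "no crypto-asset activity")
print(triage.note())
```

The output is exactly the artefact the last step calls for: one dated note per product, with a yes/no and the reasoning for each regime, ready to hand to procurement, due diligence, or counsel.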
How DField Solutions handles compliance
We run this triage at the start of every engagement, because the regulatory layer shapes architecture and is far cheaper designed in than retrofitted. Every project ships with the documentation its regimes call for — GDPR data-processing agreements and a processing inventory, an AI Act risk classification when AI is in scope, a NIS2 readiness pack where the entity qualifies — produced alongside the code by the people who wrote it. We are engineers, not lawyers: we build the evidence trail and the technical controls so that whatever your counsel confirms, the artefacts already exist and are accurate.
If you want the compliance layer scoped into a build rather than chased afterwards, the services overview covers what we do and a 30-minute discovery call is the place to run the triage against your actual product. The glossary has plain-language entries on GDPR, NIS2, the AI Act, MiCA and the terms around them.