§ AI Act · NIS2 · CER SECTOR

AI Act for critical infrastructure

AI used in energy, transport, water, and digital infrastructure is high-risk under the AI Act; NIS2 and the CER Directive sit on top.

Summary

Annex III §2 captures AI used as a safety component in the management and operation of critical digital infrastructure, road traffic, and the supply of water, gas, heating, and electricity. The framing is intentionally narrow: only AI that performs a safety function is in scope under §2 — a billing optimisation AI for a utility is not. But the line is fact-specific: a load-balancing AI on the electricity grid is a safety component; an outage-prediction model used for crew dispatch is borderline.
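The fact-specific scope line above can be sketched as a rough triage heuristic. This is an illustration, not legal advice: the sector set, the `is_safety_component` flag, and all function names are invented for this sketch, and the real test turns on the facts of each deployment.

```python
# Rough triage heuristic for Annex III §2 scope (illustrative only).
# All names here are invented for this sketch; the real classification
# is fact-specific and should be confirmed with counsel.

ANNEX_III_2_SECTORS = {
    "critical digital infrastructure",
    "road traffic",
    "water supply",
    "gas supply",
    "heating supply",
    "electricity supply",
}

def annex_iii_2_triage(sector: str, is_safety_component: bool) -> str:
    """Classify a deployed AI system against Annex III §2 (rough heuristic)."""
    if sector not in ANNEX_III_2_SECTORS:
        return "out of scope of §2"
    if not is_safety_component:
        return "in a §2 sector, but not a safety component (e.g. billing optimisation)"
    return "likely high-risk under Annex III §2 (confirm with counsel)"

print(annex_iii_2_triage("electricity supply", True))   # load-balancing AI on the grid
print(annex_iii_2_triage("electricity supply", False))  # billing optimisation
```

The borderline cases in the text (outage prediction used for crew dispatch) are exactly where a boolean flag like this breaks down, which is the point of keeping the classification decision documented per system.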

For most essential entities under NIS2, the AI Act is layered on top of the NIS2 Article 21 cybersecurity risk-management measures and the CER Directive's physical-resilience obligations. Where they overlap, all three regimes apply simultaneously. The AI Act's serious-incident reporting (Article 73) converges in time and content with NIS2 Article 23, but the legal regimes are independent: incidents must be reported through both channels until the Commission issues converged guidance.

Conformity assessment route: most §2 systems take the internal Annex VI route, but where the AI is embedded in equipment governed by a sector regulation (Machinery Regulation (EU) 2023/1230, Railway Interoperability Directive), the existing third-party assessment under that sector regime absorbs the AI Act conformity check (Article 43(3)).

Who this applies to
Essential and important entities under NIS2 (energy, transport, water, digital infrastructure), AI vendors supplying safety-related systems, national NIS2 competent authorities, the European AI Office.
Compliance deadline
2 August 2026 — high-risk AI system obligations apply. The Digital Omnibus (Council + Parliament agreed positions, March 2026) may shift this to 2 December 2027 for Annex III systems and 2 August 2028 for Annex I products. Until the amending regulation is published in the Official Journal, plan for 2 August 2026.
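The deadline arithmetic above is simple to script. The dates below are taken from this briefing; the possible Omnibus shift is modelled as an optional override, and nothing in this sketch should be read as confirming that shift.

```python
from datetime import date

# Planning dates from the briefing text. The Omnibus dates are provisional
# and apply only if the amending regulation is published in the OJ.
BASELINE_DEADLINE = date(2026, 8, 2)
OMNIBUS_ANNEX_III = date(2027, 12, 2)
OMNIBUS_ANNEX_I = date(2028, 8, 2)

def planning_deadline(omnibus_in_force: bool, annex: str = "III") -> date:
    """Return the date to plan for: the baseline unless the Omnibus is in force."""
    if not omnibus_in_force:
        return BASELINE_DEADLINE
    return OMNIBUS_ANNEX_III if annex == "III" else OMNIBUS_ANNEX_I

def days_remaining(today: date, deadline: date) -> int:
    return (deadline - today).days

# e.g. reviewed on 2026-04-27, the baseline deadline is 97 days out
print(days_remaining(date(2026, 4, 27), planning_deadline(False)))  # → 97
```

Until the amending regulation is in the Official Journal, `omnibus_in_force` stays `False` and 2 August 2026 is the date to plan against.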
§ Key articles

What the law says

Annex III §2
AI as a safety component in the management and operation of critical digital infrastructure, road traffic, supply of water, gas, heating, and electricity.
Article 6(1)
AI as a safety component of a regulated product — covers grid-management AI under existing electricity and machinery directives.
Article 9
Risk management — must integrate with the NIS2 Article 21 cybersecurity risk-management measures.
Article 14
Human oversight — operators must be able to override AI control of physical infrastructure.
Article 73
Serious incident reporting — convergent with NIS2 Article 23 incident reporting and CER Article 15 disruption reporting.
NIS2 Article 21
Cybersecurity risk-management measures — directly applicable to AI systems used by essential entities.
CER Article 13
Resilience-enhancing measures for designated critical entities.
§ Detail

In depth

What "critical infrastructure" means under §2

Annex III §2 captures three sub-categories of safety-related AI:

safety components in the management and operation of critical digital infrastructure;
safety components in road-traffic management and operation;
safety components in the supply of water, gas, heating, and electricity.

Provider obligations

Operator (deployer) obligations

Where AI Act, NIS2, and CER overlap

The three regimes do not conflict in substance, but the operator is subject to all three simultaneously. In practice, NIS2 Article 21 governs the cybersecurity of the AI system, the AI Act governs its design, human oversight, and incident reporting, and CER governs the physical resilience of the asset it controls.

Enforcement landscape

Sector regulators are likely to be the AI Act market surveillance authority — for example BNetzA (Germany) and CRE (France) for energy; ART (France) and ANSF (Italy) for rail; ACM in the Netherlands. NIS2 competent authorities (BSI, ANSSI, NCSC-NL) will coordinate. Cross-border critical infrastructure cases will involve the European AI Office and ENISA.

§ Action items

Practical steps

01
Inventory all AI in production control loops; classify each as §2 safety-related, §2 non-safety, or out of scope.
02
For embedded AI in regulated products: confirm with the existing notified body that the integrated AI Act + sectoral conformity assessment is in scope of their designation.
03
Build a unified incident-reporting playbook covering AI Act Article 73, NIS2 Article 23, and (where applicable) CER Article 15 — with the 24h NIS2 clock as the master.
04
Align the NIS2 Article 21(2) cybersecurity controls with the AI Act Article 15 accuracy, robustness, and cybersecurity measures; document the mapping in the technical file.
05
Run a tabletop exercise on AI failure during a peak-demand event; verify the Article 14 human-override path actually works.
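The triple-reporting playbook in step 03 reduces to a deadline table computed from one detection timestamp. A minimal sketch, assuming the NIS2 Article 23 cadence (24-hour early warning, 72-hour incident notification, final report one month after the notification) and treating the AI Act Article 73 window as a parameter, since the applicable window depends on the incident type; the CER Article 15 deadline is left to national transposition.

```python
from datetime import datetime, timedelta

def reporting_deadlines(detected: datetime, ai_act_days: int = 15) -> dict:
    """Deadline table for one incident, keyed by reporting channel.

    NIS2 Article 23 cadence: 24h early warning, 72h incident notification,
    final report one month after the notification (approximated as 30 days).
    The AI Act Article 73 window varies by incident type, so it is passed
    in; 15 days is the general case. CER Article 15 has no fixed offset
    here because national transposition sets the details.
    """
    notification = detected + timedelta(hours=72)
    return {
        "nis2_early_warning": detected + timedelta(hours=24),
        "nis2_notification": notification,
        "nis2_final_report": notification + timedelta(days=30),
        "ai_act_art_73": detected + timedelta(days=ai_act_days),
        "cer_art_15": "per national transposition",
    }

clock = reporting_deadlines(datetime(2026, 9, 1, 14, 30))
for channel, due in clock.items():
    print(channel, due)
```

Keeping the 24-hour NIS2 early warning as the master clock, as step 03 suggests, means every other deadline in the table is computed after the hardest one has already been met.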
§ What Fontvera found

Documents in our corpus

eiopa EU Fetched 2026-04
Opinion on Artificial Intelligence governance and risk management
eurlex EU Fetched 2026-04
EUR-Lex: 32025R0454 (2025-03-07)
§ Cross-references

Related Fontvera intelligence
