§ AI Act TOPICAL

AI Act provider vs deployer obligations

Two roles, two regimes, one shared obligation: keep the system safe in the real world. When the deployer becomes a provider — and what changes.

Summary

Provider and deployer are the two principal roles in the AI Act. The provider develops the AI system and places it on the market; the deployer uses the system in operation. Article 16 sets the provider's obligations; Article 26 sets the deployer's. The two roles are complementary — neither alone makes the system safe.

The provider carries the heavy compliance lift: conformity assessment, technical documentation, data governance, instructions for use, post-market monitoring. The deployer's obligations are operational: use the system as intended, ensure human oversight, monitor and log operation, report serious incidents, inform workers, and (in specific cases) complete a fundamental rights impact assessment.

Article 25 is the boundary article. A deployer becomes a provider, with the full Article 16 burden, if it puts its name or trademark on a high-risk system already on the market, makes a substantial modification to a high-risk system such that it remains high-risk, or modifies the intended purpose of a system (including a general-purpose AI model) in a way that brings it into a high-risk category.

Who this applies to
AI vendors (typical providers), enterprises buying or operating AI (typical deployers), procurement and legal teams contracting both sides.
Compliance deadline
2 August 2026 — high-risk AI system obligations apply. The Digital Omnibus (Council + Parliament agreed positions, March 2026) may shift this to 2 December 2027 for Annex III systems and 2 August 2028 for Annex I products. Until the amending regulation is published in the Official Journal, plan for 2 August 2026.
§ Key articles

What the law says

Article 3(3)
Definition of provider — a person or body that develops an AI system (or has one developed) and places it on the market or puts it into service under its own name or trademark.
Article 3(4)
Definition of deployer — uses an AI system under its authority (except in personal non-professional capacity).
Article 16
Obligations of providers of high-risk AI systems.
Article 25
When a distributor, importer, deployer, or other third party becomes a provider.
Article 26
Obligations of deployers of high-risk AI systems.
Article 27
Fundamental rights impact assessment for deployers (public bodies and private entities providing public services, plus deployers of Annex III §5(b) and §5(c) systems).
Article 50
Transparency obligations — partially provider, partially deployer.
§ Detail

In depth

Provider obligations (Article 16)

The provider of a high-risk AI system:

  1. Ensures the system meets the Chapter III, Section 2 requirements: risk management (Art 9), data governance (Art 10), technical documentation (Art 11), logging capability (Art 12), transparency and instructions for use (Art 13), human-oversight design (Art 14), and accuracy, robustness, and cybersecurity (Art 15).
  2. Operates a quality management system (Art 17), runs the conformity assessment (Art 43), draws up the EU declaration of conformity, and affixes the CE marking.
  3. Keeps the technical documentation and logs under its control, registers the system in the EU database (Art 49(1)), and runs post-market monitoring.
  4. Takes corrective action, including withdrawal or recall, when the system is non-compliant or presents a risk.

Deployer obligations (Article 26)

The deployer:

  1. Uses the system in accordance with the provider's instructions for use.
  2. Assigns human oversight to persons with the necessary competence, training, and authority (Art 26(2)).
  3. Ensures input data under its control is relevant and sufficiently representative for the intended purpose (Art 26(4)).
  4. Monitors operation, keeps the automatically generated logs for at least six months (Art 26(6)), and suspends use and informs the provider when it has reason to consider the system presents a risk.
  5. Reports serious incidents, informs workers and their representatives before workplace use (Art 26(7)), and completes a fundamental rights impact assessment where Article 27 applies.

Article 27 — who must do a FRIA

The FRIA is required of deployers that are public bodies or private entities providing public services, and of deployers of Annex III §5(b) and §5(c) systems. It covers the deployer's processes in which the system will be used in line with its intended purpose, the period and frequency of use, the categories of natural persons affected, the foreseeable risks of harm to fundamental rights (including non-discrimination and bias risks), the human-oversight measures, and the mitigations if those risks materialise. The results are notified to the market surveillance authority before first use.
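As a procedural aid, the required elements can be tracked as a completeness checklist before the notification step. A minimal sketch — the field names are illustrative, not taken from the Regulation, and this is a process aid rather than legal advice:

```python
from dataclasses import dataclass, fields

@dataclass
class FriaRecord:
    """Elements an Article 27 assessment must cover (illustrative names)."""
    intended_purpose: str = ""            # processes and intended purpose
    affected_person_categories: str = ""  # categories of natural persons
    fundamental_rights_impact: str = ""   # foreseeable impact incl. non-discrimination
    bias_risks: str = ""                  # identified bias risks
    human_oversight_design: str = ""      # oversight measures in place
    mitigations: str = ""                 # measures if risks materialise

def missing_elements(record: FriaRecord) -> list[str]:
    """Names of elements still empty; must be [] before first use."""
    return [f.name for f in fields(record)
            if not getattr(record, f.name).strip()]

# A draft with only the purpose filled in still has five open elements.
draft = FriaRecord(intended_purpose="CV screening for hiring")
print(missing_elements(draft))
```

The check is deliberately shallow — it verifies presence, not adequacy; the substantive quality of each element remains a human judgment.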

Article 25 — when the deployer becomes a provider

Article 25(1) captures three situations where a downstream actor inherits the full Article 16 provider regime:

  1. The actor puts its name or trademark on a high-risk system already placed on the market or put into service.
  2. The actor makes a substantial modification to a high-risk system already on the market in such a way that it remains high-risk.
  3. The actor modifies the intended purpose of an AI system not classified as high-risk (including a general-purpose AI system) in a way that brings the resulting system into a high-risk category. This is the trigger that catches an enterprise integrating a GPAI model into a high-risk use the GPAI provider never intended.

In each case, the original provider is no longer considered the provider of that specific system under Article 25(2), but it must cooperate closely with the new provider and make available the information and technical access needed for compliance. Practically, an enterprise that fine-tunes a GPAI model and deploys it as a high-risk hiring tool becomes the provider of that derived system.
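The trigger conditions read as a simple disjunction, which procurement teams sometimes encode in intake questionnaires. A minimal sketch — the field names are assumptions for illustration, and a real assessment needs legal review:

```python
from dataclasses import dataclass

@dataclass
class DownstreamAction:
    """What a distributor, importer, deployer or other third party did
    with an AI system already on the market (illustrative fields)."""
    rebrands_high_risk_system: bool        # Art 25(1)(a): own name/trademark
    substantial_mod_stays_high_risk: bool  # Art 25(1)(b): modified, still high-risk
    repurposed_into_high_risk: bool        # Art 25(1)(c): new high-risk purpose

def becomes_provider(action: DownstreamAction) -> bool:
    """True if any trigger fires and the actor inherits Article 16."""
    return (action.rebrands_high_risk_system
            or action.substantial_mod_stays_high_risk
            or action.repurposed_into_high_risk)

# An enterprise fine-tunes a GPAI model into a hiring tool:
# hiring is an Annex III high-risk use, so trigger (c) fires.
fine_tuner = DownstreamAction(
    rebrands_high_risk_system=False,
    substantial_mod_stays_high_risk=False,
    repurposed_into_high_risk=True,
)
print(becomes_provider(fine_tuner))  # True
```

The point of encoding it this way is that any single "yes" answer is enough; there is no weighing or balancing across the three triggers.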

Where the responsibility transfers in practice

Topic | Provider | Deployer
Conformity assessment | Yes (Art 43). | No (unless it becomes a provider under Art 25).
Technical documentation | Yes (Art 11). | No.
Risk management lifecycle | Yes, system-level (Art 9). | Operational risk in the deployment context.
Data governance | Yes, training and validation data (Art 10). | Yes, input data within the deployer's control (Art 26(4)).
Human oversight | Designs for it (Art 14). | Operates it (Art 26(2)).
Logging | Designs the logging (Art 12). | Keeps logs at least six months (Art 26(6)).
Worker information | No. | Yes, before deployment (Art 26(7)).
FRIA | No. | Yes, for public bodies and Annex III §5(b)/§5(c) deployers (Art 27).
Serious-incident reporting | To the market surveillance authority. | To the provider, then the market surveillance authority.
EU database registration | System registration (Art 49(1)). | Annex III deployer registration where applicable (Art 49(3)–(4)).

Procurement implications

Procurement contracts should explicitly allocate the risk of becoming a provider under Article 25. A contract that lets the deployer fine-tune the model on its own data, repoint the system to new use cases, or rebrand the system can shift Article 16 burden onto the deployer. The conservative procurement clause keeps the original provider in the provider role for the contract scope and ring-fences any deployer modifications.

§ Action items

Practical steps

01
Map every AI relationship to a clear provider/deployer role before procurement; document in the contract.
02
Build deployer-side compliance against Article 26 — instructions follow-through, oversight roles, logging, worker information.
03
For Annex III §5(b), §5(c), and public-body deployments: complete the Article 27 FRIA before first use.
04
In procurement contracts, explicitly disclaim Article 25 provider-trigger conditions (no rebranding, no substantial modification, no out-of-scope repointing).
05
Brief in-house teams that fine-tuning a GPAI model into a high-risk system makes them a provider under Article 25(1)(c).
§ What Fontvera found

Documents in our corpus

ai_office EU Fetched 2026-04
eiopa EU Fetched 2026-04
Opinion on Artificial Intelligence governance and risk management
eurlex EU Fetched 2026-04
EUR-Lex: 32025R0454 (2025-03-07)
§ Cross-references

Related Fontvera intelligence

AI Act enforcement
Most remaining AI Act provisions begin to apply on 2026-08-02.