AI Act Article 17 is the legal basis for the quality management system (QMS) that every provider of a high-risk AI system must run. The article itself sets no standalone calendar deadline, but providers placing high-risk systems on the EU market under the Article 6 high-risk classification regime will need their QMS evidence ready at conformity assessment. The obligated entity is the provider, not the deployer, importer or distributor.
What Article 17 requires
"Providers of high-risk AI systems shall put a quality management system in place that ensures compliance with this Regulation." The QMS is not optional, and it is not a one-off artefact: it is a running governance system that ties together design, development, testing, post-market monitoring and modifications.
Obligation breakdown
Establish the QMS
The lead obligation is to establish a QMS that ensures compliance with the Regulation. Compliance is the legal yardstick: a QMS that exists on paper but does not actually deliver compliant systems is not Article 17-compliant.
Document it systematically
"The quality management system shall be documented in a systematic and orderly manner in the form of written policies, procedures and instructions." Oral practice and tribal knowledge do not satisfy this obligation: written artefacts are required.
Cover regulatory compliance and modifications
"The quality management system shall include a strategy for regulatory compliance, including compliance with conformity assessment procedures and procedures for the management of modifications to the high-risk AI system." Two pieces of work sit here: how the provider proves conformity at first placement, and how it decides whether a model update or pipeline change retriggers conformity assessment.
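The modification-management piece is, in substance, a triage procedure: for each proposed change, decide whether conformity assessment is retriggered. A minimal sketch of such a procedure follows; the field names and the triage rule are illustrative assumptions, not text from the Regulation, and a real QMS would ground the rule in the provider's documented criteria for substantial modification:

```python
from dataclasses import dataclass

@dataclass
class Modification:
    """A proposed change to a high-risk AI system (illustrative model)."""
    description: str
    changes_intended_purpose: bool     # e.g. a new use case for the system
    affects_compliance: bool           # e.g. retraining that shifts accuracy or robustness
    pre_determined_in_tech_docs: bool  # change already foreseen in the technical documentation

def retriggers_conformity_assessment(mod: Modification) -> bool:
    """Hypothetical triage rule: an unplanned change that touches the intended
    purpose or the system's compliance posture retriggers assessment."""
    if mod.pre_determined_in_tech_docs:
        return False
    return mod.changes_intended_purpose or mod.affects_compliance

# Example: an unplanned retraining that shifts performance characteristics
update = Modification(
    description="Quarterly retraining on new data",
    changes_intended_purpose=False,
    affects_compliance=True,
    pre_determined_in_tech_docs=False,
)
print(retriggers_conformity_assessment(update))  # True
```

The design point is that the QMS must record the decision rule in advance, so that each model update or pipeline change produces a documented yes/no outcome rather than an ad hoc judgment.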
Design control
"The quality management system shall include techniques, procedures and systematic actions to be used for the design, design control and design verification of the high-risk AI system." This is the design-side discipline: how requirements are translated into a system, how design choices are reviewed, and how the design is verified against those requirements before development concludes.
Development quality assurance
Separately, the QMS must include "techniques, procedures and systematic actions to be used for the development, quality control and quality assurance of the high-risk AI system." Article 17 distinguishes design-side discipline from development-side discipline; both are required.
Examination, test and validation
"The quality management system shall include examination, test and validation procedures to be carried out before, during and after the development of the high-risk AI system, and the frequency with which they have to be carried out." The frequency is itself part of the obligation: the provider commits to a cadence, not just to a one-time test.
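That cadence commitment can be captured as a simple schedule record in the QMS. The procedure names and frequencies below are illustrative assumptions, not requirements from the article; what matters is that every phase Article 17 names has at least one procedure with a stated frequency:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ValidationProcedure:
    """One examination, test or validation procedure with its committed cadence."""
    name: str
    phase: str      # "pre-development", "during-development" or "post-development"
    frequency: str  # the cadence the provider commits to

# Illustrative QMS test plan: one procedure per phase, each with a frequency
qms_test_plan = [
    ValidationProcedure("requirements review", "pre-development", "per release"),
    ValidationProcedure("bias and robustness testing", "during-development", "per training run"),
    ValidationProcedure("post-market performance validation", "post-development", "quarterly"),
]

# Sanity checks: all three phases are covered, and every entry has a cadence
phases = {p.phase for p in qms_test_plan}
assert phases == {"pre-development", "during-development", "post-development"}
assert all(p.frequency for p in qms_test_plan)
```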
Technical specifications and standards
Finally, the QMS must include "technical specifications, including standards, to be applied and, where relevant harmonised standards are not applied in full or do not cover all requirements, the means to ensure compliance." Where a provider relies on harmonised standards, the QMS records that. Where it does not — or where standards do not cover the specific requirement — the QMS records the alternative route to compliance.
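In practice the standards clause reduces to a register mapping each requirement to either a harmonised standard applied in full or a documented alternative route to compliance, with no requirement left uncovered. The requirement names and standard references below are placeholders, not an official list:

```python
# Hypothetical requirement-to-standard register. Each requirement maps either
# to a harmonised standard applied in full, or to an "alternative means" note.
standards_register = {
    "risk management": {
        "harmonised_standard": "EN standard X (placeholder)",
        "alternative_means": None,
    },
    "data governance": {
        "harmonised_standard": None,
        "alternative_means": "internal data-quality procedure DQ-01 (placeholder)",
    },
}

def gaps(register: dict) -> list[str]:
    """Requirements covered by neither a standard nor an alternative route."""
    return [req for req, entry in register.items()
            if not entry["harmonised_standard"] and not entry["alternative_means"]]

print(gaps(standards_register))  # []
```

An empty gap list is the property the QMS has to evidence: every requirement has a named route to compliance, whether standard-based or not.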
What this means in practice
Providers cannot lift QMS templates from other regulated industries (medical devices, automotive) without mapping each clause back to AI Act Article 17. The article's design / development / test split is structurally specific enough that conformity assessment bodies will audit against it as a list, not as a generic ISO-style QMS. Legal teams advising providers should expect the QMS to be the document set surfaced first in any market-surveillance request after the system is on the market.
Related Fontvera pages
- AI Act for fintech and credit scoring — sector application of the Article 17 QMS for providers of credit-scoring systems.
- AI Act conformity assessment — Article 17 feeds directly into the conformity assessment procedure providers must complete before placing the system on the market.
- AI Act provider vs deployer obligations — Article 17 sits squarely on the provider side of the divide.