§ AI Act · Article 27
Fundamental Rights Impact Assessment for High-Risk AI Systems
Regulation (EU) 2024/1689
Plain-language summary
Deployers of high-risk AI systems that are public bodies, or private entities providing public services, must carry out a Fundamental Rights Impact Assessment before putting such a system into use. The assessment describes the intended use, the categories of individuals affected, the potential risks of harm, the human oversight measures, and the steps to be taken if those risks materialize. It must be performed before first use and updated when relevant elements change. The deployer must notify the market surveillance authority of the results, unless an exemption applies. The AI Office will provide a template questionnaire to standardize this process.
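The elements the assessment must cover can be sketched as a simple record structure. This is a hypothetical illustration only, not the official AI Office template; the class and field names are assumptions drawn from the summary above.

```python
from dataclasses import dataclass


@dataclass
class FundamentalRightsImpactAssessment:
    """Hypothetical record of the elements a deployer must describe under
    Article 27. The AI Office template, once published, is authoritative."""
    intended_use: str                # processes in which the system will be used
    affected_persons: list[str]     # categories of individuals affected
    risks_of_harm: list[str]        # potential risks to those categories
    oversight_measures: list[str]   # human oversight arrangements
    mitigation_steps: list[str]     # steps to take if the risks materialize

    def is_complete(self) -> bool:
        """True only if every element is filled in, as required before first use."""
        return all([
            self.intended_use,
            self.affected_persons,
            self.risks_of_harm,
            self.oversight_measures,
            self.mitigation_steps,
        ])
```

An assessment with any element left empty would fail the `is_complete` check, mirroring the requirement that all elements be addressed before deployment.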
§ What Fontvera found
Documents that cite Article 27
ai_office · EU · Fetched 2026-04
§ Cross-references
Related articles
AI Act enforcement: 97 days until 2026-08-02, when most AI Act provisions begin to apply.