Intelligence Briefing: Authorized Representative Under the EU AI Act – Requirements for Non-EU Providers
1. What the Regulation Requires and Who It Applies To
Under the EU AI Act (Regulation (EU) 2024/1689), non-EU providers must appoint an Authorized Representative (AR) established in the Union before placing high-risk AI systems (Article 22) or general-purpose AI (GPAI) models (Article 54) on the EU market. The AR acts as the legal contact point for EU authorities and helps ensure compliance with the Act’s obligations, including risk management, transparency, and post-market monitoring (Articles 8–15, 50, 53).

Key obligations for non-EU providers:
- High-risk AI systems (Articles 8–15): Must comply with strict requirements, including risk management, data governance, technical documentation, and post-market monitoring. The AR ensures these are met.
- General-Purpose AI (GPAI) models (Articles 53–55): Providers must meet documentation and transparency obligations (Article 53) and, for models classified as posing systemic risk, the additional obligations of Article 55. The AR appointed under Article 54 verifies and retains this documentation.
- Transparency obligations (Article 50): The AR must facilitate compliance with labeling, user information, and disclosure requirements.
2. Enforcement Precedents
As of the compliance deadlines outlined in the AI Act Implementation Timeline, enforcement actions for non-compliance with AR requirements are not yet documented in the provided sources. Enforcement will follow the staggered compliance deadlines (August 2025 for GPAI models, August 2026 for most high-risk systems). Until then, non-EU providers should prepare for potential audits by national authorities.

Note: The provided enforcement cases (e.g., GDPR fines in France, Germany, and Luxembourg) concern the GDPR, not the AI Act, and do not apply to AR requirements.
3. Practical Compliance Steps
Non-EU providers should take the following steps to ensure compliance with the AR requirement:
- Appoint an AR before market placement: The AR must be established in an EU Member State and hold a written mandate to act on behalf of the provider (Article 22(3)).
- Ensure the AR has access to technical documentation: The AR must review risk assessments, data governance policies, and post-market monitoring plans for high-risk AI systems (Articles 9–15).
- Keep GPAI documentation available via the AR: The AR must verify that the technical documentation required by Article 53 has been drawn up and keep it at the disposal of the AI Office (Article 54(3)).
- Implement transparency measures: The AR must ensure labeling, user information, and disclosure requirements are met (Article 50).
- Monitor regulatory updates: The AI Act Implementation Timeline outlines key deadlines (e.g., August 2025 for GPAI models, August 2026 for high-risk systems), requiring providers to align compliance efforts accordingly.
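The steps above can be tracked internally as a simple checklist against the two headline deadlines. The sketch below is a minimal, hypothetical Python helper (the class name, step labels, and structure are illustrative assumptions, not part of any official tooling); only the two dates are taken from the implementation timeline cited above.

```python
from dataclasses import dataclass, field
from datetime import date

# Headline compliance dates from the AI Act implementation timeline.
DEADLINES = {
    "gpai_obligations": date(2025, 8, 2),       # GPAI model obligations apply
    "high_risk_obligations": date(2026, 8, 2),  # most high-risk system obligations apply
}

@dataclass
class ComplianceChecklist:
    """Illustrative tracker for the AR-related steps listed above (hypothetical helper)."""
    steps: dict = field(default_factory=lambda: {
        "appoint_ar": False,                # AR appointed before market placement
        "ar_has_technical_docs": False,     # AR has access to technical documentation
        "gpai_documentation_ready": False,  # Article 53 documentation available via the AR
        "transparency_measures": False,     # Article 50 labeling/disclosure in place
    })

    def complete(self, step: str) -> None:
        """Mark a step as done; reject unknown step names."""
        if step not in self.steps:
            raise KeyError(f"unknown step: {step}")
        self.steps[step] = True

    def outstanding(self) -> list:
        """Return the steps not yet completed, in checklist order."""
        return [s for s, done in self.steps.items() if not done]

    def days_until(self, milestone: str, today: date) -> int:
        """Days remaining until a named deadline (negative if already passed)."""
        return (DEADLINES[milestone] - today).days
```

A provider's compliance team could mark steps off as they are completed (e.g., `checklist.complete("appoint_ar")`) and periodically review `outstanding()` against `days_until(...)` for each milestone.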
4. Cross-Border Differences
While the EU AI Act is directly applicable across all Member States, enforcement may vary due to:
- National authority interpretations: Some Member States may prioritize certain high-risk AI categories (e.g., biometric systems) over others.
- GPAI compliance variations: Providers of GPAI models must monitor systemic risk obligations, which may be enforced differently by national regulators.
- AR selection criteria: Some Member States may impose additional requirements on ARs (e.g., local establishment or industry-specific expertise).
Conclusion: Non-EU providers must appoint an AR to meet the EU AI Act’s obligations for high-risk AI systems and GPAI models. While no AR-specific enforcement actions have yet been documented, providers should act now to meet the August 2025 (GPAI) and August 2026 (high-risk) deadlines.