AI-generated or manipulated image, audio, or video content that appreciably resembles existing persons, objects, places, or events and would falsely appear to be authentic (Article 3(60)).
Five questions answered with specific EU AI Act article references, ahead of the August 2, 2026 enforcement deadline.
Yes. Deployers must disclose that the content has been artificially generated or manipulated (Article 50(4)). The disclosure must be provided in a clear and distinguishable manner at the latest at the time of first interaction or exposure (Article 50(5)).
Partially. Where the content forms part of an 'evidently artistic, creative, satirical, fictional or analogous work or programme,' the obligation is limited to disclosing the existence of the generated or manipulated content in a manner that does not hamper the display or enjoyment of the work (Article 50(4)).
AI systems designed to detect deepfakes are not themselves subject to the deepfake disclosure obligations; instead, they are classified according to their own intended purpose and use case.
Failure to label deepfakes violates Article 50(4), which carries fines of up to EUR 15 million or 3% of total worldwide annual turnover, whichever is higher, under Article 99(4).