AI Act FAQ

AI in Law Enforcement: Frequently Asked Questions

Five questions answered with specific EU AI Act article references, ahead of the August 2, 2026 enforcement deadline.


Can police use AI?

Yes, but with strict limitations. AI for risk assessment, polygraph alternatives, evidence evaluation, and crime prediction is classified as high-risk under Annex III category 6.

What about predictive policing?

AI systems for individual risk assessment (predicting whether a particular person will commit an offence) are high-risk (Annex III 6(a)). Where such predictions rest solely on profiling or on assessing personality traits, they are prohibited outright (Article 5(1)(d)). Broad area-based crime prediction tools are not explicitly listed in Annex III.

Is real-time facial recognition allowed for police?

Generally prohibited in publicly accessible spaces (Article 5(1)(h)). Three narrow exceptions: searching for missing persons or victims of abduction and trafficking, preventing an imminent terrorist attack or threat to life, and locating suspects of serious crimes listed in Annex II. Each use requires prior authorisation by a judicial authority or an independent administrative authority (Article 5(3)).

What oversight is required?

Human oversight (Article 14), a Fundamental Rights Impact Assessment before deployment (Article 27), mandatory registration in the EU database (Article 49), and serious incident reporting (Article 73).

Can courts use AI for sentencing?

AI in the administration of justice is high-risk (Annex III category 8). AI cannot make autonomous sentencing decisions; it may assist with legal research and analysis, but final decisions must rest with human judges.
