# Obligations Overview

A concise overview of which operators must comply with which obligations under the EU AI Act, and under which articles.

## High-Risk AI Systems

| Obligation | Who Must Comply | Articles | Deadline |
| --- | --- | --- | --- |
| Register the high-risk AI system in the EU database | Provider | Art. 49 | Before placing on the market |
| Conduct conformity assessment and draw up the EU declaration of conformity | Provider | Art. 43, 47 | Before placing on the market |
| Establish a quality management system (QMS) | Provider | Art. 17 | Before placing on the market |
| Create and maintain technical documentation | Provider | Art. 11 | Ongoing |
| Implement a post-market monitoring system | Provider | Art. 72 | After placing on the market |
| Conduct a fundamental rights impact assessment (FRIA) | Deployer (public bodies and certain private operators) | Art. 27 | Before deployment |
| Implement human oversight measures | Deployer | Art. 26 | During deployment |
| Inform workers and their representatives about AI use | Deployer | Art. 26 | Before deployment |
| Notify serious incidents to market surveillance authorities | Provider | Art. 73 | Within 15 days of becoming aware |
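The 15-day incident-notification window in the table above can be tracked programmatically. A minimal sketch, assuming the clock starts when the provider becomes aware of the incident; the function name and default window are illustrative, not part of any official tooling:

```python
from datetime import date, timedelta

def incident_report_deadline(awareness: date, max_days: int = 15) -> date:
    """Latest date by which a serious incident must be reported,
    assuming a 15-day window starting from the day the provider
    becomes aware of the incident (hypothetical helper)."""
    return awareness + timedelta(days=max_days)
```

For example, awareness on 1 March 2025 gives a reporting deadline of 16 March 2025.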

## Transparency Obligations

| Obligation | Who Must Comply | Articles | Deadline |
| --- | --- | --- | --- |
| Inform users that they are interacting with an AI system | Provider / Deployer | Art. 50 | At first interaction |
| Mark AI-generated synthetic content (watermarking/labelling) | Provider | Art. 50 | Ongoing |
| Disclose deepfakes to affected persons | Deployer | Art. 50 | At time of use |
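One simple way to attach a machine-readable "AI-generated" marker to synthetic content is to wrap it in labelled metadata. A minimal sketch, where the JSON schema and field names are assumptions for illustration, not a labelling format prescribed by the Act:

```python
import json

def label_synthetic_content(payload: str, generator: str) -> str:
    """Wrap generated content in a machine-readable record marking it
    as AI-generated. Schema and field names are illustrative only."""
    return json.dumps({
        "content": payload,
        "ai_generated": True,
        "generator": generator,
    })
```

In practice, provenance standards such as C2PA embed comparable markers directly in media files rather than in a sidecar record.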

## General-Purpose AI Models

| Obligation | Who Must Comply | Articles | Deadline |
| --- | --- | --- | --- |
| Maintain technical documentation (Annex XI) | GPAI provider | Art. 53 | Ongoing |
| Publish a summary of training data | GPAI provider | Art. 53 | Before making the model available |
| Implement a copyright compliance policy | GPAI provider | Art. 53 | Ongoing |
| Conduct adversarial testing (red-teaming) for systemic risks | GPAI provider (systemic risk) | Art. 55 | Before and after release |
| Report serious incidents to the AI Office | GPAI provider (systemic risk) | Art. 55 | Without undue delay |
| Ensure adequate cybersecurity for the model and infrastructure | GPAI provider (systemic risk) | Art. 55 | Ongoing |
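The who-must-comply mapping across all three tables can be mirrored in a simple lookup, useful for building a compliance checklist. A minimal sketch, where the role keys and obligation strings are an illustrative taxonomy of the rows above, not terms defined by the Act:

```python
# Hypothetical data model: role names and obligation strings paraphrase
# the tables above; this is not an official taxonomy.
OBLIGATIONS = {
    "provider": [
        "Register high-risk AI system in EU database",
        "Conduct conformity assessment",
        "Establish quality management system",
        "Maintain technical documentation",
        "Implement post-market monitoring",
        "Notify serious incidents within 15 days",
    ],
    "deployer": [
        "Conduct fundamental rights impact assessment (where required)",
        "Implement human oversight measures",
        "Inform workers about AI use",
        "Disclose deepfakes",
    ],
    "gpai_provider": [
        "Maintain technical documentation (Annex XI)",
        "Publish a summary of training data",
        "Implement copyright compliance policy",
    ],
}

def obligations_for(role: str) -> list[str]:
    """Return the obligations applicable to a role; empty if unknown."""
    return OBLIGATIONS.get(role, [])
```

For example, `obligations_for("gpai_provider")` returns the three baseline GPAI obligations, while an unrecognised role returns an empty list.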