General-purpose AI provider obligations
The EU AI Act treats general-purpose AI models as a distinct regulatory cluster. All GPAI providers face the Article 53 baseline (technical documentation, copyright policy, training-data summary). Models crossing the Article 51(2) 10^25 cumulative-FLOPs threshold are presumed to pose systemic risk and pick up the Article 55 add-on obligations. GPAI provisions have been enforceable since 2 August 2025; sanctions are imposed by the European Commission directly under Article 101.
What counts as a general-purpose AI model
Per Article 3(63), a general-purpose AI model is a model — including one trained with a large amount of data using self-supervision at scale — that displays significant generality and is capable of competently performing a wide range of distinct tasks regardless of the way the model is placed on the market, and that can be integrated into a variety of downstream systems or applications, except AI models that are used for research, development or prototyping activities before they are released on the market.
Frontier large language models, frontier vision-language models, and similarly broadly capable models fall within the GPAI category regardless of distribution mode (open-weights, hosted API, embedded in a product). Models released under a free and open-source licence have specific carve-outs under Article 53(2) for some — but not all — obligations.
Four obligations apply to every GPAI provider
Art. 53(1)(a)
Annex XI technical documentation
Draw up and keep up to date the technical documentation of the model, including its training and testing process and the results of its evaluation, containing at minimum the information set out in Annex XI — for provision to the AI Office and national competent authorities upon request.
Art. 53(1)(b)
Information for downstream providers
Draw up, keep up to date, and make available information and documentation to providers of AI systems that intend to integrate the GPAI model into their AI systems — covering capabilities, limitations, and the elements set out in Annex XII.
Art. 53(1)(c)
Copyright policy
Put in place a policy to comply with Union law on copyright and related rights, in particular to identify and comply — including through state-of-the-art technologies — with a reservation of rights expressed pursuant to Article 4(3) of Directive (EU) 2019/790.
Art. 53(1)(d)
Training-data summary
Draw up and make publicly available a sufficiently detailed summary about the content used for training of the general-purpose AI model, according to a template provided by the AI Office.
The Article 53(2) free-and-open-source carve-out exempts GPAI providers releasing models under such a licence from points (a) and (b) above — but the copyright policy (c) and training-data summary (d) obligations apply regardless of licence. Models presenting systemic risk pick up the Article 55 obligations regardless of distribution mode.
Systemic-risk overlay
When a GPAI model is classified as having systemic risk, an additional layer of obligations applies on top of Article 53:
Threshold
Art. 51(2): a GPAI model is presumed to have high-impact capabilities — and therefore systemic risk — when the cumulative amount of computation used for its training measured in floating-point operations is greater than 10^25.
Notification
Art. 52: providers shall notify the Commission without delay and in any event within two weeks after that requirement is met or it becomes known that it will be met. Notifications may include a substantiated argument that the model exceptionally does not present systemic risks.
Additional obligations
Art. 55: model evaluations including adversarial testing; assessment and mitigation of systemic risks at Union level; tracking, documenting, and reporting serious incidents; ensuring an adequate level of cybersecurity protection.
Codes of practice
Art. 56: the AI Office facilitates the drawing-up of codes of practice at Union level, in consultation with GPAI providers and other relevant stakeholders, to contribute to the proper application of the GPAI obligations.
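The Article 51(2) presumption can be sanity-checked with the widely used ≈ 6 × parameters × training-tokens estimate for dense transformer training compute. This is a community heuristic, not a measurement method defined by the Act, and the example model sizes below are hypothetical:

```python
# Rough check against the Article 51(2) presumption threshold.
# Uses the common ~6 * N * D FLOPs heuristic for dense transformer
# training -- an estimate, not a legally defined measurement method.

SYSTEMIC_RISK_THRESHOLD_FLOPS = 1e25  # Article 51(2)

def estimated_training_flops(params: float, tokens: float) -> float:
    """Approximate cumulative training compute: ~6 FLOPs per parameter per token."""
    return 6 * params * tokens

def presumed_systemic_risk(params: float, tokens: float) -> bool:
    """True when the estimate crosses the Article 51(2) presumption threshold."""
    return estimated_training_flops(params, tokens) > SYSTEMIC_RISK_THRESHOLD_FLOPS

# Hypothetical example: a 70B-parameter model trained on 15T tokens
flops = estimated_training_flops(70e9, 15e12)
print(f"{flops:.1e}")                       # ~6.3e+24 -- below the threshold
print(presumed_systemic_risk(70e9, 15e12))  # False
```

Note that Article 51 also lets the Commission designate models as systemic-risk on other criteria, so clearing the compute estimate alone does not settle classification.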
Penalty band — Art. 101(1)
Up to €15M or 3% of total worldwide annual turnover, whichever is higher
Imposed by the European Commission directly via the AI Office — distinct from the Article 99 enforcement chain that runs through Member State market-surveillance authorities. The Commission can take into account whether a GPAI provider has signed up to the Article 56 codes of practice.
Practical compliance
License Compliance Checker generates the structured audit evidence that supports each Article 53 sub-obligation: per-model training-data manifests for the Annex XI pack, rights-reservation signal detection (TDM opt-outs, robots.txt, ai.txt) for the Article 4(3) policy, and the public training-content summary in the AI Office template format.
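To illustrate what rights-reservation signal detection involves, here is a minimal sketch of a robots.txt check for full disallows aimed at AI crawlers. The crawler tokens and the helper function are illustrative assumptions, not LCC's actual API or detection list, and a real Article 4(3) policy would also cover ai.txt and other machine-readable reservations:

```python
# Minimal sketch: detect TDM opt-out signals in a robots.txt file.
# Crawler tokens below are illustrative examples, not an authoritative list.

AI_CRAWLER_TOKENS = {"gptbot", "ccbot", "google-extended", "anthropic-ai"}

def tdm_opt_out_signals(robots_txt: str) -> set:
    """Return AI-crawler user-agents that this robots.txt fully disallows."""
    signals = set()
    agents = []          # user-agents in the group currently being read
    collecting = True    # are we still reading User-agent lines for this group?
    for raw in robots_txt.splitlines():
        line = raw.split("#", 1)[0].strip()   # drop comments and whitespace
        if not line:
            continue
        field, _, value = line.partition(":")
        field, value = field.strip().lower(), value.strip().lower()
        if field == "user-agent":
            if not collecting:                # a new group starts here
                agents, collecting = [], True
            agents.append(value)
        else:
            collecting = False
            if field == "disallow" and value == "/":
                signals.update(a for a in agents if a in AI_CRAWLER_TOKENS)
    return signals

example = """
User-agent: GPTBot
Disallow: /

User-agent: *
Allow: /
"""
print(tdm_opt_out_signals(example))  # {'gptbot'}
```

Whether a robots.txt disallow constitutes a valid Article 4(3) reservation is a legal question; the sketch only shows the mechanical signal-collection step.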
```bash
pip install license-compliance-checker
```

When the AI Office is the audience
Tools surface evidence. Programmes hand it over.
GPAI obligations sit on top of an upstream supply chain — pre-trained weights, datasets, fine-tuners, downstream deployers. AskAjay's Liability Ledger framework maps that supply chain so the Article 53 evidence pack from LCC has the programme context to land in front of the AI Office.
Explore the Liability Ledger framework at AskAjay.ai →