Copyright and AI: The Government Steps Back from a Training Exception

Copyright and AI policy in the UK reached a turning point on 18 March 2026, when the government published its Report on Copyright and Artificial Intelligence, together with an economic impact assessment. Both were required by sections 135 to 137 of the Data (Use and Access) Act 2025. The report confirms what the UK’s creative industries had lobbied for: the government’s previously preferred option, a broad copyright exception for AI training with a rights reservation opt-out, is no longer on the table.

Faced with balancing the interests of an established domestic creative sector worth GBP 124 billion a year against those of a nascent and predominantly US-based AI industry, the government has stepped back from what Sir Humphrey would call a “brave decision”.

The report also recommends removing the longstanding protection for computer-generated works under section 9(3) of the Copyright, Designs and Patents Act 1988. No legislation has been introduced, and the report is non-binding. But it narrows the field of possible outcomes and raises practical questions for any organisation developing or deploying AI systems that process copyrighted material.

Regulatory background

The report fulfils a statutory obligation imposed during the passage of the DUAA. Section 135 required the Secretary of State to publish an economic impact assessment of the options consulted on. Section 136 required a report on the use of copyright works in AI development, covering technical standards, data access, text and data mining, transparency, licensing and enforcement. Section 137 imposed a deadline: both had to be laid before Parliament by 18 March 2026.

The copyright and AI consultation ran from December 2024 to February 2025 and received over 11,500 responses. Four options were assessed. Option 0: no legislative change. Option 1: strengthen copyright to require licences in all cases. Option 2: a broad text and data mining exception for AI training, with no restrictions. Option 3: a data mining exception with a rights reservation opt-out, modelled on the EU Digital Single Market Directive. Option 3 was the government’s originally preferred approach. Of those who responded via the government’s online survey, 88% supported strengthening copyright (Option 1) and only 3% supported a broad exception (Option 2).

The House of Lords Communications and Digital Committee published its own report on 6 March 2026, recommending a licensing-first approach and calling on the government to rule out the opt-out model.

Analysis: where copyright and AI policy stands

The report’s central conclusion on copyright and AI is that the government no longer has a preferred option. The broad exception with opt-out (Option 3) has been abandoned. But the government has not endorsed mandatory licensing (Option 1) either. Instead, it says it will discuss other approaches with stakeholders, including a focused exception targeted at specific types of use or application, which was the most popular alternative proposal in consultation responses.

The impact assessment does not make a decisive economic case for any option. It sets out the respective contributions of the creative industries (GBP 124 billion to the UK economy in 2023, 2.4 million jobs) and the AI sector, but acknowledges that the evidence base on how copyright reform would affect AI development in the UK remains limited and uncertain. This is the core difficulty: the government cannot model the economic effects with sufficient confidence to act.

On computer-generated works, the position is more definitive. Section 9(3) CDPA protects literary, dramatic, musical and artistic works generated by computer in circumstances where there is no human author, granting 50 years’ protection to the person who made the arrangements necessary for the work’s creation. Most consultation respondents considered that works created solely by AI should not receive copyright protection.

The report recommends removing section 9(3), while retaining protection for AI-assisted works where a human author makes creative choices. This distinction between AI-generated and AI-assisted works will become important for any business producing content using generative AI tools.

As we noted in our analysis of the DUAA’s commencement and ICO enforcement powers, the DUAA is a pragmatic reform that fixes specific problems without departing wholesale from the EU model. The copyright provisions follow the same pattern: they create reporting obligations and narrow the options, but stop short of legislating.

Commercial and operational implications

For organisations that develop or deploy AI systems processing copyrighted material, the copyright and AI report produces three practical consequences.

First, the licensing question remains live. The government has not introduced a training exception and has signalled that it will not do so in the broad form originally proposed. Any AI developer scraping or ingesting copyrighted content for training purposes continues to do so under existing copyright law, which requires a licence unless an existing exception (such as the non-commercial research exception under section 29A CDPA) applies. Developers cannot rely on a future copyright and AI exception that has not been legislated.

Second, the section 9(3) recommendation, if implemented, would remove copyright protection from AI-generated outputs where no human author is involved. Businesses using generative AI to produce marketing copy, reports, code or other content should consider whether their workflows involve sufficient human creative input to qualify as AI-assisted (and therefore protectable) rather than AI-generated (and therefore unprotectable). The distinction will turn on the degree of human selection, arrangement and judgment applied to the AI output. Organisations that treat raw AI outputs as finished products face the greatest exposure.

Third, the transparency agenda is advancing. The House of Lords report recommended granular transparency obligations around AI training data, going beyond the high-level summaries required under the EU AI Act. While no UK legislation is in place, the direction of travel points toward mandatory disclosure of training data sources. Organisations should begin documenting their training data provenance now, both as a compliance preparedness measure and because licensing negotiations will increasingly require it.

For data protection practitioners, and those advising on AI regulation more broadly, the interaction between copyright and AI on the one hand and data protection on the other is worth watching. Personal data used in AI training engages both regimes simultaneously. The ICO’s guidance on AI and data protection already requires a lawful basis for processing personal data in AI training. The copyright report does not address this overlap, but any future exception for AI training would need to be reconciled with data protection requirements.

Viewpoint

The report reflects the political reality that the UK’s creative industries, employing 2.4 million people and contributing GBP 124 billion to the economy, carry more domestic weight than a nascent AI sector whose largest players are predominantly US-headquartered. The government has ruled out its own preferred option but committed to nothing in its place. For rightsholders, that is a partial victory: the status quo, which requires licensing, remains. For AI developers, it is a partial setback: no safe harbour for training on copyrighted material is coming soon.

The practical message on copyright and AI is straightforward. Organisations building or procuring AI systems should not wait for legislation. They should audit their training data sources, establish licensing arrangements where they do not already exist, and document the human creative contribution to any AI-assisted outputs they wish to protect. The section 9(3) question will take time to resolve through legislation, but the principle, that pure AI outputs are unlikely to attract copyright, is already the working assumption for most practitioners. Building workflows around that assumption now will avoid costly reclassification later.

Contact

For advice on copyright and AI compliance, data protection for AI-enabled products, or the implications of the DUAA for your AI strategy, contact Rob Bratby at Bratby Law.
