AI Copyright: Government Rethinks Opt-Out Model After Industry Backlash

The UK government is reassessing its proposed approach to copyright and artificial intelligence following sustained opposition from the creative sector, parliamentary committees and rights holders.

Earlier proposals explored allowing AI developers to use copyrighted material for training unless rights holders actively opted out. That approach has not been taken forward in its original form, with ministers now under pressure to adopt a framework that places greater emphasis on licensing, transparency and creator protection.

Initial government proposals

The government’s consultation on AI and copyright, launched in December 2024, examined how UK law could support the development of AI models while maintaining protections for rights holders. One of the central options was a broad text and data mining exception that would allow AI developers to use copyrighted works for training purposes unless the copyright owner had opted out.

The policy objective was to improve access to high-quality training data and strengthen the UK’s position as a location for AI development. However, the proposal raised immediate concerns about how realistic it would be for rights holders, particularly individuals and small businesses, to monitor and enforce opt-outs across large-scale datasets.

Industry and parliamentary response

The proposal prompted strong resistance from across the creative industries, including music, publishing, journalism and visual arts. Stakeholders argued that an opt-out system would shift the burden onto rights holders and risk undermining established licensing markets.

Parliamentary scrutiny reinforced those concerns. Committees highlighted the potential economic impact on the UK’s creative sector and questioned whether the model struck an appropriate balance between innovation and intellectual property rights. Their reports called for clearer safeguards, improved transparency and mechanisms to ensure fair remuneration where copyrighted works are used.

Current position

There is, at present, no confirmed legislative reform introducing a broad opt-out regime for AI training. The government is continuing to consider consultation responses and policy options.

In the meantime, the legal baseline remains unchanged:

  • UK copyright law continues to apply to the use of protected works in AI training.
  • No general exception permits commercial AI training on copyrighted content without permission.
  • The existing text and data mining exception is limited to non-commercial research under section 29A of the Copyright, Designs and Patents Act 1988.

This means that, in most commercial contexts, the use of copyrighted material for AI training is likely to require a licence or other lawful basis.

AI training under current UK law

For employers and businesses, the immediate issue is not future reform but current compliance. AI systems rely on large datasets, and where those datasets include copyrighted material, organisations may face exposure if appropriate permissions have not been obtained.

Risk typically arises in three areas:

  • Use of third-party datasets where provenance and licensing terms are unclear
  • Deployment of AI tools trained on data that may include protected content
  • Internal data scraping or aggregation exercises involving external sources

Liability can extend beyond developers to commercial users, particularly where businesses integrate AI outputs into products, services or client deliverables.

Data governance and compliance considerations

Organisations adopting AI should treat training data and model inputs as a compliance issue rather than purely a technical matter. This sits alongside existing obligations under intellectual property law, data protection and contractual risk management.

Key steps include:

  • Conducting due diligence on AI vendors, including how training data has been sourced
  • Reviewing contractual terms to allocate risk relating to IP infringement
  • Maintaining records of datasets used in internal AI development
  • Implementing policies on acceptable data sources for training and testing
  • Assessing whether licences are required for specific datasets or content types

These controls are likely to become increasingly important as regulatory scrutiny develops and enforcement activity evolves.

What to expect next

The government is expected to continue engaging with stakeholders before setting out a revised policy direction. Any future framework is likely to focus on balancing access to data for AI development with clearer rights for creators and more workable licensing mechanisms.

Further developments may include:

  • Proposals for standardised or collective licensing models
  • Increased transparency requirements around AI training data
  • Clarification of liability across the AI supply chain
  • Alignment with international approaches to AI and copyright regulation

Until then, businesses should proceed on the basis that existing copyright rules continue to apply in full.

Key takeaways

The UK government has stepped back from its earlier opt-out proposal for AI training, but has not yet introduced a replacement framework. Copyright law continues to apply, and organisations using or developing AI need to assess how training data is sourced, licensed and governed. The direction of policy points towards greater accountability, not less, for the use of protected content in AI systems.

Author

Gill Laing is a qualified Legal Researcher & Analyst with niche specialisms in Law, Tax, Human Resources, Immigration & Employment Law.

Gill is a Multiple Business Owner and the Managing Director of Prof Services - a Marketing Agency for the Professional Services Sector.
