Government opens consultation on copyright and AI
Seeks a workable balance between creatives and AI developers
The UK government has announced a consultation on how copyright materials can be used to train AI models.
It comes as prominent artists including musician Paul McCartney and novelist Kate Mosse have warned of a threat to creators’ livelihoods if copyright rules are relaxed.
With the UK AI Bill set to be introduced early next year, the use of copyrighted works has emerged as a sticking point.
Industry body TechUK has called the current arrangements, a mix of ad hoc deals and post facto legal action, “the worst of all worlds”, arguing that there should be a clear opt-out option for holders of licensed content to give clarity to developers and assurance that they will not be sued down the line.
A government press release says the consultation, which will run until 25th February 2025, aims to “drive growth across both [creative and AI] sectors by ensuring protection and payment for rights holders and supporting developers to innovate responsibly.”
It continues: “The consultation also explores how creators can license and be remunerated for the use of their material, and how wide access to high-quality data for developers can be strengthened to enable innovation across the UK sector.”
The government notes that previous attempts at voluntary agreements over the use of copyrighted materials for training have failed.
Some publishers, including the Financial Times, Condé Nast, News Corp and The Atlantic, have signed deals with AI companies, but many more, including The New York Times, The Center for Investigative Reporting, a coalition of Canadian publishers and broadcasters, and Mumsnet, are suing OpenAI and Microsoft. News Corp, despite signing a deal with OpenAI, is suing AI search engine Perplexity.
In short, it’s a mess.
“While licensing deals provide a model for fair remuneration, the lack of uniform legal frameworks creates uncertainty for both creators and developers,” commented Ralph Arrate, data protection, AI and cybersecurity partner at law firm Spencer West LLP. He added that the consultation is a “much-needed step towards clarity in the complex relationship between copyright and AI.”
The government consultation proposes “introducing an exception to copyright law for training for commercial purposes” while allowing rights holders to opt out “so they can control the use of their content.”
This would require AI developers to “be more transparent about their model training datasets and how they are obtained.”
Secretary of State for Science, Innovation and Technology, Peter Kyle, said: “This is all about partnership: balancing strong protections for creators while removing barriers to innovation; and working together across government and industry sectors to deliver this.”
Baroness Stowell, chair of the House of Lords Communications and Digital Committee, which has undertaken significant work on the impact of AI on news organisations, welcomed the consultation, but emphasised that strong enforcement will be crucial.
“If this consultation leads to an ‘opt-out’ model, it is vital publishers and other copyright holders have the necessary information to understand how and where their data is being used and are supported by enhanced enforcement mechanisms so they can enforce their copyright and block AI web crawlers should they choose to do so,” she said.
“It is not fair to expect rights holders alone to enforce their copyright in an often-mismatched legal battle against the bottomless pockets of the tech giants.”
Spencer West’s Arrate added: “To foster innovation without undermining creators’ rights, any reform must prioritise transparency and enforceability.
“AI firms should be required to disclose the datasets used for training, ensuring compliance with copyright law. Simultaneously, streamlined licensing mechanisms—perhaps through collective licensing or standardised agreements—could enable creators to benefit financially while supporting AI development.”
Ultimately, international arrangements will be needed, he said.