Tony Blair’s institute just published an AI report calling for the UK to rip up copyright law. Here’s what it says…

Credit: ITS / Shutterstock.com
The Tony Blair Institute was founded by the former UK Prime Minister (pictured)

MBW Reacts is a series of analytical commentaries from Music ComeOn written in response to major recent entertainment events or news stories. Only MBW+ subscribers have unlimited access to these articles.


An institute founded by former UK Prime Minister Tony Blair has published a report containing recommendations that could fundamentally change how music copyright is treated in the UK in the AI age.

The report, titled Rebooting Copyright: How the UK Can Be a Global Leader in the Arts and AI, presents what it calls a “progressive solution” that appears to prioritize AI advancement over established creator rights.

The report doesn’t mince words about its bias toward big tech and AI developers, with its authors boldly stating that “the progressive solution is not about clinging to copyright laws designed for an earlier era but allowing them to co-evolve with technological change.”

Authored by Jakob Mökander, Amanda Brock, Mick Grierson, Kevin Luca Zandermann, and Joseph Bradley, with a foreword by Professor Fernando Garibay of the Garibay Institute, the report argues that “the UK can be home to both cutting-edge AI development and a flourishing creative sector.”

However, its recommendations appear to consistently favor AI developers’ interests, with the report explicitly stating that “there are better ways of supporting the creative industries in the digital age than through restrictive copyright laws for AI-model training.”

The Tony Blair Institute, founded by the former UK Prime Minister, positions itself as a non-partisan think tank that develops “practical solutions” to complex challenges, with a particular focus on technology policy.

But music rightsholders reading this report may find little comfort in its vision of copyright’s future.

Central to the report’s recommendations is support for the government’s proposal for a text and data mining (TDM) exception with an opt-out mechanism that would fundamentally shift how music rights work:

“This would make it legal to train AI models on publicly available data for all purposes, while giving rights holders more control over how they communicate their preferences with respect to AI training,” states the report.

In practice, this means AI companies could freely use music and other content unless creators actively opt out.

The report notes that the TDM exception “is already EU policy, and it has support from the prime minister in the AI Opportunities Action Plan (AIOP)”.

The report, which you can read here, follows ChatGPT maker OpenAI’s call for fundamental changes to copyright law in the United States that would allow AI companies to use copyrighted works without permission or compensation to rightsholders.

OpenAI expressed those views in its submission to the Trump administration’s request for information on developing a national AI Action Plan.

Both OpenAI and Google submitted detailed policy frameworks that could significantly impact music rightsholders and other content creators.

Sir Paul McCartney, Paul Simon, and Bette Midler joined hundreds of Hollywood celebrities in signing a letter pushing back against the proposals.

The Tony Blair Institute report also arrives amid ongoing litigation brought by music rightsholders against AI companies such as OpenAI and Anthropic, as well as AI music generators Udio and Suno.

Here are five key aspects of the report that music rightsholders might find particularly concerning:


1. The report’s suggested ‘AI-preferences standards’ may not adequately protect music copyrights

The report’s first recommendation calls for the strengthening of “AI-preferences standards” for rightsholders.

According to the report, “the UK government should support internationally harmonised AI-preferences standards developed for an effective opt-out regime.”

It adds: “These standards must go beyond the limitations of robots.txt (files used to manage crawler access to a website), offering rights holders greater control over the use of their content while incorporating pragmatic commitments from developers to trace and respect these preferences.”

“The UK government should support internationally harmonised AI-preferences standards developed for an effective opt-out regime.”

While acknowledging that current systems like robots.txt are insufficient, the report’s solution still places the burden on music rightsholders to actively prevent their works from being used, rather than requiring permission first.

This represents a fundamental reversal of how copyright traditionally works. The report advocates for a “tools-not-rules” approach that favors technical solutions over legal enforcement.

According to the report, “open-source tools can play a crucial role in operationalising these standards, providing a ‘tools-not-rules’ approach that fosters innovation. This was suggested for security management in AI at the recent AI Action Summit, through the launch of ROOST.”

For music rightsholders, who have historically relied on clear legal frameworks to protect their work, shifting to technical opt-out tools represents a significant risk. The report acknowledges fundamental problems with this approach:

“The fundamental problem with opt-outs is that the internet is structured around unique resource locators (URLs), but the works protected by copyright are not. For example, a recording of a song encompasses songwriting, performance, recording rights and more.”
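The URL-centric problem the report describes is visible in how today’s robots.txt opt-outs actually operate: permissions attach to web addresses, not to the layered rights behind a recording. A minimal Python sketch of that mechanism (GPTBot and ClaudeBot are real AI crawler user-agents; the site and the “SearchBot” agent are invented for illustration):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt blocking two well-known AI crawlers while
# allowing everyone else. Note the rules key off URLs and user-agents;
# they say nothing about the songwriting, performance or recording
# rights embodied in the file itself.
rules = """\
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: *
Allow: /
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

song_url = "https://example-label.com/catalogue/song.mp3"
print(parser.can_fetch("GPTBot", song_url))     # blocked AI crawler: False
print(parser.can_fetch("SearchBot", song_url))  # any other crawler: True
```

As the sketch shows, an opt-out expressed this way protects a URL only against crawlers that identify themselves and choose to comply, which is why the report concedes that “pragmatic commitments from developers” would still be needed.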


2. There are flaws in the report’s ‘Multi-Pillar Transparency Approach’

The report’s second recommendation advocates for a multi-faceted “transparency approach”, noting that “the UK government should implement policies that include pragmatic disclosures from AI developers, attributional transparency and private regulatory scrutiny.”

For music rightsholders, the vague commitment to “pragmatic disclosures” falls short of providing the detailed information needed to track and control how their works are used in AI systems.

The report acknowledges the limitations of transparency approaches, noting the difficulty of comprehensive URL-level disclosures. According to the report, “hosting a database of tens of billions of URLs could be expensive for small AI developers and push the market even further into concentration.”

What’s concerning for the music industry is that this reasoning prioritizes convenience for AI developers over rightsholders’ need to know exactly which works are being used in training for licensing purposes.

The report also mentions “attributional transparency,” which relies on tools that can extract training data from AI systems after the fact:

“Attribution tools will allow outsiders to detect training data in the output of generative-AI work,” the report adds. “These tools will improve and others will become available as the industry matures, allowing rightsholders who suspect that developers are abusing their results to test this and litigate accordingly.”

Transparency around AI training data is a key issue for the global music industry. Over in the United States, the RIAA, NMPA, and a group of other music organizations recently recommended in their joint AI Action Plan submission that AI companies maintain detailed records of training materials and provide reasonable summaries of works used in AI model development.

They also support the proposed TRAIN Act, which would create a court-administered process for copyright holders to investigate potential unauthorized use of their works.


3. The report has a concerning suggestion for a “one-off exception” to license decades’ worth of content

The report’s third recommendation focuses on establishing standards for AI creativity.

According to the report: “To safeguard the creative industries, clear standards must be established regarding creativity and licensing in AI applications. The UK government should introduce a one-off exception allowing major rights holders to license the past 75 years of content for AI training, as recommended in the AIOP.”

“The UK government should introduce a one-off exception allowing major rights holders to license the past 75 years of content for AI training, as recommended in the AIOP.”

This proposal for a “one-off exception” to license decades of content raises serious concerns for music rightsholders.

The report acknowledges the complexity of music rights, explaining that “the UK has an enormous heritage of art and media that is not available on the open web” and that “this work is worth billions but comes with a challenging rights problem. Each work will be tied up with dozens of rights holders, each exercising complex partial rights.”

Yet despite acknowledging this complexity, the report suggests bypassing normal licensing procedures:

“Only governments can unlock archived content by granting a one-off exception to distribution for rights holders that allows them to relicense archived work for AI training without the explicit permission of all relevant rights holders,” the report states.

For the music industry, particularly those rightsholders with interests in historical recordings, this represents a potential government-sanctioned override of their established rights.


4. They want to set up a “Centre for AI and Creative Industries” and get consumers to pay for it with an ‘ISP levy’

The report’s final recommendation addresses supporting the creative sector’s transition into the AI era:

According to the report, “the UK government should adopt a proactive approach to supporting the creative sector’s transition into the AI era”. It adds that “this can be achieved through targeted funding and the establishment of a new Centre for AI and Creative Industries (CACI).”

To pay for the new Centre for AI and Creative Industries, the report’s authors suggest that consumers should be taxed via a so-called “ISP Levy”.

“as the tax is on consumers, there are no direct impacts on technology companies.”

The report adds: “The purpose of the levy would be to facilitate the transition of the creative industries into the generative-AI era in a socially progressive way, and to recognise the existence of bad actors in the scraping world. It would therefore not need to raise huge amounts to meet its goals.

“Working on the basis that there are 116.1 million subscribers to mobile-data plans (including machine-to-machine) and 28.5 million broadband subscribers, with monthly average subscriptions of £20 and £50 respectively, a tax rate of 0.1 per cent would yield total revenue of nearly £45 million. To reach a target revenue of £200 million, the tax rate could be increased to 0.44 per cent, resulting in consumers paying only about 31p extra per month.”
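The levy figures quoted above can be sanity-checked with quick arithmetic. All inputs below are the report’s own assumptions; the 31p result assumes a consumer paying for one mobile plan and one broadband plan:

```python
# Back-of-envelope check of the report's ISP-levy arithmetic.
mobile_subs = 116.1e6    # mobile-data subscriptions (incl. machine-to-machine)
broadband_subs = 28.5e6  # broadband subscriptions
mobile_fee, broadband_fee = 20.0, 50.0  # average monthly subscription, GBP

# Total annual subscription spend the levy would be charged against.
annual_base = 12 * (mobile_subs * mobile_fee + broadband_subs * broadband_fee)

print(f"Revenue at a 0.1% rate:  £{annual_base * 0.001 / 1e6:.0f}m")  # ≈ £45m
print(f"Rate needed for £200m:   {200e6 / annual_base * 100:.2f}%")   # ≈ 0.44%
print(f"Monthly cost at 0.44% for one mobile + one broadband plan: "
      f"{(mobile_fee + broadband_fee) * 0.0044 * 100:.0f}p")          # ≈ 31p
```

The numbers do reconcile with the report’s quoted figures, though the 31p-per-month framing only holds for a consumer paying both a mobile and a broadband bill.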

The good news for tech companies, according to the report, is that “as the tax is on consumers, there are no direct impacts on technology companies, apart from the reduction in revenue implied by the small income effect and intermediate effects on ISPs.”

The report later states that this “remuneration could also be directed to those artists who choose not to opt out, as a reward for their contribution to AI development.”

However, the report’s calculation raises serious questions about whether such compensation would be proportionate to the value of music being used by AI systems.

For context, global recorded music revenues alone, according to the latest figures from IFPI, were $29.6 billion in 2024.


5. The report suggests that AI training is just like human learning

One of the report’s key arguments in support of its position on AI model training is its comparison between AI training and human learning.

The report claims: “To argue that commercial AI models cannot learn from open content on the web would be close to arguing that knowledge workers cannot profit from insights they get when reading the same content.”

“To argue that commercial AI models cannot learn from open content on the web would be close to arguing that knowledge workers cannot profit from insights they get when reading the same content.”

The report adds: “There are also better ways of supporting the creative industries in the digital age than through restrictive copyright laws for AI-model training. The question is not whether generative AI will transform creative industries (it already is) but how to make this transition equitable and beneficial for all stakeholders.

“AI is already being integrated into creative workflows, automating routine tasks while enabling new forms of expression. Moreover, the economic impact will vary across sectors and individuals. Rather than fighting to uphold 20th-century regulations, rights holders and policymakers should focus on building a future where creativity is valued and respected alongside AI innovation.”

“Was Tracey Emin expected to reimburse Louise Bourgeois for the transformative experience she had upon encountering her work at the Tate in 1995?”

The report elaborates on its human-AI learning comparison elsewhere, offering the following example to support its argument:

“For example, many workers in the knowledge economy read the news,” the report’s authors write.

“They then sell their general knowledge, including what they learned reading the news, as part of their work. They owe no extra money to the newspaper beyond what they might have paid to access it.

“They might cite the paper, but this is not required by law. Similarly, artists visit galleries, often with no entrance fee, to explore a variety of creative works.

“Was Tracey Emin expected to reimburse Louise Bourgeois for the transformative experience she had upon encountering her work at the Tate in 1995? The relationship between originality and imitation has always been ambivalent, from classical art to the present day.”
