The Tony Blair Institute for Global Change (TBI) has released a new report arguing that the United Kingdom should take the lead in shaping how artificial intelligence (AI) is applied to the creative arts. The report, Rebooting Copyright: How the UK Can Be a Global Leader in the Arts and AI, claims that Britain is uniquely positioned to set global standards for AI-driven creativity. Its recommendations, however, have drawn sharp criticism from artists, legal experts, and technology ethicists alike.
AI as the Next Great Creative Disruptor
The report compares today's advances in AI with earlier technological revolutions such as the printing press, the camera, and the internet, each of which rapidly and radically changed how art is produced and consumed. AI, it argues, is already transforming the creative industries by fuelling new forms of music, visual art, and literature.
Rather than replacing human creativity, the report suggests AI will expand it, leading to:
- Interactive and personalized art, where audiences influence storytelling in real time.
- AI-assisted creativity, helping writers, musicians, and filmmakers refine their work.
- A resurgence in human-made art, as audiences seek authenticity beyond AI-generated content.
Beyond the arts, AI is accelerating breakthroughs in medicine, disaster response, and scientific research. The report warns that countries slow to adapt risk falling behind in both cultural influence and economic growth.
The UK’s AI Ambitions—and the Copyright Dilemma
AI leadership has become one of the UK's top priorities. Prime Minister Keir Starmer announced the AI Opportunities Action Plan in January 2025, aiming to position Britain as a global hub for AI innovation. The breakneck pace of AI adoption, however, has raised urgent questions about copyright and training data that demand legal and ethical scrutiny.
The Current Debate: Opt-Out vs. Licensing
AI models like ChatGPT and Midjourney are trained on vast amounts of copyrighted books, music, and artwork. Under current UK law, using such material without permission could infringe on creators’ rights. The government has proposed a text and data mining (TDM) exception, allowing AI firms to scrape copyrighted content unless creators explicitly opt out.
The TBI report cautiously supports this approach but acknowledges major challenges:
- Enforcement difficulties: How will artists, especially independent ones, know if their work is being used?
- Legal uncertainty: Will an opt-out system hold up in court?
- Global inconsistency: Other jurisdictions, such as the EU and the US, have stricter rules, potentially putting UK AI firms at a disadvantage.
Critics argue that the opt-out model shifts the burden onto creators, many of whom lack the resources to monitor AI training datasets.
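In practice, an opt-out of this kind would almost certainly rely on machine-readable signals that crawlers are expected to honour, such as robots.txt directives of the sort some AI crawlers (for example, OpenAI's GPTBot) already respect. The minimal Python sketch below assumes a hypothetical crawler token, "ExampleAIBot", and shows how a scraper could check such a signal before collecting a page; it also illustrates the critics' point, because a missing signal counts as consent by default.

```python
from urllib import robotparser

# Hypothetical user-agent token for an AI training crawler; a real deployment
# would publish its own token so creators know what to block.
USER_AGENT = "ExampleAIBot"


def may_collect(page_url: str, robots_url: str) -> bool:
    """Return True if the site's robots.txt permits this crawler to fetch the page.

    A missing or unreadable robots.txt is treated here as "no opt-out expressed",
    which is exactly the default that critics of opt-out schemes object to.
    """
    parser = robotparser.RobotFileParser()
    parser.set_url(robots_url)
    try:
        parser.read()  # fetch and parse the site's robots.txt
    except OSError:
        return True  # no machine-readable opt-out could be found
    return parser.can_fetch(USER_AGENT, page_url)


if __name__ == "__main__":
    # Hypothetical artist site used purely for illustration.
    url = "https://example-artist.co.uk/portfolio/painting-01.html"
    print(may_collect(url, "https://example-artist.co.uk/robots.txt"))
```

Note that a check like this is entirely voluntary on the crawler's side, which is why enforcement worries loom so large in the debate.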
Backlash from the Creative Community
The report has faced strong opposition from artists, authors, and legal experts who believe it underestimates the risks AI poses to human creators.
Key Criticisms:
Misleading Claims on Copyright Clarity
Ed Newton-Rex, CEO of Fairly Trained, argues that UK copyright law is already clear: AI training requires permission. The report’s suggestion that the law is uncertain, he says, benefits tech companies over artists.
Opt-Out Means Less Control, Not More
Newton-Rex warns that an opt-out system would strip creators of their rights by default. Many—especially smaller artists—won’t know how or when to opt out, leaving their work vulnerable.
False Equivalence Between AI and Human Learning
The report compares AI training to how humans learn from books and music. Critics say this is misleading: unlike a human learner, AI can replicate and distribute works at an unprecedented scale.
Ignoring the Threat to Creative Jobs
The report downplays evidence that AI-generated content is already displacing human illustrators, writers, and musicians. Newton-Rex calls this omission “a major flaw.”
Questionable Policy Solutions
The TBI proposes a taxpayer-funded Centre for AI and the Creative Industries, but critics ask: Why should the public pay instead of AI companies?
British novelist Jonathan Coe also notes that none of the report’s authors are artists, raising concerns about bias toward tech interests.
Alternative Solutions: What Could Work?
While the TBI’s recommendations are contentious, the debate highlights the need for balanced policies that foster innovation while protecting creators. Possible alternatives include:
1. Mandatory Licensing & Royalty Systems
- AI companies could pay fees to use copyrighted works, similar to how streaming services compensate musicians.
- Collective licensing pools (like ASCAP for music) could simplify payments for writers and visual artists; a simple pro-rata sketch follows this list.
2. Stronger Transparency & Consent Rules
- Require AI firms to disclose training data sources.
- Give creators opt-in control rather than forcing them to opt out.
3. Legal Clarity Through Case Law
- Courts may need to rule on whether AI training constitutes fair use or infringement.
- The EU’s AI Act already sets stricter rules—will the UK follow or diverge?
4. Direct Support for Artists in the AI Era
- Government grants for human-led creative projects.
- Tax incentives for companies that employ human artists alongside AI tools.
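To make the licensing idea in option 1 concrete (as flagged above), here is a minimal, purely illustrative Python sketch of how a collective pool might split AI-training fees pro rata by usage, loosely mirroring streaming-style payouts. The function name, the figures, and the rights holders are hypothetical; nothing here comes from the TBI report.

```python
def split_royalty_pool(pool_gbp: float, usage_counts: dict[str, int]) -> dict[str, float]:
    """Split a licensing pool pro rata by how often each work was used in training.

    pool_gbp is the total fee paid in by AI firms for the period;
    usage_counts maps each rights holder (or work) to a usage count.
    """
    total = sum(usage_counts.values())
    if total == 0:
        return {holder: 0.0 for holder in usage_counts}
    return {holder: pool_gbp * count / total for holder, count in usage_counts.items()}


# Example: a £1,000,000 pool split across three hypothetical rights holders.
payouts = split_royalty_pool(
    1_000_000,
    {"novelist_a": 120, "illustrator_b": 480, "composer_c": 400},
)
print(payouts)  # {'novelist_a': 120000.0, 'illustrator_b': 480000.0, 'composer_c': 400000.0}
```

The hard part in practice is not the arithmetic but agreeing on what counts as "usage" of a work in a training run, which is why transparency rules (option 2) tend to accompany licensing proposals.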
The Global Stakes: Will the UK Lead or Lag Behind?
The UK’s approach to AI and copyright could have far-reaching consequences:
- If it is too permissive, artists may lose income, leading to a decline in original human creativity.
- If it’s too restrictive, AI innovation could move to countries with looser laws, costing the UK its competitive edge.
The TBI insists that “bold policy solutions” are needed to strike the right balance. But with artists and tech giants at odds, finding middle ground won’t be easy.
Conclusion: A Defining Moment for Creativity and Technology
The TBI report has ignited a crucial debate: How can the UK harness AI’s potential without undermining the artists who fuel its growth? While the government pushes for AI leadership, the creative community demands stronger safeguards. The path forward will require compromise—ensuring that innovation doesn’t come at the expense of human expression.
As AI continues to evolve, one thing is clear: The decisions made today will shape the future of art, law, and technology for decades to come. The UK must choose wisely.