UK Government’s AI bill falters amid backlash from artists including Elton John and Dua Lipa
The British government’s ambitious plan to introduce a new data bill facilitating artificial intelligence (AI) training has hit a major stumbling block following fierce opposition from artists, authors, and creatives. The proposed legislation, which included a controversial copyright exception for commercial AI training, has been sharply criticised for undermining creators’ rights.
Introduced by Prime Minister Keir Starmer’s Labour government in October last year, the Data (Use and Access) Bill was designed to support technological innovation by allowing AI developers to train their models on vast amounts of existing content. Crucially, the draft bill contained provisions that would have enabled companies to use copyright-protected works for AI training without seeking prior consent from creators or rights holders. This move, the government argued, was necessary to ensure the UK remained at the forefront of AI development.
However, the creative community swiftly mobilised against the proposal, warning that it would amount to the legalised theft of intellectual property (IP). Global icons including Sir Elton John and Dua Lipa, along with authors such as Philip Pullman, joined the chorus of voices denouncing the bill, warning that it set a dangerous precedent that would allow tech giants to exploit the creative works of others without fair compensation.
In a significant development on Monday, 12 May, the House of Lords voted in favour of an amendment to the bill that strengthens copyright protections. The amendment requires creators to give explicit permission for their work to be used in the training of generative AI systems. Furthermore, it mandates that artists, writers, and other creators must be notified about what content has been used, when, and by whom.
The amendment is seen as a major victory for the creative sector, which has long argued that unchecked AI training on copyrighted works threatens their livelihoods. Campaigners have voiced fears that AI models trained on copyrighted material without permission would ultimately produce outputs that directly compete with human-created content, thereby eroding the economic value of the original works.
Baroness Beeban Kidron, one of the leading voices behind the campaign against the bill, told the BBC that tech companies were effectively “stealing some of the UK’s most valuable cultural and economic assets.” While she acknowledged the potential of AI to bring creative and economic benefits, she argued that such progress must not come at the expense of creators’ rights. “We do not accept that we should have to build AI for free, with our work, and then rent it back from those who stole it,” she added.
Kidron went on to underline the scope of the issue, noting that everything from the Harry Potter franchise to the entire back catalogue of UK music publishers, the voices of iconic actors such as Hugh Grant, and even the intellectual property of universities, museums, and libraries could be exploited under the proposed law.
The creative industry’s concerns were echoed by a coalition of trade bodies representing musicians, authors, photographers, and designers, who warned that the government’s original bill risked undermining the country’s rich creative heritage for the benefit of a handful of large technology firms.
Following the Lords’ intervention, the bill will now return to the House of Commons, where MPs will decide whether to accept the changes. Although the government has insisted that fostering AI innovation remains a priority, the strong cross-party support for the amendment suggests ministers may be forced to rethink their approach.
As the debate continues, this episode has reignited broader questions over how the UK balances its ambitions to become a global AI powerhouse with the need to protect the rights and livelihoods of its creative workforce. The outcome is likely to have far-reaching implications not just for the UK, but for global discussions on copyright, AI, and fair use.