The booming British industry under threat as we sleepwalk towards AI disaster
Artificial intelligence is putting some of our world-beating companies at risk like never before. Unless we act now, says Chris Blackhurst, there will be no going back
Don’t tell Liam and Noel, but the biggest entertainment event of 2025 won’t be their reunion, it’ll be the launch of Grand Theft Auto VI. The long-awaited video game sequel is expected to generate $3bn (£2.4bn) in its first 12 months – twice the figure earned by its predecessor, GTA V. By comparison, the biggest movies of 2024 – Inside Out 2 and Deadpool & Wolverine – could only muster $1.7bn and $1.3bn respectively. As for the Gallagher brothers and their record-breaking stint at Wembley, they won’t even be close. Already, the trailer for GTA VI has been viewed more than 225 million times on YouTube.
It’s also a rarity these days: a British, world-beating tech success. The GTA series is developed by Rockstar North, the Edinburgh studio that began life as Dundee’s DMA Design. In all, more than 2,000 people will have worked on the new game, in Scotland and elsewhere.
Not only is GTA proof that we should take video gaming more seriously, as something Britain does extremely well – the sector employs thousands and is a huge domestic wealth creator – but it also means we need to pay close attention to anything threatening that hegemony.
This is why, when the head of the trade body for the UK video gaming and interactive entertainment industry warns of growing danger, we should sit up and take notice.
Nick Poole, chief executive of UKIE (the Association for UK Interactive Entertainment), is concerned that we – and especially our government and lawmakers – are too complacent in our approach to artificial intelligence, and that by giving free rein to AI developers and the large language models behind products such as GPT, we risk losing our crown jewels. “If we want GTA VI’s successor to be made in the UK, it’s vital we don’t surrender to AI,” he says.
Poole’s concern is that we’re sleepwalking into disaster. “We really need to start developing category-scale responses to emerging tech rather than eternal catch-up,” he warns.
Last month, the government launched a consultation on copyright and artificial intelligence, which closes at the end of February. It is seeking views on how copyright law can be reshaped both to protect creators and to support AI development. It’s the “both” that is so worrying. “Two major strengths of the UK economy are its creative industries and the AI sector. Both are essential to drive economic growth and deliver the government’s plan for change,” says the ministerial press release.
“Copyright is a key pillar of our creative economy. It exists to help creators control the use of their work and allows them to seek payment for it.” So far so good. But the consultation’s objective is also “ensuring AI developers have access to high-quality material to train leading AI models in the UK and support innovation across the UK AI sector”.
It points towards an outcome of compromise, a product resembling that other British classic: fudge. The fear is that AI will continue to behave as it has done so far – doing precisely as it wishes – except now with some sort of official endorsement.
“It seems axiomatic that there will be copyright in the source material on which the technology is ‘trained’ ... That copyright material ought to have been licensed with the permission of the rightsholder,” says Poole. “That it was not ... is not a flaw in copyright legislation. It’s a flaw in enforcement.”
We appear to be moving rapidly from a system that relies on the default of licensing to one that assumes an automatic right to use published material for AI unless the creator has exercised their right to be excluded.
The first is an “opt-in” that presumes a third party does not have the right to use copyright material unless the rightsholder or creator permits it. The second is an “opt-out” that presumes they do have the right to use the content unless the creator has exercised their right to opt out. “There is no good reason to shift this default,” says Poole, “and nor will doing so unlock innovation.”
It also creates a practical headache, he says, “familiar to anyone who had to configure a ‘robots.txt’ file in the early days of search. It just does not work. It is very much like building sandcastles to hold back a tsunami – technical prevention simply cannot keep pace with infringement.”
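For readers who never had to wrestle with one, the mechanism Poole is describing looks something like this – a hypothetical robots.txt file asking known AI training crawlers (OpenAI’s GPTBot and Common Crawl’s CCBot are real examples) to stay away. The catch, and his point, is that compliance is entirely voluntary:

```
# Hypothetical opt-out: ask AI training crawlers not to scrape this site.
# The file cannot enforce anything - a non-compliant bot simply ignores it.
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /
```

A rightsholder who never publishes such a file has, under an opt-out regime, implicitly permitted the scraping – which is precisely the shift in default that Poole objects to.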
The contrast – and this will appeal to the Oasis duo – is in music, where the principle has long been that sampling requires licensing. But where AI is concerned, we now seem to think it’s OK for AI to use unlicensed copyright material.
“I think it is because governments are so terrified of losing the AI arms race that they are willing to undermine creative rights,” says Poole. “But they could focus instead on winning a different race: to be the prime movers not in ‘AI for its own sake’ but in ‘Responsible AI’ that is both ethical and sustainable.”
Keir Starmer and his colleagues must choose between protecting a UK creative industries sector that fits squarely within their economic growth agenda, or seeing it subsumed in the name of global progress in tech. GTA or GPT? They must decide.