AI Music is Everywhere. Is it Legal?

October 21, 2025

AI art is everywhere now. According to the music streaming platform Deezer, 18 per cent of the songs being uploaded to the site are AI-generated. Some of this stuff is genuinely cool and original – the kind of work that makes you rethink what art is, or what it could become.

But there are also songs that sound like Drake, cartoons that look like The Simpsons, and stories that read like Game of Thrones. In other words, AI-generated work that’s clearly riffing on – or outright mimicking – other people’s art. Art that, in most of the world, is protected by copyright law. Which raises an obvious question: how is any of this legal?

The AI companies claim they’re allowed to train their models on this work without paying for it, thanks to the “fair use” exception in American copyright law. But Ed Newton-Rex has a different view: he says it’s theft.

Newton-Rex is a classical music composer who spent the better part of a decade building AI music generators, most recently for a company called Stability AI. But when he realized the company – and most of the AI industry – didn’t intend to license the work they were training their models on, he quit. He has been on a mission to get the industry to fairly compensate creators ever since. I invited him on the show to explain why he believes this is theft at an industrial scale – and what it means for the human experience when most of our art isn’t made by humans anymore, but by machines.
