Ostensibly, this article is about the music industry and how artists get paid, but it takes a really smart turn into the current lawsuits about AI "theft" of content. It's pretty long, but well worth the read. Here are a few notable passages:
The economic argument against AI training - "nobody will create if they can't monetise it" - is empirically false. We're witnessing the largest explosion of voluntary creativity in human history, happening right now, while people argue that creativity will die without stronger copyright protection.
But when everyone can access distribution, discovery becomes the new bottleneck. The game changed from "who can press and ship CDs" to "who can get on Today's Top Hits."
Notice the pattern? Panic, lawsuits, then new revenue streams for businesses. Creativity never died - it exploded every time. But the businesses crying wolf always found a way to get paid. The artists who were supposedly being protected? Different story.
Artists accept radius clauses, compete for playlist placement, chase viral TikTok moments. In the attention economy, exposure often matters more than direct payment. That's why bands pay to play Coachella.
To be clear: artists should get paid. But the current system - the one copyright maximalists are desperately defending - already ensures most don't. When Spotify pays $0.003 per stream, when labels take 80% of what's left, when artists pay to play festivals, the system is already broken. The publishers crying "theft" about AI training are the same ones who built an economy where artists create for exposure instead of income. They're not protecting artist compensation - they're protecting their own extraction model.
In fact, the architecture is the opposite of copying. These systems compress patterns from training data into abstract relationships - like how you might remember that "desserts often follow main courses" without memorising every meal you've eaten. (This is why LLMs are prone to cliches!) When they occasionally output something resembling copyrighted work, it's because the training process saw that exact phrase too many times and the statistical weight became too strong. This is a bug, not a feature.
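The "statistical weight" point can be seen in miniature with a toy bigram model - this is my own illustrative sketch, not anything from the article, and real LLMs are vastly more complex. When one phrase is wildly overrepresented in the training data, greedy decoding will regurgitate it verbatim, even though the model stores only counts, not copies:

```python
from collections import Counter, defaultdict

def train_bigrams(corpus):
    """Count word-pair frequencies: a crude stand-in for learned 'statistical weight'."""
    counts = defaultdict(Counter)
    for sentence in corpus:
        words = sentence.split()
        for a, b in zip(words, words[1:]):
            counts[a][b] += 1
    return counts

def most_likely_next(counts, word):
    """Greedy decoding: always pick the highest-weight continuation."""
    return counts[word].most_common(1)[0][0]

# A corpus where one phrase appears far more often than the alternatives.
corpus = ["to be or not to be"] * 50 + ["to err is human"] * 3

model = train_bigrams(corpus)
print(most_likely_next(model, "to"))  # "be" - the overrepresented phrase wins
```

The model never stored the text, only frequencies - yet it reproduces the dominant phrase anyway, which is the memorisation failure mode the passage describes.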
This pattern reveals the real game. The people most upset about AI training aren't worried about creativity. They're worried about their economic niche.
Plagiarism is necessary. Progress implies it.