Ursula K. Le Guin once wrote, “There is no right answer to the wrong question.” And while AI might struggle to understand the quip, we human readers get the point immediately. To solve a problem, you have to focus on what really matters about it.
That’s the problem with the ongoing debate, most recently joined by Professor Nicholas Creel, over whether artificial intelligence models “learn” the same way humans do.
The simple answer, spelled out well in Erik J. Larson’s book “The Myth of Artificial Intelligence,” is that they don’t. They can’t handle or generate novelty, they become stymied when forced to operate with uncertainty or incomplete information, and they lack empathy and any real capacity for understanding. They can copy and exploit almost infinite volumes of data, but they cannot extrapolate or intuit new truths from mere scraps of information as humans can.
I asked Microsoft’s AI, Copilot, the question, “Does AI reason in the same way humans reason?” This is the answer I received:
“AI relies on large datasets and algorithms to make decisions and predictions. It processes information based on patterns and statistical analysis. AI follows predefined rules and models to arrive at conclusions. It doesn't have intuition or emotions influencing its decisions. AI can learn from data through techniques like machine learning, but this learning is based on mathematical models and not personal experiences.”
As the Human Artistry Campaign’s Moiya McTier has explained, real creativity flows from far more than crunching big data sets to pull out patterns and connections. It “is the product of lived experience and grows organically from the culture, geography, family, and moments that shape us as individuals.”
Thus, it’s clear that AI learns and produces outputs in fundamentally different ways than humans do.
But for those of us living and dealing with AI in the real world, including musicians like me whose work has been scraped off the internet and fed into AI models without any kind of consent, it’s also clear this sterile philosopher’s dispute isn’t that important. Per Le Guin, the real question we should be asking is what AI does — and whether that is worth the cost.
To develop their models and launch their pattern-detecting algorithms, AI companies indisputably must cause a machine to reproduce copyrighted works, produce new works derived from those copyrighted works, and distribute them across a large network. These are three exclusive rights reserved to authors under federal law.
Normally, a company that wishes to engage in this kind of activity would simply license the works from their authors. But AI companies, ostensibly competitors, have all pretty much decided not to license the works but to use them anyway, effectively setting the price for these copyrights at zero.
That’s a mass devaluing of the world’s creative legacy — a huge cost in lost opportunities and jobs for real people. What’s more, it will create a dumbed-down and derivative culture and, if left unchecked too long, a gaping empty hole where the next generation of truly fresh or ...