Is Innovation Doomed?
Back when I was in college and obsessed with things like the classification of 3-manifolds and the zeta function over a finite field, and was thinking about going to graduate school for mathematics, one of the sayings my classmates liked to pull out of their pocket protectors was that the best mathematicians do all their work at a very young age. Everyone knew the story of Evariste Galois, who revolutionized abstract algebra while he was still popping pimples and ended up dying in a duel by the age of 21. Naturally, when I was that age I had the pimples but not much else, which meant that there was no future for me whatsoever. Better to become a hack journalist and be done with it.
At the same time, though, the idea that you either revolutionize a field by the time you're 21 or drop out was plainly ridiculous. As math gets more complex and advanced, a scholar has to master more and more fields, theorems, and techniques just to reach the frontier, so over the years the trend should run the other way: older mathematicians should be making the important discoveries, while the younger ones race to catch up through an ever-expanding stack of textbooks and journals. Evariste Galois, if born today, would be buried in the library rather than revolutionizing anything. Indeed, innovation should get harder and more time-consuming as the years go by, because sadly, no one gets to be born at the frontier of knowledge.
It's not just math, either. Economist Benjamin Jones has just put out a new paper arguing that this trend ought to hold for all technological innovation. It makes sense: innovators can't just stand on the shoulders of giants past; they first have to spend the bulk of their early years climbing up the backs of those giants, so it takes ever longer to acquire the education necessary to do any decent innovating. Although we tend to think of innovation as exponential (the popular gloss on Moore's law has computing power doubling every 18 months), it may actually start slowing down if that "knowledge burden" grows over time. Jones argues that the burden is doing just that. (Yes, his model is more complex than simply pointing out that you need more and more education before you can do original work, but that's basically what it comes down to.) By the same token, the death of the Renaissance Man, the innovator who masters multiple fields, is creeping over the land, and narrowly focused specialists are ruling the roost.
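To see the mechanism, here's a deliberately crude toy model of my own (made-up numbers; Jones's actual model is far richer): each generation of innovators gets a fixed career, training time grows with the accumulated stock of knowledge, and whatever career remains goes to original work.

```python
# Toy sketch of the "knowledge burden" mechanism (hypothetical numbers;
# an illustration only, not Jones's actual model).

CAREER_YEARS = 40  # total working life of an innovator

def innovative_years(knowledge_stock, base_training=8.0, burden=0.5):
    """Career years left for original work after climbing the giants' backs."""
    training = base_training + burden * knowledge_stock
    return max(CAREER_YEARS - training, 0.0)

knowledge = 0.0
for generation in range(8):
    output = innovative_years(knowledge)
    print(f"generation {generation}: {output:.1f} innovative years")
    knowledge += 0.1 * output  # today's discoveries are tomorrow's burden
```

Each generation spends longer in training, so the years available for frontier work shrink; a gadget that lightened the burden (the internet, say) would amount to cutting the `burden` parameter.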
(There's one countervailing trend here that Jones only touches on. Many technological innovations may themselves make people smarter, or at least lighten the "knowledge burden". The internet, for one, is a splendid little way of aggregating a great deal of information, and at a certain point it probably makes people smarter than they would be without it. Obviously you have to know enough to ask the right questions, but once you do, Google gets you down the answer path far more quickly and comprehensively than ever before. Still, on the surface, it doesn't seem like even the internet is enough to counteract the growing "knowledge burden.")
Jones squares his theory with a number of empirical observations. Looking at a vast set of patent data, he finds that the age at first innovation is trending upward at 0.6 years per decade. Specialization is increasing. Co-authorship is increasing. People are taking longer and longer to finish their doctorates. And innovators are, more than ever, working in teams rather than alone. This may also explain why total-factor productivity growth has been more or less flat in recent decades even as R&D spending has risen dramatically in the leading economies. More research spending is presumably still a good thing, but it's not clear that it buys a proportional rise in innovation.
If we want to have a little fun here, we can take this study in places it was never meant to go. On the global stage, for instance, if the rate of technological innovation keeps slowing for those at the forefront, it will be exceedingly hard for any one country to maintain its dominant position in the world. Tom Friedman likes to point at our crouching tiger, hidden dragon friend to the east and declare that the United States needs to innovate, innovate, innovate if we want to stay ahead of China. But what if this is like Achilles chasing the tortoise, with the United States playing the tortoise? Sure, we might always manage to stay a step ahead, but if innovation at the frontier gets ever more difficult while China merely has to play the easier game of catch-up, then eventually they're going to converge with us on the technological front. The same goes for our pre-eminent military position: our National Defense Strategy is, to some degree, predicated on the idea that we'll always have much cooler weapons than our enemies. But what if that gets harder to do over time? Eventually the gap will shrink, no?
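The catch-up logic can be sketched with another toy simulation (again, invented numbers of mine, not anything from Jones's paper): the leader's yearly gains shrink as its frontier, and hence its knowledge burden, grows, while the follower simply closes a fixed fraction of the gap each year by imitation.

```python
# Toy leader/follower race (hypothetical numbers; an illustration, not a
# result from Jones's paper). The leader innovates at the frontier, where
# each step gets harder; the follower imitates known technology.

def simulate(years=50, catchup_rate=0.08):
    leader, follower = 100.0, 40.0
    gaps = []
    for _ in range(years):
        # Leader's innovation rate decays as its frontier (the burden) grows.
        leader += 5.0 * (100.0 / leader)
        # Follower closes a fixed share of the remaining gap by catch-up.
        follower += catchup_rate * (leader - follower)
        gaps.append(leader - follower)
    return gaps

gaps = simulate()
print(f"gap after year 1: {gaps[0]:.1f}, after year {len(gaps)}: {gaps[-1]:.1f}")
```

The follower never quite overtakes, but the gap narrows steadily as frontier innovation slows, which is the Achilles-and-tortoise worry in miniature.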
Meanwhile, the prospects for long-run economic growth, especially productivity growth, start to look pretty grim, especially once you remember that global population growth is slowing over time. (Robust population growth, which supplies society with more and more innovators, doesn't fully offset the "knowledge burden" trend in Jones's model, but it helps at the margin. The problem is that we don't even have robust population growth.)
Of course, the hopeful view is that we will come up with some cool technologies that shrink the "knowledge burden" itself. For example, if we could somehow be teleported onto the shoulders of giants past (imagine a gadget like the one in The Matrix that uploads knowledge straight into the brain), then the rate of innovation would take off again. Alternatively, if people start living and working longer and longer, then it doesn't really matter how long it takes to acquire enough knowledge to come up with an original idea, now does it? If the future still produces its Evariste Galoises, only they happen to be 75 instead of 16, that doesn't seem like such a big deal. The question is whether any of this will be enough to counteract the "knowledge burden." As doomsday scenarios go, it's a fun one.