In The Great Stagnation (2011), Tyler Cowen argues that the U.S. has been on an economic plateau since approximately 1973, and that one of the main reasons is a slowing of technological innovation. In particular, he references a graph by the physicist Jonathan Huebner, who took major technological innovations as catalogued in The History of Science and Technology (2004, Bryan Bunch and Alexander Hellemans; 7,198 significant innovations from 1453 to shortly before the book’s publication) and plotted them over time relative to global population (‘A possible declining trend for worldwide innovation’, Technological Forecasting and Social Change, 72:980-986, 2005).
Huebner fit a modified Gaussian distribution to this data; its peak falls in 1873.
In a previous post, commenting on Bruce Charlton’s hypothesis that the absolute amount of scientific advancement is falling, I said:
“Near the end of the 19th century, dramatic technological changes involving electrical power, the internal combustion engine, airplanes, and wireless telegraphy (i.e., radio), to name a few, were taking place. My working guess for a peak for technological change in terms of how it affects people would be then.”
So, Huebner’s graph corresponds, to an extent, with my guess, which was based on reading about historical technological changes and comparing them to my experience of technological change nowadays. My interest in technological change comes from my hunch that scientific ‘output’ is falling relative to ‘input’: given the resources devoted to science, we are getting less scientific progress than in, say, the 19th century. Combine this with the idea that significant technological advance is often driven by significant scientific advance, and significant technological advance becomes a way to measure significant scientific advance.
John Smart, a systems theorist, has responded to Huebner (‘Measuring Innovation in an Accelerating World’, Technological Forecasting & Social Change, 72:988-995, 2005). Smart’s most interesting rejoinder, as I see it, is as follows:
“[T]echnological innovation may be becoming both smoother and subtler in its exponential growth the closer we get to the modern era. Perhaps this is because since the industrial revolution, innovation is being done increasingly by our machines, not by human brains. I believe it is increasingly going on below the perception of humans who are catalysts, not controllers, of our ever more autonomous technological world system.
Ask yourself, how many innovations were required to make a gasoline-electric hybrid automobile like the Toyota Prius, for example? This is just one of many systems that look the same “above the hood” as their predecessors, yet are radically more complex than previous versions. How many of the Prius innovations were a direct result of the computations done by the technological systems involved (CAD-CAM programs, infrastructures, supply chains, etc.) and how many are instead attributable to the computations of individual human minds? How many computations today have become so incremental and abstract that we no longer see them as innovations?
[… Our brains] seem to be increasingly unable to perceive the technology-driven innovation occurring all around us.”
The basic idea here seems to be that much innovation is occurring ‘under the hood’. That is, we just don’t realize how much innovation is involved in creating something like a gasoline-electric hybrid car. So, it might seem that technology isn’t advancing as much as one would expect (given increases in the world’s population), but on this view that isn’t really the case, because the innovation is ‘hidden’.
I’m sure this is true to an extent – there is a large amount of innovation nowadays that we don’t recognize because it’s hidden inside technological artifacts. Yet what Huebner is trying to gauge isn’t ‘innovation’ but ‘important innovation’ (“For the purposes of this paper, the rate of innovation is defined as the number of important technological developments per year divided by the world population.”, p. 981). We don’t care how much more complex something is, or how many brute computations went into it, but whether it solves a relevant problem in a way that’s significantly better than the previous solution.
That is, ‘complexity’ itself isn’t what matters. Any competent designer can tell you that complexity often impedes good solutions rather than advancing them. To say that cars are more ‘complex’ nowadays than before is not to say that today’s solutions are much better – indeed, it hints that the advances are marginal, lacking significant progress in the underlying science.