The Lessons of the First Industrial Revolution series is a collaboration between Michael Baxter (Techopian) and MKAI – an international community of leaders in Artificial Intelligence.
The story of the last two hundred years is that technological innovation is followed by economic growth. It will be the same with AI. But the lesson of the last two hundred years is that it takes time.
I would like to start with a mystery. I believe that the golden age of innovation, the era that changed the world, ran from roughly 1867 to 1914. Yet the golden age of economic growth didn’t begin in the West until almost half a century later. Why is that?
In between those two periods, one economist lamented the lack of innovation and came up with the phrase secular stagnation to explain it. Of course, with the benefit of hindsight, we can now say that lack of innovation was never the problem.
And yet today, I often hear much the same argument. The rate of innovation has slowed, say the technology cynics, and that is why the economy is weak; once again, they call it secular stagnation.
Allow me to explain.
Second industrial revolution
Vaclav Smil (a man Bill Gates describes as one of his favourite authors) defines the period of 1867 to 1914 as the age of symmetry. I think you could call it the Second Industrial Revolution. The period began with the invention of dynamite, swiftly followed by the telephone and photographic film. The 1880s saw the first electricity-generating plants, electric motors, steam turbines, the gramophone, cars, aluminium production, air-filled rubber tyres, and pre-stressed concrete. The early 1900s saw the first aeroplanes, tractors, radio signals, plastics, neon lights and assembly-line production.
But instead of an economic boom, we got two world wars, with the Great Depression in between.
In 1938, the economist Alvin Hansen said: “Population growth is fading. There are no new territories to settle and exploit. We can only hope for more technological advancement, so don’t do anything to hamper this last great hope of ours. Except that it seems that we are: the growing power of trade unions, trade associations, and other monopolistic practices are restricting technological advances. This is a great folly.” Hansen summed it all up with just two words: secular stagnation.
I believe that after World War Two, we finally worked out how to turn all those great innovations from an earlier era into technology that supported improving productivity and created products we could use. We also learned how to create the demand necessary to stimulate the economy at a time of rising productivity. The economy boomed for around a quarter of a century after the war. By the mid-1970s, we had largely used up the potential in those innovations from the past, and the economy slowed. Theories galore try to explain the economic problems of the 1970s. I think they ignore the key point: we had fulfilled the potential of past innovation. Thatcherism, Reaganomics and Neo-Liberalism were born in the maelstrom of that time, but I don’t think the underlying problem was understood.
For the next industrial revolution, we had to wait. We had to wait for exponential technological advances to reach a certain tipping point.
In 1987, the Nobel Prize-winning economist Robert Solow said: “You can see the computer age everywhere but in the productivity statistics.” I am not sure whether Solow was being a technology cynic or just trying to provoke. But I would say that Solow was being too hasty in making that statement.
The computer revolution wasn’t everywhere in 1987; I know this because I worked in the computer industry back then. We were still a few years from realising Bill Gates’ dream of a computer on every desk. So no, the economic consequences of the computer age didn’t manifest themselves until the 1990s.
And now it is happening again. Robert Gordon, an economist at Northwestern University, says that technological progress has slowed. Larry Summers, Treasury Secretary under Bill Clinton, talks about secular stagnation.
But I believe that, just like Solow in 1987 and Hansen in 1938, the technology cynics are far too hasty.
It is unrealistic to expect the economic consequences of AI, the Internet of Things, immersive technologies, and indeed other remarkable technologies such as CRISPR/Cas9 and graphene to show up in the productivity statistics yet.
Prof Nicholas Crafts, Professor of Economic History at the University of Sussex Business School, recently said: “It is a common misconception that the First Industrial Revolution is a template for a general-purpose technology (GPT) having a major adverse effect on workers’ living standards. The essence of that industrial revolution was not rapid productivity growth in the short run but the ‘invention of a new method of invention’, which increased technological progress in the long run. Since AI is potentially a general-purpose technology that raises the productivity of research and development, it may be the basis for a Fourth Industrial Revolution.”
He added: “It is highly likely that AI will eventually come to be seen as a classic GPT and deliver the much-needed boost to productivity that techno-optimists envisage once its full potential is realised. However, growth accounting estimates for earlier GPTs show that their impact on productivity takes time to develop.”
I couldn’t agree more.
For a similar reason, I am dubious about the inflation story rattling around at the moment. Sure, we have inflation now, but unlike in the 1970s, I don’t think it will define the decade. The 2020s will be the decade when AI and related technologies finally begin to boost productivity and create another economic boom. It is unlikely we will get high inflation at the same time.