What are the prospects for long-run economic growth? The present study looks at a recently launched hypothesis, which I label the Singularity. The idea here is that rapid growth in computation and artificial intelligence will cross some boundary, or Singularity, after which economic growth will accelerate sharply as an ever-increasing pace of improvements cascades through the economy. The paper develops a growth model that features a Singularity and presents several tests of whether we are rapidly approaching it. The key question for the Singularity is the substitutability between information and conventional inputs. The tests suggest that the Singularity is not near.
Are We Approaching An Economic Singularity? Information Technology And The Future Of Economic Growth – Introduction
What are the prospects for long-run economic growth? One prominent line of economic thinking is the trend toward stagnation. Stagnationism has a long history in economics, beginning prominently with Malthus and surfacing occasionally in different guises. Recurring themes include the following: Will economic growth slow and perhaps even reverse under the weight of resource depletion? Will overpopulation and diminishing returns lower living standards? Will unchecked CO2 emissions lead to catastrophic changes in climate and ecosystems? Have we depleted the store of potential great inventions? Will the aging society lead to diminished innovativeness?
However, the present study looks at the opposite idea, a recently launched hypothesis which I label the Singularity. The idea here is that rapid growth in computation and artificial intelligence will cross some boundary, or Singularity, after which economic growth will accelerate sharply as an ever-increasing pace of improvements cascades through the economy. The most prominent exponents are computer scientists (see the next section for a discussion and references), but a soft version of this theory has recently been advanced by some economists as well (Brynjolfsson and McAfee, 2014).
At the outset, I want to emphasize that this is not a tract for or against the Singularity. Rather, the purpose is two-fold. First, I lay out some of the history and current views and show an analytical basis for rapidly rising economic growth. Next, I propose several diagnostic tests that might determine whether a Singularity is occurring and apply these tests to recent economic behavior in the United States. In the end, I hope that the analysis and tests will allow us to keep a running scoreboard as to whether the economic universe is on a stagnationist or accelerating path … or possibly in that middle ground of steady growth.
II. Artificial Intelligence and the Singularity
For those with a background primarily in economics, the present section is likely to read like science fiction. It will explain the history and a modern view of how rapid improvements in computation and artificial intelligence (AI) have the potential to increase their productivity and breadth to the extent that human labor and intelligence will become increasingly superfluous. The standard discussion in computer science has no explicit economic analysis and leaves open important economic issues that will be addressed in later sections.
It will be useful to summarize the argument before giving further background. The productivity of computers and software has grown at phenomenal rates for more than a half-century, and rapid growth has continued up to the present. Developments in machine learning and artificial intelligence are taking on an increasing number of human tasks, moving from calculations to search to speech recognition, psychotherapy, and robotic activities on the road and battlefield. At the present rate of growth in computational capabilities, some have argued, information technologies will attain the skills and intelligence of the human brain itself. For discussions of the background and trends, see Moravec (1988), Kurzweil (2000, 2005), and Schmidt and Cohen (2013).
The foundation of the accelerationist view is the continuing rapid growth in the productivity of computing. One measure of this productivity is the cost of a standardized operation in constant prices, shown in Figure 1. The cost of a standard computation declined at an average rate of 53% per year over the period 1940-2012. There may have been a slowing in the speed of chip computations over the last decade, but the growth in parallel, cloud, and high-performance clusters, as well as improvements in software, appears to have offset that for many applications.
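To convey the scale of such a decline, a short back-of-the-envelope calculation helps. The sketch below takes only the two figures stated in the text (a 53% average annual decline over 1940-2012); it is an illustration of the compounding arithmetic, not a reproduction of the data behind Figure 1:

```python
import math

# Figures taken from the text: costs fall 53% per year over 1940-2012.
annual_decline = 0.53
years = 2012 - 1940  # 72 years

# Each year the cost is multiplied by (1 - 0.53) = 0.47,
# so the cumulative factor over the period is 0.47 ** 72.
cumulative_factor = (1 - annual_decline) ** years

# Time for the cost to fall by half at this rate:
# solve (1 - 0.53) ** t = 1/2 for t.
halving_time = math.log(2) / -math.log(1 - annual_decline)

print(f"Cumulative cost factor over {years} years: {cumulative_factor:.2e}")
print(f"Implied halving time: {halving_time:.2f} years")
```

At this rate, the cost of a standard computation halves in under a year, and over the 72-year span it falls by a factor on the order of 10^23, which is why even modest substitutability assumptions matter so much for the growth model developed later.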