How Property Rights Would Prevent Robot Overlords From Taking Over The Economy


by Robert P. Murphy, Foundation for Economic Education

A growing number of technological optimists advocate an expanded role for artificial intelligence in both government and the marketplace. Skeptics fear that even the most intelligently programmed artificial agents could end up displacing humans and taking over our institutions in runaway loops of inhuman logic. As we’ll see, the skeptics would benefit from a better understanding of basic economic theory.

Optimist Vitalik Buterin, co-founder of Bitcoin Magazine, asks us to imagine, “a decentralized self-replicating cloud computing service” that “would start off running an automated business on one virtual private server, and then once its profits increase it would rent other servers and install its own software on them, adding them to its network” (“DAOs, DACs, DAs and More: An Incomplete Terminology Guide,” Ethereum blog).

Can the Economy Fall into an Infinite Loop?

In a May 30 blog post titled “Ascended Economy?,” philosophical blogger Scott Alexander worries that under conditions like those Buterin describes, humans could eventually find themselves in what he calls an “ascended economy,” where “economic activity drifted further and further from human control until finally there was no relation at all.”

Alexander’s concerns are superficially plausible, but once we appreciate modern subjective value theory — especially in the Austrian tradition — we see that he misunderstands how the profit-and-loss system works.

Alexander first asks the reader to imagine a company that relies on a human inventor to develop electric car batteries, where the inventor has the goal of helping other humans. He continues:

“Now imagine the company fires the inventor and replaces him with a genetic algorithm that optimizes battery design. It fires all its employees and replaces them with robots. It fires the CEO and replaces him with a superintelligent business-running algorithm. All of these are good decisions, from a profitability perspective.…

Now take it further. Imagine that instead of being owned by humans directly, it’s owned by an algorithm-controlled venture capital fund. And imagine there are no soccer moms anymore; the company makes batteries for the trucks that ship raw materials from place to place. Every non-economic goal has been stripped away from the company; it’s just an appendage of Global Development.

Now take it even further, and imagine this is what’s happened everywhere. Algorithm-run banks lend money to algorithm-run companies that produce goods for other algorithm-run companies and so on ad infinitum.”

Later on, Alexander offers a specific example of a mining-robot company that buys “one input (steel) and produce[s] one output (mining-robots), which it would sell either for money or for steel.” He then asks us to further imagine a steel-mining company that takes “one input (mining-robots) and produce[s] one output (steel).”

The concern here is that the economy might even accidentally produce a feedback loop like this, where no human is involved, and yet the two operations “end up tiling the universe with steel and mining-robots without caring whether anybody else wanted either.”

Breaking the Laws of Reality

This is an intriguing scenario, but ultimately it doesn’t make sense. It is unrealistic to suppose that the operation has no extraneous inputs or “leakage,” such that one robot and one unit of steel can perpetually produce identical physical specimens. For one thing, that would violate the laws of thermodynamics.

But it also violates the laws of economics. The only way both firms could stay in business is if a unit of steel always had the same market value as a mining-robot. That in turn implies that the nominal return in this industry is zero. In other words, it implies that a unit of steel today has the same market value as a unit of steel to be delivered in 10, 20, or 30 years — which would be an amazing coincidence, especially in light of the general fact of positive (long-term) interest rates.
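To see the point about interest rates concretely, here is a small illustrative calculation, written as a Python sketch. The $100 steel price and the 5 percent interest rate are made-up numbers chosen only for the example, not figures from Alexander’s post.

```python
def present_value(future_price: float, annual_rate: float, years: int) -> float:
    """Discount a dollar amount due in the future back to today's value."""
    return future_price / (1 + annual_rate) ** years

steel_price_today = 100.0   # hypothetical spot price of one unit of steel
interest_rate = 0.05        # hypothetical 5% annual nominal interest rate

for years in (10, 20, 30):
    pv = present_value(steel_price_today, interest_rate, years)
    print(f"A unit of steel delivered in {years} years is worth about ${pv:.2f} today")

# With any positive interest rate, steel for future delivery is worth less than
# steel today. The closed loop's requirement -- that a unit of steel have the
# same value now and at every future date -- could hold only by coincidence.
```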

Don’t Forget Property Rights

Yet these are mere quibbles. The real difficulty is that Alexander has implicitly assumed that the mines of iron ore (that’s how you make steel) are either unowned, or are owned by one of the two operations in the loop. There’s no danger of “out of control” robots creating trillions of copies of themselves without human approval, if the humans own the raw materials.

Finally, even if the robots could somehow multiply by only manipulating matter already within their legal control, the problem here would be one of ill-defined property rights. If it would bother humans to know that the solar system is filling up with robots, then assigning property rights to the various segments of space would be the solution.

In this context, Alexander’s worries about an “ascended economy” have nothing to do with private enterprise, and instead are analogous to someone worried about overgrazing cattle on public lands.

Human Subjectivity

Beyond responding to Alexander’s particular illustration, we can say more generally that the modern subjective theory of value shows that market prices can be traced back to the final customer. For example, the modern approach — that is, the approach Carl Menger introduced in his Principles of Economics (1871) — says that people pay certain amounts for bottles of wine because of the marginal utility they get from consuming wine.

Now, once the prices for bottles of wine are known, entrepreneurs can engage in economic calculation and determine how much they would be willing to pay for the factors needed to produce bottles of wine. The fact that (human) wine drinkers are out there, offering dollar bills for bottles of varying quality, ultimately induces bottlers to offer dollar bills to hire workers, buy grapes, build factories to produce bottles, etc.

To put the matter succinctly: In the old classical approach, we would say that a bottle of wine is expensive because it has to be, in order to cover the costs of devoting arable land to grapes. But in the modern subjective approach, we say that arable land is expensive to rent because it can be used to produce grapes, which in turn fetch a high price because they can be used to produce bottles of wine valued subjectively by final customers.
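To make this reversal of causation concrete, here is a small illustrative sketch in Python. Every dollar figure in it is an assumption invented for the example, not data from any real wine market.

```python
# Hypothetical numbers illustrating how entrepreneurs impute factor prices
# backward from the price final customers pay for the consumer good.

wine_price = 20.0        # price a customer pays per bottle (assumed)
other_costs = 12.0       # bottling, labor, transport, etc. per bottle (assumed)
max_bid_for_grapes = wine_price - other_costs        # most a bottler can offer for grapes

growing_costs = 3.0      # grower's non-land costs per bottle's worth of grapes (assumed)
max_bid_for_land_rent = max_bid_for_grapes - growing_costs   # most a grower can offer in rent

print(f"Max bid for grapes (per bottle): ${max_bid_for_grapes:.2f}")
print(f"Max bid for land rent (per bottle's worth of grapes): ${max_bid_for_land_rent:.2f}")

# If customers stopped valuing wine, wine_price would fall and both derived bids
# would fall with it: the land is expensive because the wine is valued by final
# customers, not the other way around.
```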

Although Austrian economists like Mises explicitly focused on consumers being the ultimate powerbrokers in the economy, it might be technically more accurate to refer to customers. The word “consumer” has the connotation that the product or service is serving a personal goal, and this is not always true, except in a vacuous sense.

As excellently argued in The Voluntary City: Choice, Community, and Civil Society (2009), a voluntary society will not rely purely on for-profit organizations. Fans of the free market shoot themselves in the foot if they lead critics to believe that there will be no role for fraternal orders, charities, foundations, or religious entities in a laissez-faire world. On the contrary, without paternalistic mandates and payment schemes from a coercive state, society would need such voluntary but non-commercial operations. Yet these institutions would still be embedded in the larger nexus of property rights and money prices, and they would still need to respond to the values of their paying (contributing) customers.

Thus we see that Scott Alexander’s worries are baseless. So long as there are well-defined property rights, each operation—whether controlled by a human mind or a computer algorithm—must pass the profit-and-loss test. This benchmark ultimately ties the enterprise back to the value judgments of customers, and it is the market’s way of determining whether resources are being used efficiently. Alexander’s attempts to completely sever human influence or benefit from closed loops of automated operations simply don’t work. The only resources the machines would control would be ones they had acquired by providing goods and services to the original (human) owners.
