Is Tokenization In Its Infancy?

Tokenization is one aspect of a rapidly evolving movement toward more fluid and decentralized forms of value exchange. This movement made its way into the mainstream with Bitcoin and Ethereum, the largest cryptocurrencies by trading volume, and has progressed with the advent of updated blockchains that support new digital currencies.

Tokenization is the process of assigning a token as a unit of value for the specific asset it represents. In the blockchain and cryptocurrency spaces, a token is a digital unit of value, defined by a computing protocol, that represents an asset so it can be used as a medium of value exchange in a network.
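The idea of a token as a unit of value backed by a specific asset can be sketched in a few lines of code. This is a purely illustrative example, not any real protocol; the class, asset, and figures are hypothetical:

```python
from dataclasses import dataclass

# Hypothetical sketch of tokenization: a real-world asset is divided
# into a fixed supply of fungible token units, each a tradable claim
# on a fraction of the underlying asset's agreed value.

@dataclass
class AssetToken:
    asset_name: str        # the underlying asset the token represents
    total_units: int       # total token units issued against the asset
    unit_value_usd: float  # agreed value per unit at issuance

# Issue 1,000 token units against a $250,000 property:
token = AssetToken("123 Main St property", 1_000, 250.0)

# The token supply's total value matches the asset it represents.
print(token.total_units * token.unit_value_usd)  # 250000.0
```

The point of the sketch is simply that the token's value is derivative: it is worth whatever the network agrees the underlying asset is worth, divided across the units issued.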

Though these spaces are relatively nascent industries, tokenization is not a new concept. In a sense, the U.S. dollar is an analog, generic, traditional asset token: It is considered legal tender for all public and private debts, is backed by the government, and like other fiat currencies, it represents a unit of value that people agree upon. If you were to tokenize the dollar on a blockchain, it would be worth about a dollar.

As recently as 1950, there were thousands of functioning forms of tokenized currency in the U.S. alone. Individual banks issued parallel banknotes, while industrial and agricultural trading companies would write their own “scrip” to streamline transactions with suppliers and buyers. Additionally, local and state governments, retailers, and property brokers offered alternate forms of currency for trade, often because the supply of actual government-issued currency was quite scarce or illiquid.

Even now, we use utility tokens for everyday services like car washes, laundromats, and arcades. We buy flights using airline loyalty points, gamble with casino chips, and trade in various gift cards. In this sense, tokenization doesn’t seem like a new concept.

Tokenization has become a hot topic largely due to the recent phenomenon of Initial Coin Offerings, or ICOs, which can also be called Token Generation Events, or TGEs. Whether it is called an ICO or TGE, a company will issue a new cryptocurrency, digital coin or token that investors can purchase as an investment in the company. Consequently, the company can raise millions of dollars in capital without using the conventional VC funding model. It is no wonder that in the last three years, we have gone from just a handful of known cryptocurrencies to more than 4,000.

The ICO model has led to many unrealistic market valuations; however, the rapid growth rate of the space attracts many buyers and founders who want to be part of the next big thing. Speculation aside, it is more instructive to focus on the value that tokenization can provide for the assets it represents and the networks it supports. If the token is intrinsically required to support the business function of the project, it is more likely to be on the right track.

Serious blockchain project founders, engineers, and analysts are not talking about tokens in terms of prices and market valuations. They are interested in advancing tokens as a new way of exchanging economic value to build decentralized networks and get real work done. Indeed, in Stephen McKeon's blog post on traditional asset tokenization, he hints at a coming wave of tokenization of things far beyond the realm of cryptocurrencies and financial-type transfers of value.

Tech investor Nick Tomaino breaks down tokens into four categories: traditional asset tokens, usage tokens, work tokens, and hybrid tokens.

Traditional asset tokens are the crypto representations of standard fiat assets. Usage tokens provide access to a digital service; a good example of this is Bitcoin, where users are granted access to a virtual payments network. Work tokens, meanwhile, offer users the right to contribute work to a decentralized organization. A popular instance of this type of token is Maker, which serves as the backstop in a collateralized debt system. Finally, hybrid tokens, a mix of usage and work coins, can be used for multiple purposes; one example of this is Ethereum tokens backed by the Proof-of-Stake algorithm, a system in which the creator of the next block is determined in a pseudo-random way, with a user's chances depending on the amount of wealth in their account.
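The stake-weighted selection behind Proof-of-Stake can be illustrated with a short simulation. This is a simplified sketch of the general idea, not any specific chain's implementation; the account names and balances are invented:

```python
import random

# Illustrative Proof-of-Stake selection: the next block's creator is
# chosen pseudo-randomly, with each account's chance proportional to
# its stake (its balance relative to the total staked).

stakes = {"alice": 50, "bob": 30, "carol": 20}  # hypothetical balances

def pick_proposer(stakes, rng):
    # random.choices performs weighted sampling: an account holding
    # 50% of the total stake is picked roughly 50% of the time.
    accounts = list(stakes)
    weights = [stakes[a] for a in accounts]
    return rng.choices(accounts, weights=weights, k=1)[0]

rng = random.Random(42)  # fixed seed for reproducibility
picks = [pick_proposer(stakes, rng) for _ in range(10_000)]

# alice holds 50% of the stake, so she wins about half the draws.
print(round(picks.count("alice") / len(picks), 2))
```

Over many rounds, each account's share of blocks converges on its share of the stake, which is what aligns block production with wealth held in the system.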

In conventional terms, everyone understands the idea of assigning value to an asset, whether it is a dollar, product, or house. We pay for usage of a service, whether that’s your cell phone plan, Netflix, or a gym membership. We can also grasp the idea of paying to work: You might rent an office in a location close to your customers, pay an ownership share to join a legal or medical firm, or obtain a commercial license to drive a truck. Finally, with respect to the hybrid model, you can pay dues to a professional organization or union to both obtain work and take advantage of shared services and data.

The major development is that tokenization on blockchains and decentralized networks can now create a significant increase in the ease of value exchange across all these dimensions. Trades can be fairer, transactions can happen faster, and costs can be lowered. In supply chain industry terms, tokenization provides units of value exchange that can cut costs and increase efficiency at every stage, from the moment a customer places an order, to the end state when the order is delivered and all suppliers are paid for their contributions. Tokens that are purpose-built for the industries and networks they support move much faster than money, and accomplish much more.

When applied correctly, tokenization aligns incentives for decentralized organizations to build customer-oriented solutions in a more engaged way, thus making winners of all participants, rather than focusing on growth at all costs, shareholder value, or exit strategies.

As the world becomes more decentralized and digitally connected, cash will play a significantly smaller role in financial exchanges. Some governments have already acknowledged this reality: India, for instance, has decided to eliminate high-denomination bills in a multi-year effort to increase financial transparency and liquidity.

Assets that are not easily tradeable in the market are considered illiquid: because buyers are not readily available, these assets tend to trade below their actual value. As McKeon pointed out: "Financial economists have attempted to measure the illiquidity discount in a variety of ways. A common rule of thumb is 20 to 30%. This represents a huge amount of value and therein lies great promise."
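The 20 to 30% rule of thumb is easy to make concrete with a worked example (the $100,000 fair value here is an invented figure for illustration):

```python
# Worked example of the illiquidity discount McKeon cites: an asset
# with a fair value of $100,000, trading 20-30% below that value
# because buyers are scarce.

fair_value = 100_000
for discount in (0.20, 0.30):
    trade_price = fair_value * (1 - discount)
    print(f"{discount:.0%} discount -> trades near ${trade_price:,.0f}")
# 20% discount -> trades near $80,000
# 30% discount -> trades near $70,000
```

That $20,000 to $30,000 gap per $100,000 of asset value is the "huge amount of value" tokenization aims to recapture by making such assets more readily tradeable.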

Great promise indeed. If tokenization achieves its promise of recapturing liquidity, not only for currencies but for the value of all assets, time, and work in a more flexible, fair exchange, that's an exciting prospect. And given the vast amount of work that still remains, it's fair to surmise that tokenization is only in its early days.

Article by Jason English, VP Protocol Marketing, Sweetbridge