LeoGlossary: Tokenization

Tokenization is the process of taking data and assigning it a substitute value known as a token. The security of the scheme depends on the algorithm and process used.

Often tokenization means producing non-sensitive data (the token) to represent sensitive data. The process is not usually reversible, at least not without additional data. Hence, the token offers a level of security for the sensitive data it stands in for.
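As a minimal sketch of this idea (the vault class and names here are illustrative, not a production scheme), a token vault can replace a sensitive value with a random token; reversing the mapping requires access to the vault itself:

```python
import secrets

class TokenVault:
    """Illustrative in-memory vault mapping tokens back to sensitive values."""
    def __init__(self):
        self._store = {}

    def tokenize(self, sensitive):
        # The token is random, so it reveals nothing about the original data.
        token = secrets.token_hex(16)
        self._store[token] = sensitive
        return token

    def detokenize(self, token):
        # Reversal is only possible with access to the vault's mapping.
        return self._store[token]

vault = TokenVault()
token = vault.tokenize("4111-1111-1111-1111")
print(token)                    # non-sensitive stand-in value
print(vault.detokenize(token))  # original recovered only via the vault
```

Anyone holding only the token learns nothing about the card number; the sensitive data stays inside the vault.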

With cryptocurrency, markets develop which assign tradeable value to the tokens or coins. Since it all exists in the virtual world, there are no physical representations of what is created, although tokens can be generated to represent physical objects.

The tokens or coins tied to a blockchain have all of their transactions recorded on a distributed ledger.
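In a highly simplified sketch (the structure below is illustrative, not any particular blockchain's format), a ledger can be pictured as an append-only list where each recorded transaction links to a hash of the previous entry, so later tampering breaks the chain:

```python
import hashlib
import json

def add_transaction(ledger, sender, receiver, amount):
    """Append a transaction that links to the hash of the previous entry."""
    prev_hash = ledger[-1]["hash"] if ledger else "0" * 64
    entry = {"sender": sender, "receiver": receiver,
             "amount": amount, "prev_hash": prev_hash}
    # Hash the entry's contents; the next entry will reference this hash.
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()).hexdigest()
    ledger.append(entry)
    return ledger

ledger = []
add_transaction(ledger, "alice", "bob", 5)
add_transaction(ledger, "bob", "carol", 2)
print(ledger[1]["prev_hash"] == ledger[0]["hash"])  # entries are chained
```

A real distributed ledger adds consensus among many nodes, but the chained-hash record is the core of how transactions stay auditable.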

Many feel this is an essential component of Web 3.0.


Posted Using LeoFinance Beta