Tokenization is an effective way to distribute rewards, maintain accounts, and fractionalize ownership. These three core functions can be addressed by a general-purpose technology (GPT), since accounting and fractionalization should be service-agnostic, where a service is anything that adds tangible value. Value may seem hard to quantify, but in a tokenized world it is dynamic, measurable, enhanceable, and competitive: dynamic as the periodicity of its input data; measurable as an understandable metric; enhanceable as its underlying algorithmic process learns; competitive because it operates on a level playing field. Any service that adds tangible value aspires to be distributed globally and to capture its fair share of the value it generates.
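As a rough illustration of the three core functions, here is a minimal, service-agnostic ledger sketch in Python. All names (`TokenLedger`, `credit`, `distribute_reward`, `fractionalize`) are hypothetical and chosen for this example; a real tokenization layer would live on-chain and handle far more than this.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a service-agnostic token ledger exposing the
# three core functions named above: account maintenance, reward
# distribution, and fractionalization. Names are illustrative only.
@dataclass
class TokenLedger:
    balances: dict = field(default_factory=dict)  # account -> token units

    def credit(self, account: str, amount: float) -> None:
        # Account maintenance: track each holder's balance.
        self.balances[account] = self.balances.get(account, 0.0) + amount

    def distribute_reward(self, reward: float) -> None:
        # Reward distribution: split a reward pro rata to current holdings,
        # regardless of which service generated the value.
        total = sum(self.balances.values())
        if total == 0:
            return
        for account in self.balances:
            self.balances[account] += reward * self.balances[account] / total

    def fractionalize(self, account: str, parts: int) -> list:
        # Fractionalization: split one holding into equal sub-holdings.
        amount = self.balances.pop(account, 0.0)
        shares = [amount / parts] * parts
        for i, share in enumerate(shares):
            self.balances[f"{account}/{i}"] = share
        return shares

ledger = TokenLedger()
ledger.credit("alice", 60.0)
ledger.credit("bob", 40.0)
ledger.distribute_reward(10.0)  # pro rata: alice gets 6.0, bob gets 4.0
print(ledger.balances)
```

The point of the sketch is that none of the three operations knows anything about the underlying service; they act only on balances, which is what makes the layer reusable across any value-adding service.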
The beauty of AI is that it can enhance itself. Even though we are currently in a period of domain-specific AI growth, AI will generalize and evolve into cross-domain, interdisciplinary functionality; it will power ultra-smart agents and become what we loosely refer to as Web 4.0. Building a generalized […]