Setting the stage for token engineering. We reference and explain the Foundations of Cryptoeconomics: how tokens represent digital value that can be accounted for and exchanged through distributed ledgers; how token networks, constructs of smart contracts, enable sharing of value globally without borders - gradually undermining the theory of the firm and creating a more equitable vision of the future beyond the gig economy; and how token networks naturally integrate with cyber-physical systems and, in some sense, monetize them.
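At its most basic, a token is an entry in a shared ledger that the network itself keeps consistent. The following minimal Python sketch illustrates only that accounting idea; the `TokenLedger` class and its names are hypothetical, and real ledgers add signatures, consensus, and smart-contract logic on top.

```python
# Minimal illustrative sketch of token accounting on a shared ledger.
# All names here are hypothetical; real ledgers add signatures, consensus, etc.

class TokenLedger:
    def __init__(self, initial_balances):
        self.balances = dict(initial_balances)  # address -> token amount

    def transfer(self, sender, receiver, amount):
        # The ledger, not a central intermediary, enforces the accounting rule.
        if amount <= 0 or self.balances.get(sender, 0) < amount:
            raise ValueError("invalid or unfunded transfer")
        self.balances[sender] -= amount
        self.balances[receiver] = self.balances.get(receiver, 0) + amount

ledger = TokenLedger({"alice": 100})
ledger.transfer("alice", "bob", 40)
assert ledger.balances == {"alice": 60, "bob": 40}
```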
Taking the reader to the lookout point to see the interconnectivity of the buzzwaves of the past two decades: from Coase's Penguin and The Wealth of Networks to Uberization to [digital organizations](https://www.placeholder.vc/blog/2020/5/7/aragon-daos) - noting how through each wave the ebb and flow of decentralization and centralization is visible, and how with each wave the technology and social layers become more intertwined, more accessible to the mainstream, and more representative of digital native generations.
In 2020, however, stakeholders were still trying to squeeze these new types of organizations and value creation into templates that we have outgrown: the Business Model Generation, humongous spreadsheets, or human-centered design that somehow ends up centered around a giant paper wall of hundreds of scribbled post-its with zero involvement of the people the designs will affect. Even our engineering best practices fall short of enabling the co-creation of knowledge that is needed to create sustainable, prosocial token economies. Too many still cling to the comfy boundaries of national constitutions and institutions, and to the corporate boundaries between business, research, engineering, finance, and legal - boundaries which do not exist in the token economies of Information Societies. Bootstrapping regenerative, i.e. profitable and sustainable, networked organizations that are open source and liquid goes against all existing frames of reference.
This introductory chapter gives roots in the foundations of cryptoeconomics and wings to fly over the grounds we want to cover in the following chapters: Transdisciplinary Art of Doing Science & Engineering, Cryptoeconomic Patterns, Token Model Generation, Privacy-preserving, Participatory Architectures, SecDevOps in Token Networks, Decentralizing Organizations, and Legal Engineering. Finally, we provide a chapter dedicated to Online Resources, which will be kept up-to-date here, as well as an overview of the Token Model of this Book in the Appendix.
In the early stages, 2017-2019, the token engineering focus was on the intersection of Economics (Game Theory, Mechanism Design) and Computer Science (Cryptography, Peer-to-Peer Protocols and Applications), with some insights coming in from Complexity Economics (e.g. Ergodicity Economics), Computational Economics, Computational Social Sciences, etc. 2018-2019 additionally saw the field open up entirely to practices typical in engineering, with insights flowing in from Industrial and Systems Engineering, AI, Optimization and Control Theory, as well as Operations Research & Management Science. In fact, it is more common to see people with strong data science and systems engineering backgrounds enjoying the intellectual challenges that this emerging transdisciplinary, highly automated, data-driven field presents. Quite new, equally exciting, and even more challenging is the connection with less computerized disciplines such as Philosophy, Ethics & Law, Psychology & Decision Theory, Political Sciences, and Governance.
In this chapter we outline which aspects and subtopics of each discipline apply to the field of token engineering, and how. We start a list of these cross-relations, which will be maintained and grown in the Online Resources chapter. We also outline existing and potential interfaces between the disciplines. Token engineering - the practical challenge of designing, testing, deploying, and bootstrapping token economies - creates new insights at the intersection of the diverse disciplines of cryptoeconomics, and frameworks for applying their tools and models. Many of the theoretical combinations in this chapter come to life in the chapters Token Model Generation, Privacy-preserving, Participatory Architectures, SecDevOps in Token Networks, Decentralizing Organizations, and Legal Engineering. Especially interesting in the curation of the book thus far was how venturing into the Humanities helped us improve Token Model Generation with a "Moral Compass" for the "Ethical Engineer" and resulted in the new Epilogue, An Interface to Moral Philosophy for Token Engineers.
While we keep the content of these chapters general, and "evergreen" for any print versions or editions, the case studies and examples sections will be kept up-to-date and growing in this living repository.
It is too early to discern many patterns or primitives. But we certainly see how a few patterns are emerging and being used in many different applications and environments, with different actors. This chapter will give a general overview and short history of stablecoins and bonding curves, which gave rise to algorithmic, hence decentralized, exchange between tokens, followed by decentralized finance and composable decentralized financial transactions built mostly on the fungible, and increasingly the non-fungible, token standards (as of Q2 2021).
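To make the bonding-curve mechanism concrete, here is a minimal sketch assuming a power-law curve, price(s) = m * s^n, so the reserve required to move the supply is the integral of the price; the parameters m and n and the function names are illustrative, not taken from any live protocol.

```python
# Illustrative bonding curve: price(s) = m * s**n, so the cost of minting
# from supply s0 up to s1 is the integral m/(n+1) * (s1**(n+1) - s0**(n+1)).
# m and n are hypothetical parameters, not taken from any specific protocol.

def mint_cost(s0: float, s1: float, m: float = 1.0, n: float = 2.0) -> float:
    """Reserve currency required to mint tokens, moving supply from s0 to s1."""
    return m / (n + 1) * (s1 ** (n + 1) - s0 ** (n + 1))

def burn_refund(s0: float, s1: float, m: float = 1.0, n: float = 2.0) -> float:
    """Reserve returned when burning tokens, moving supply from s0 down to s1."""
    return mint_cost(s1, s0, m, n)

# Because the curve quotes a price at every supply level, the contract itself
# acts as an automated market maker: anyone can buy or sell at any time, which
# is what makes the exchange algorithmic and hence decentralized.
print(mint_cost(0, 10))    # cost of minting the first 10 tokens
print(burn_refund(10, 5))  # refund for burning the supply back down to 5
```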
The main accomplishment of this chapter is abstracting the important information from the analysis of the two waves of 2017 (Initial Coin Offerings, Stablecoins, Decentralized Exchange Protocols) and 2020 (mostly Decentralized Finance, Governance Tokens, Non-/Fungibility, Social or Creator Tokens): the value of a token - not its price, quantity, velocity, etc. - and its decomposition into Value Creation, Capture, and Distribution patterns or mechanisms. Instead of a fixed taxonomy, or an over-dependence on one, this abstraction enables us to think of the token as the "produce" of the token network. It starts with appreciating the value origination potential of the underlying network as well as the utility perspectives of the diverse participants (in the community: builders, speculators, users; as well as the wider ecosystem). At the same time, it can be used alongside quantitative valuation models.
We will start a list and link to various examples and case studies, which will be kept up-to-date and growing in this living repository. There is always a next wave.
Business Model Generation, Lean Startup Methodology, Design Thinking - all have one common denominator: break your idea apart into its key assumptions, and validate those assumptions with stakeholders with as little time and money as possible in order to be agile. This design principle works wonders when it meets the sweet spot of continuous delivery in software development. Token Model Generation fully buys into the principles of hypothesis-driven development and the agile manifesto. We refer to it as experiment-driven design & development. We are fully aware of "dark agile" and "dark scrum" and focus on avoiding falling back into command-and-control execution of linear plans.
Yet we acknowledge that we are building critical, public, decentralized infrastructure at global scale - once the token smart contracts are deployed on mainnet. Thus, we add to the mix above Systems Thinking facilitated by Computer-aided Design. This addition helps us make visible the counter-intuitive dynamics of token networks in computer simulations long before any development or changes need to be implemented. The Computer-aided Design tools we address in this chapter also enable data-driven optimization of the designed system dynamics and emergent (multi-)agent dynamics.
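As a minimal sketch of what such a simulation looks like - tools like cadCAD organize simulations around a similar policy/state-update loop, while the state variables, issuance policy, and parameters below are invented purely for illustration - consider:

```python
# Minimal sketch of a discrete-time token network simulation, assuming a toy
# model: each timestep, a policy proposes token issuance, and a state-update
# function applies it. All variables (supply, treasury, the 1% issuance rate,
# the 10% issuance cost) are hypothetical.

state = {"supply": 1_000.0, "treasury": 500.0}

def issuance_policy(state):
    # Policy: propose minting 1% of the current supply this timestep.
    return {"minted": 0.01 * state["supply"]}

def update_state(state, signal):
    # State update: apply the policy signal to the system state.
    state["supply"] += signal["minted"]
    state["treasury"] -= signal["minted"] * 0.1  # toy cost of issuance
    return state

history = []
for t in range(50):  # 50 timesteps
    signal = issuance_policy(state)
    state = update_state(state, signal)
    history.append(dict(state, timestep=t))

# Inspecting `history` reveals dynamics (e.g. treasury depletion under
# compounding issuance) long before any smart contract is written,
# let alone deployed on mainnet.
print(history[-1])
```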
Furthermore, there are networks that require more human deliberation and coordination - sometimes even educational onboarding - breaking with the original cypherpunk notion of "trustless" transactions. In our experience these networks are the most impactful, but the hardest to get off the ground. Hence we designed a course as part of this chapter, which synthesizes the most useful aspects of design canvases to help formulate key assumptions: about the purpose (or objective function) of the token network, and about its participants and their incentives for participation - intrinsic, extrinsic, and systemic (internalized extrinsic) incentives that create hedonic and utilitarian value. The participatory environment and policies we then create through ecosystem design and the token model can sustain, reinforce, and at best amplify these values - or destroy them in the worst case.
In this chapter we wire it all together and strive to give you the most practical insights into the design, development, and deployment of token models. We round it off with a still-experimental experiment-driven development methodology that should help participants coordinate in their context.