On the Practice of Token Engineering, Part III: Something is Missing
It may be difficult to imagine, but there was a time when engineering was considered not just a useful field, but a downright sexy one. These were the days of canals that bridged oceans, and bridges that joined continents. There was even a utopian political movement in the early 20th century, Technocracy Inc., dedicated to moving all governing function into the hands of engineers and scientists.
While Technocracy Inc. failed politically, the concept of putting the organization of life into the hands of experts still has many adherents. There’s something restful about the notion of experts ‘running things.’ But however alluring, it’s telling to recall that one of Technocracy Inc.’s major proposals was eliminating all colors but gray, black, and white from consumer goods like clothing and automobiles. It was ‘inefficient’ and ‘unscientific’ to have color in daily life.
The messy, colorful realities of life, I think, require that even as we build world-changing systems, we maintain a human-centered, social frame of reference. Human-centeredness requires appreciating the vast individual and cultural variations among participants, and even the temporal variation within a single participant, whose behavior shifts from one day to the next. In order to be truly humanist, we have to abandon assumptions about utility and rationality, and instead simply embrace human behavior as a native signal about humanness.
Rather than design systems which attempt to optimize on behalf of humans, we instead focus on systems which enable and empower people to organize themselves with a minimum of cognitive overhead. This is a lofty goal and arguably one that is unattainable, yet striving humanistically helps us spot and avoid the unintended emergence of true technocracy.
So far in our series on the practice of token engineering, we’ve talked a great deal about things around engineering. We’ve talked about the future of blockchain-enabled tools and technologies, and how we believe they’re likely to become as essential to daily life as the semiconductor.
What we haven’t discussed is the actual process of engineering, the fundamental reason that ‘token engineering,’ properly understood, is better than something like ‘token design’ or ‘token coding.’ Perhaps unsurprisingly, it’s down to process.
Engineering as a practice breaks out into two mutually-reinforcing phases: design and development. Between them, they form a procedural loop that builds, tests, and revises some finished product, as illustrated in Figure 1.
We begin at the top left, gathering and developing requirements. As requirements firm up, we design and test subsystems, verifying that each subsystem does what it’s supposed to, and validating that what those subsystems do meets the requirements we’ve established.
The myriad projects now in development on distributed ledger technologies demonstrate that there is an abundance of exciting new ideas and applications, not to mention pure technical know-how, in the blockchain community. What concerns us here are the parts of the process around the collection of requirements and the modeling and testing of components and systems against those requirements.
In his bestselling book The 7 Habits of Highly Effective People, author Stephen Covey states that the second habit is to “Begin with the end in mind.” He may well have been writing about engineering. Covey’s point is to draw attention to the power of imagination, to see what might be in the future if a person keeps their goal in mind. For our purposes, it means that a blockchain project should know exactly what it wants to achieve. It’s great to want to revolutionize life as we know it, but “Change the World” is a bumper sticker, not a business plan.
The process of figuring out the “how” and the “what” of “Change the World” leads us to the heart of engineering. It requires a thorough and rigorous process of collecting and analyzing information, of interviewing stakeholders and partners, and comparing that information against technical capabilities. The object of all this requirement gathering is to determine the actual problem to be solved in the service of “Change the World” and, hopefully, identify avenues to a solution.
It’s important that we begin with collecting these requirements, however tedious it might seem. Beyond the ethical implications of moving the goalposts during the game, clear requirements keep projects on track. Requirements don’t have the luxury of being changed to suit technical capabilities. Technical capabilities must be advanced to suit requirements. But neither should some new capability be employed just because it’s new: it must also meet the requirements of the project. We have to verify that a subsystem works, and validate that it meets our requirements.
This verification and validation framework, rigorously testing every subsystem against both functional and ecosystem requirements, is what makes engineering the powerful toolkit that it is. Without that framework, projects are largely indistinguishable from a random-ish process of trial and error.
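To make the distinction concrete, here is a toy sketch (not from the article; the token-issuance function and its cap are hypothetical) showing verification as testing a subsystem against its spec, and validation as testing its behavior against an ecosystem requirement:

```python
# Hypothetical subsystem: mint tokens without exceeding a supply cap.
def issue_tokens(supply: int, amount: int, cap: int) -> int:
    """Return the new supply after issuing `amount`, clipped at `cap`."""
    return min(supply + amount, cap)

# Verification: does the subsystem do what its spec says it does?
assert issue_tokens(100, 50, 1000) == 150    # normal issuance
assert issue_tokens(990, 50, 1000) == 1000   # issuance clipped at the cap

# Validation: does the behavior meet the ecosystem requirement
# ("total supply never exceeds the cap") across a sequence of scenarios?
supply = 0
for amount in [10, 500, 700, 300]:
    supply = issue_tokens(supply, amount, cap=1000)
    assert supply <= 1000
```

Both checks pass here, but they answer different questions: the first says the code matches its spec, the second says the spec, exercised over time, still satisfies the requirement.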
If you look again at Figure 1, you’ll notice that separating the two arms of the ‘V’ is what we’re calling the ‘emergence chasm.’ It’s what makes distributed ledger technologies (and all complex systems) so exciting, and so challenging. With physical or mechanical systems, behaviors are generally known. Things like gears and torque, electricity and resistance operate according to known principles. It’s highly unlikely that the current in a circuit is going to come to a dead stop, absent some kind of outside interference.
Systems involving humans do not have determinism as a feature. We have not yet found a set of Newton’s laws that govern human behavior. (The great man himself is said to have said he could predict the movements of heavenly bodies, but not the madness of crowds.) Instead, human systems tend to be emergent. In much the same way as snowflakes make blizzards or raindrops make floods, humans acting en masse create unintended outcomes. And it is precisely this tendency that we are trying to encourage and shape with token engineering.
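A toy model makes the point vividly. The sketch below (an illustration of my own, in the spirit of Granovetter’s classic threshold model, not something from the article) gives 100 agents identical rules: each adopts a behavior once the fraction of adopters reaches its personal threshold. A nearly imperceptible change to one agent’s threshold flips the outcome from a full cascade to nothing at all:

```python
def cascade(thresholds, seeds):
    """Each agent adopts once the adopting fraction reaches its threshold."""
    adopted = set(seeds)
    changed = True
    while changed:
        changed = False
        for i, t in enumerate(thresholds):
            if i not in adopted and len(adopted) / len(thresholds) >= t:
                adopted.add(i)
                changed = True
    return len(adopted)

# 100 agents with thresholds 0.00, 0.01, ..., 0.99: one seed tips everyone.
thresholds = [i / 100 for i in range(100)]
print(cascade(thresholds, {0}))   # 100: a full cascade

# Nudge a single agent's threshold from 0.01 to 0.02: the cascade dies.
thresholds[1] = 0.02
print(cascade(thresholds, {0}))   # 1: nothing spreads
```

No individual rule changed in kind, yet the system-level outcome is qualitatively different. That sensitivity is the ‘emergence chasm’ in miniature.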
Within our engineering process, we have a hard division between the mechanistic and non-mechanistic. Think of a building. Whether it’s a skyscraper or an office park, the building has walls, a roof, HVAC, and electricity all set up in some useful configuration. But the physical plant does not dictate what goes on inside the building, although anyone who has suffered an “open plan” office knows how poor design can impede work. The companies and organizations within the building are free to build themselves into their vision of ‘the best’ without worrying about keeping out the elements or keeping the lights on.
This is how we envision token engineering: building out systems that, rather than try to optimize humans, try to provide the maximum degree of freedom while preventing dangerous conditions.
All of this promise, however, is built on a rigorous engineering framework that requires clear goals and requirements, and continuous validation and verification of subsystems into a cohesive whole. There is no lack of ideas in the distributed ledger technology community. And there is no lack of software development know-how and tools to aid in the construction of new systems.
What is missing is a toolkit that can enable and drive the simulation, validation, and verification of new blockchain systems. At least, it was missing. Over the last twelve months, the BlockScience team has been building out that toolkit. We call it cadCAD, partly because it forms a tidy acrostic for what it is: complex adaptive dynamics Computer-Aided Design.
BlockScience is a research and engineering firm specialized in the creation of digital twins of complex adaptive systems, enabling deep insight into their complex interdependencies, and ways to determine and avoid the conditions and thresholds for systemic failure. Follow us on Medium or Twitter to stay in touch. If your project requires professional system design support, drop us a line at firstname.lastname@example.org.