Complexity, Human Alignment and the Evolution of AI

Carlos E. Perez
Published in Intuition Machine
Aug 11, 2021



The evolution of software development can be understood as a long series of attempts to manage complexity.

The development of more abstract and expressive computer languages introduced modularity, late-binding, functional application, and other concepts that made it possible for programmers to better structure and understand the programs they were writing.
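
As a rough illustration (mine, not the article's), a few lines of Python show how functional application and late binding let a programmer layer behavior without spelling out every detail up front. The names here are purely illustrative:

```python
# A minimal sketch of how language-level abstraction hides complexity:
# a higher-order function wraps any operation with retry logic, and the
# wrapped operation is bound late, at decoration time.

def retry(times):
    """Return a decorator that retries an operation on ConnectionError."""
    def wrap(operation):
        def run(*args, **kwargs):
            for attempt in range(times):
                try:
                    return operation(*args, **kwargs)
                except ConnectionError:
                    if attempt == times - 1:
                        raise
        return run
    return wrap

attempts = {"count": 0}

@retry(times=3)
def fetch_report():
    """A stand-in for some unreliable operation."""
    attempts["count"] += 1
    if attempts["count"] < 3:
        raise ConnectionError("transient failure")
    return "report contents"

print(fetch_report())  # succeeds on the third attempt
```

The retry policy and the operation it wraps can each be understood, tested, and changed in isolation, which is the whole point of these abstractions.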

In addition, development processes like lean and agile made it possible to manage the complexity within a group of developers.

Software development created methods that made it possible for a single mind, and for groups of minds, to build ever more complex abstractions. These methods were further enhanced by software tooling that supported them.

Less than 1% of the world’s population is involved in software development. This is unfortunate, because the problems of managing complexity are the same problems we face in human governance, yet most of us remain ignorant of the ideas developed to address them.

This lack of understanding of software is also pervasive in other scientific fields. Most science is still performed using concepts that existed before the invention of the computer. Many are unaware that our immersion in computers has generated entirely new universal ideas.

Humanity faces many difficult, complex governance problems (e.g., climate change, pandemics), yet most people working in these areas are unaware of the concepts and tools that software developers have invented to tackle complexity.

Human civilization depends critically on our ability to express complex ideas. Unfortunately, too many of us have never learned these newer vocabularies, and when we are exposed to them, we misinterpret what the experts are saying.

Even worse, too many people are dismissive of experts they do not understand. Kafka wrote that we are all heroes in our own minds, but that should not imply that we are also the exclusive source of all truth.

Christopher Alexander noticed the complexity of expressing tacit architectural decisions. He realized that we can improve our designs by systematically formulating our tacit knowledge into new expressions. He called these formulations pattern languages.

amazon.com/Pattern-Langua…

Software developers took Alexander’s ideas from architecture and created their own pattern languages to describe the complexities of software development. As a result, there are layers of vocabulary above source code and development processes.
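
As a small, hedged example of that vocabulary (not from the original article), a single pattern name like Observer stands in for a whole recurring solution. A minimal Python sketch:

```python
# The Observer pattern, one entry in the software pattern-language vocabulary:
# a "subject" notifies "observers" of events without knowing anything about them.

class Subject:
    def __init__(self):
        self._observers = []

    def subscribe(self, callback):
        self._observers.append(callback)

    def publish(self, event):
        for callback in self._observers:
            callback(event)

# Saying "use an Observer here" communicates all of the above in one word.
build_events = Subject()
build_events.subscribe(lambda event: print(f"logger saw: {event}"))
build_events.subscribe(lambda event: print(f"dashboard saw: {event}"))
build_events.publish("tests passed")
```

The value is not the dozen lines of code but the shared name: two developers who know the pattern can discuss a design at a level above the source code itself.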

There is a unique language and way of doing things in the world’s most advanced software development organizations. This kind of efficient coordination technology is only slowly creeping into other fields of human endeavor.

We do not have enough people familiar with software development who can cross over into other fields of human endeavor. Furthermore, established fields push back hard against outsiders who pursue a paradigm shift.

Society urgently needs to address the complexities of our world, yet too many fields carry too much inertia to grasp how complexity can be managed effectively. Human governance is stuck with antiquated and ineffective vocabularies for its hardest problems.

Wittgenstein wrote, “The limits of my language mean the limits of my world.” We cannot extricate ourselves from the mess we are in with the antiquated language that we employ every day.

How can we learn new vocabularies to manage complexity? Is there a pattern language for complex systems? Is there a pattern language that places alignment with human needs at its center?

One of the interesting developments in software is the agile methodology. If you read the original manifesto, it is not hard to notice its alignment with human needs rather than the needs of the system being built. agilemanifesto.org

If we dig deeper, managing complexity is inseparable from aligning with human needs. So when we begin deploying more advanced artificial intelligence, we cannot avoid its entanglement with the needs of human interaction.

How can we build tools to take our language and hence our thinking to the next level? How does the latest development in empirical AI (i.e. Deep Learning) contribute to this agenda?

If we take the definition of intelligence as the ability to fill in the blanks in instructions, then AI is just a smarter interpreter. Useful AI is the kind that can interpret the intentions of humans to generate satisfactory solutions.
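
As a toy sketch of this framing (mine, not the author's), an “interpreter” can be thought of as filling in whatever the instruction leaves unspecified. The defaults table here is a stand-in for what a real AI system would infer from context:

```python
# A toy illustration of "intelligence as filling in the blanks":
# the caller states only an intent, and the interpreter supplies sensible
# defaults for everything left unsaid.

DEFAULTS = {
    "format": "pdf",
    "audience": "general",
    "length": "one page",
}

def interpret(intent, **explicit):
    """Merge what the human stated with guesses for everything left unsaid."""
    plan = {**DEFAULTS, **explicit}
    details = ", ".join(f"{key}={value}" for key, value in plan.items())
    return f"{intent} ({details})"

# The human only says what they care about; the blanks are filled in.
print(interpret("summarize the quarterly results", audience="executives"))
```

The smarter the interpreter, the less the human has to specify, which is exactly what makes such a system useful, and also what makes its alignment with human intent unavoidable.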

More advanced AI will continue along the same historical path of technologies that manage ever greater complexity. It is unfortunate that most people’s understanding of intelligence is too shallow to see where AI is leading.
