After I shared my plan to become Cognitively “Perfect”, my friend Adam Grant sent me Cognitive Repairs: How Organizational Practices Can Compensate for Individual Shortcomings.
The paper highlighted that even though we as individuals are riddled with biases, when we form social structures — tribes, companies, countries — these biases can get smoothed out or canceled.
The paper further argued that individuals are bad at all kinds of tasks important to companies: diagnosing why things went wrong, explaining why things went well, collecting information, and drawing conclusions. Basically, everything.
Applying the findings to one of my endeavors, Kernel, helped me make the argument concrete. When I started Kernel in 2016, my objective was to make the most revolutionary advance in the field of neuroscience around reading and writing neural code. I wanted to bring the brain “online”. The problem: many people had ideas, but no one actually knew what to do. Starting a company under these conditions is atypical. Most people start a business knowing what they’re going to build, not trying to figure out what to do.
Regardless, I decided to move forward because I believe our radical cognitive upleveling is the single highest value thing humanity can focus on. Everything is downstream from our brains.
As we began this process, the question was not “Which of the low-hanging fruits in neuroscience is the best to pursue?” We asked, “What’s even possible?” I couldn’t work this out in isolation — my biases and blind spots would have kept me from the best answers. But by building a social structure — a company — to tackle it, my biases and blind spots, and those of everyone else on my Kernel team, were balanced and smoothed out so we could together arrive at the best paths forward.
Studies show that when individuals are asked to come up with solutions to a problem that has ten categories of possible answers, they produce only three or four. You need a group of people to hit all ten. If there were answers to how to get into the brain in a revolutionary and novel way, they would be found somewhere among the knowledge and opinions of many people — each person, ideally, smoothing out or cancelling the biases, knowledge gaps, and cognitive shortcomings of the others.
It got me thinking about what happens when we increase group level complexity. For example, if we go from company to, say, country.
For most of human history, countries and institutions have been run by individuals — kings, queens, popes, presidents, shahs. Which means that all the individual biases come back in the extreme, and the scale of the consequences can be equally disastrous: war, genocide.
After experiencing this in the 20th century, we upleveled group function again, adding worldwide organizations such as the UN, treaties, and various other corrections, all to deal with the concentration of individual biases.
Generally speaking, it seems that each time society levels up in complexity, individual biases pop back out, and then some sort of mesh network (a social tribe, a company, the UN, etc.) comes in and attempts to correct for them. Whether or not those attempts have been effective is a different discussion.
Whether it be the climate, nukes, tech platforms, or the complexity associated with the interconnectedness of billions of people, we have new types and levels of complexity coupled with the return of individual biases. Individuals at the heads of companies now run the most powerful newspapers the world has ever seen. World leaders are playing poker over Twitter. This creates excessive risk for all of us. We need another, more advanced mesh of corrections.
The key insight here: human organization alone is no longer capable of managing the complexity and providing the necessary smoothing and correction.
This is where AI can shine. It has the power to offer us “strategically unprecedented moves,” detecting and correcting biases at our modern scale of complexity in ways we simply cannot ourselves.