What if we could do for relationships what the bit did for information?
The word relationship is a funny thing. If you type it into Wikipedia you get a disambiguation page. Apparently, in order to say anything meaningful about relationships we need to talk about a particular kind of relationship. There are statistical relationships and relational databases. There are intimate relationships and interpersonal relationships, and these things are considered distinct enough to each warrant their own entry. We use the word “relationship” to refer to our connections with lovers and with friends, to the connections that form between companies and consumers, between animals and plants, and between proteins in a cell, but there is very little that’s consistent across those definitions. There may be poetic similarities between these varying types of relationships, between the way that atoms in a molecule relate and the way that human lovers relate, but the practical analytical consistency ends there. People who professionally analyze relationships between companies use a set of analytical tools that looks almost nothing like the one used by people who professionally analyze relationships between students in a classroom.
Before 1948, people viewed information in the same fragmented way that we view relationships today. There was a sense that information was fundamentally shaped by the form it was embedded in. There was information in books and in music and in speech, but if you wanted to work with any of those kinds of information you needed fundamentally different tools. Librarians necessarily had a different set of skills from musicians because the kinds of information that they were dealing with differed on a fundamental level, each requiring its own special blend of art and science.
Enter Claude Shannon. An eccentric 32-year-old cryptographer working at Bell Labs, Shannon had spent much of his life fascinated with the concept of communication. It seemed to Shannon that there was something consistent that happened whenever information flowed between two points, regardless of what that information was. To demonstrate that similarity he invented a mathematical tool, a sort of extreme form of the game “20 questions.”
Let’s say I’m thinking of a word, and you have to guess it by asking me a series of “true or false” questions. Your first question could be “Is the first letter ‘a’?” or “Is the first letter in the first half of the alphabet?” Once you’ve guessed the first letter, you move on to the second. If you’re smart, you can figure out how to minimize the number of questions that you need to ask, to compress the information down to some minimum number of “true/false” statements. In Shannon’s view, the receiver was facing a degree of uncertainty, and any message could be compressed down to the minimum number of true/false statements required to resolve that uncertainty.
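This game can be played mechanically. Here is a minimal sketch (the halve-the-candidates strategy is one illustrative way to ask the questions, not the only one):

```python
ALPHABET = "abcdefghijklmnopqrstuvwxyz"

def guess_letter(secret: str) -> int:
    """Play the game against a known secret letter: repeatedly ask
    'is the letter in the first half of the remaining candidates?'
    and return how many true/false questions it took."""
    candidates = list(ALPHABET)
    questions = 0
    while len(candidates) > 1:
        first_half = candidates[:len(candidates) // 2]
        questions += 1  # one true/false question asked
        candidates = first_half if secret in first_half else candidates[len(candidates) // 2:]
    return questions

# Halving the candidates each time pins down any of 26 letters
# in at most ceil(log2(26)) = 5 questions.
print(max(guess_letter(c) for c in ALPHABET))  # 5
```

Five questions per letter is exactly the compression Shannon had in mind: the smarter your questions, the fewer true/false answers you need.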
The nifty thing about this game is that you can use it to communicate more than words: you could communicate music (by asking about notes instead of letters), or a picture (by breaking the picture into a bunch of tiny colored dots). Mathematically, any piece of information can be broken down into a series of true/false statements or (if you are an electrical engineer like Shannon) a series of 1s and 0s. Shannon called these atomic units of information “bits.”
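To make that concrete, here is a toy encoder (the alphabets and examples are invented for illustration): the same fixed-length scheme turns letters, musical notes, or pixel colors into 1s and 0s, because all it needs is a finite alphabet.

```python
import math

def to_bits(symbols, alphabet):
    """Encode symbols from any finite alphabet as a string of 1s and 0s,
    using ceil(log2(len(alphabet))) bits per symbol."""
    width = math.ceil(math.log2(len(alphabet)))
    return "".join(format(alphabet.index(s), f"0{width}b") for s in symbols)

# Words, notes, and pixel colors all reduce to the same currency:
print(to_bits("bad", "abcdefghijklmnopqrstuvwxyz"))                     # 5 bits per letter
print(to_bits(["C", "E", "G"], ["A", "B", "C", "D", "E", "F", "G"]))    # 3 bits per note
print(to_bits([(255, 0, 0)], [(255, 0, 0), (0, 255, 0), (0, 0, 255)]))  # 2 bits per pixel
```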
For librarians and musicians, the bit changed everything. We no longer needed a special skill set and special tools for every kind of information we wanted to convey; we could engineer machines capable of storing, communicating, and presenting any kind of information in the universe. Information was no longer a poetic concept; it could be discretely quantified in bits, bytes (the sets of 8 bits required to represent letters), and megabytes. Information theory became a unified field of scientific inquiry, one that quickly spread from the emerging field of computer science to biology, physics, and beyond. It went on to fundamentally reshape not only how we send and receive messages, but our economy, our political reality, and our sense of self.
Which brings us to the core question of this post: what if we could do that again, but for the concept of relationships?
There is currently no widely accepted way to measure a relationship. Intuitively it feels like there is “more” relationship between a married couple than between two acquaintances, or more relationship between a bee and a flower than between a bee and a badger, but no mathematical system exists to turn that vague sense into a discrete measurement.
I have a proposal for one such system which I will share shortly, but am personally much more intrigued by this unanswered question than by my early attempts to explore it. If we could treat relationships not as objects with a poetic similarity but as systems with fundamentally consistent and understandable properties, it might transform how we think about everything from protecting ecosystems (made up of relationships among organisms) to growing businesses (made up of relationships between employees, customers, etc.) to organizing classrooms (made up of relationships between teachers, students, parents, etc.). To explore how this might be possible, let’s return to Claude Shannon.
At its core, Shannon’s innovation is a system of mathematical analysis, a sort of box with a crank on the side that could take in anything in the universe and spit out a series of bits. This innovation hinges on a fundamental mathematical consistency that exists between all kinds of information. Our task is to find a similar mathematical consistency that exists between all relationships. To find it, we will need to build on Shannon’s work by asking not just what a message is, but why that message matters.
Imagine you’ve just moved to a new city. You’re standing in the bus station with some saved up cash but no job, no apartment, no friends, no relationships of any kind. I walk up to you and I hand you a map of the city and three crayons: one red, one yellow, and one blue. I say, “Think about where you’re likely to be next week. Color the locations where you’re likely to spend time in red, the locations which you might or might not spend time in yellow, and the locations where you’re unlikely to spend time in blue.”
At first the entire city is yellow. Without any relationships you are in a state of extreme uncertainty, what a physicist might call a state of high entropy. You wander the streets aimlessly absorbing information, until something catches your eye. A help wanted sign at a cafe. You walk in and talk your way into a job. One of your new coworkers mentions that she knows of a place you can stay while you get on your feet, and suddenly you have a roof over your head. Your yellow map suddenly has two red dots on it.
You settle in, you make friends, you start dating. Your map begins to resolve around a few red circles that represent the relationships that define your life, and many of the remaining yellow areas begin to fade into blue. Some yellow still exists, and the red shifts from time to time, but your map reaches a sort of uneven equilibrium. When a relationship in your life shifts, the map tends to shift with it. Small changes in relationships will create a small shift, just a little patch of yellow where one wasn’t before, while big shifts will show up in red.
One interesting way to measure a relationship would be to measure how much it shifts that map over time. In mathematical terms, we could measure the change in the probability distribution of your location, or the change in the entropy of that distribution. We’d get an interesting picture of your life, but location doesn’t tell the whole story. The map of the city isn’t the only map that matters: there’s also the map of how you feel and the map of what you do with your time and the map of how you spend your money and so on. Imagine we took all of those maps, every possible dimension of how you might exist as a collection of subatomic particles, and we called it your “state.” Now imagine we somehow measured the change in that.
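The entropy of that location map is straightforward to compute. A toy sketch, with the districts and probabilities invented for illustration:

```python
import math

def entropy(dist):
    """Shannon entropy, in bits, of a {location: probability} map."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Fresh off the bus: equally likely to end up in any of 16 districts.
before = {f"district_{i}": 1 / 16 for i in range(16)}

# After the job and the apartment, probability concentrates on two red dots.
after = {"cafe": 0.45, "apartment": 0.45, "elsewhere": 0.10}

print(entropy(before))  # 4.0 bits: the all-yellow map
print(entropy(after))   # ~1.37 bits: most uncertainty resolved
```

The drop from 4.0 bits to roughly 1.37 bits is the map resolving from yellow toward red.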
What we’d see is a complex waveform that tells a story not unlike the story of that first map I handed you at the bus station. You walk out of the bus station, and information starts flowing at you from all directions, but most of it has no impact. Then you see that “Help Wanted” sign, and your probability distribution shifts. This information has value, it reduces your entropy, and so you explore it further. More information is exchanged, more uncertainty is resolved, and you wind up in a state that’s a little more stable than the one you were in before.
Every time you receive information from your workplace, whether it’s a text message, a conversation, or a paycheck, that information affects your state. Mathematically, it increases or reduces your entropy by a quantifiable amount. Low-value information (small talk with a coworker) will have little impact, while high-value information (someone saying that they love you) will have a significant impact. It’s important to recognize that the impact of a message (how much it increases or decreases your entropy) may have nothing to do with the complexity of a message (how many bits it takes to communicate). If I wanted to measure the value of a relationship, all I would have to do is look at all of the information conveyed in that relationship and sum up its impact:

R = Σ_{i=1}^{n} S(m_i)
Where R is the value of a relationship, n is the number of messages exchanged in that relationship, and S(m_i) is the change in entropy from a particular message exchanged in that relationship. If we put the sticky problem of measurement to the side for the moment, this equation gives us some nice versatility. We could use it to measure the relationship between you and your new job, but it could just as easily quantify any system consisting of things with probability distributions of states that modify those states by exchanging information. Atoms fall into that category when they exchange electrons and change their position, as do cells in an organism when they exchange hormones and change their behavior, or plants in a forest that use color and scent to attract pollinators, and so on. Individually breaking these relationships down into a common mathematical form may not initially be that interesting, just as breaking literature into bits isn’t necessarily a meaningful way to understand it, but having a common mathematical language for relationships may allow us to see patterns that would otherwise be difficult to detect. It is possible that some of what we scientifically observe about the world isn’t a property of atoms or human psychology or animals, but a mathematical property of relationships themselves.
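As a sketch of how the sum R might be computed (every number here is invented for illustration): model your state as a probability distribution over eight locations, model each message simply as the new distribution it leaves you in, and accumulate the absolute entropy change per message.

```python
import math

def entropy(dist):
    """Shannon entropy, in bits, of a {state: probability} distribution."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def relationship_value(initial_state, states_after_messages):
    """Sum the entropy change produced by each message. Absolute values
    make big shifts in either direction count toward the total."""
    value, state = 0.0, initial_state
    for new_state in states_after_messages:
        value += abs(entropy(new_state) - entropy(state))
        state = new_state
    return value

uniform = {loc: 1 / 8 for loc in range(8)}                      # 3.0 bits: no relationships yet
after_sign = {0: 0.5, **{loc: 0.5 / 7 for loc in range(1, 8)}}  # the help-wanted sign
after_job = {0: 0.9, **{loc: 0.1 / 7 for loc in range(1, 8)}}   # the job offer

print(relationship_value(uniform, [after_sign, after_job]))
```

Whether to sum signed or absolute entropy changes is itself an open modeling choice; this sketch takes absolute values so that a message that opens up new possibilities counts as impact too.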
To explore whether these properties exist, we can construct mathematical models in which objects with a range of states exchange information and look for patterns. If we can prove with mathematical certainty that, say, systems with property X are always better at forming relationships than systems with property Y, then we can observe any place where properties X and Y hold in nature and see if the pattern holds. We could begin to understand what causes relationships to develop, and we could begin to get smarter about building environments where they thrive. Right now, when someone raises their hand at a PTA meeting and says that they want to “build community,” they engage in a haphazard process of intuitive guesswork. Imagine instead that you could build community by relying on a series of reliable universal laws, verified by experimentation across a range of scientific fields. As with our ability to engineer information, the ability to engineer relationships might reach out and touch our culture, economy, and politics in powerful and unexpected ways.
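One toy model of that kind (every rule here is an invented assumption, not an established result): agents hold probability distributions over five states, and each “message” pulls the receiver’s distribution a step toward the sender’s. Under these assumptions, repeated exchange shrinks the average distance between agents, a crude stand-in for relationships forming.

```python
import random

def normalize(dist):
    total = sum(dist)
    return [p / total for p in dist]

def exchange(sender, receiver, strength=0.3):
    """One message: the receiver's distribution shifts toward the sender's."""
    return normalize([(1 - strength) * r + strength * s
                      for s, r in zip(sender, receiver)])

def spread(agents):
    """Average L1 distance between all pairs of agents' distributions."""
    pairs = [(i, j) for i in range(len(agents)) for j in range(i + 1, len(agents))]
    return sum(sum(abs(a - b) for a, b in zip(agents[i], agents[j]))
               for i, j in pairs) / len(pairs)

random.seed(0)
agents = [normalize([random.random() for _ in range(5)]) for _ in range(10)]

spread_before = spread(agents)
for _ in range(200):
    i, j = random.sample(range(10), 2)  # a random pair exchanges a message
    agents[j] = exchange(agents[i], agents[j])
spread_after = spread(agents)

print(spread_before, ">", spread_after)  # exchange pulls the population together
```

The interesting questions start when you vary the rules, e.g. whether one-directional “transactional” exchange behaves differently from mutual exchange, and check which outcomes are forced mathematically rather than chosen by the modeler.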
I have barely begun the process of constructing and exploring the mathematical models described above, and unfortunately do not yet have conclusive results. I’ll share a few observations and hunches that may be relevant to those who also find these questions intriguing:
- It appears as though the mathematical system of “stuff with probability distributions of state that exchange information” has never really been formally explored. I’ve gotten advice from experts in fields like graph theory and theoretical physics, and they all seem to think this sort of work is novel and interesting.
- There are a few promising paths to the sort of properties described above, as well as a few obvious and trivial cases. For example, conversations where one party is focused on a transactional goal (such as selling something) may be intrinsically less “relational” than conversations where both parties are open to exploring multiple paths of conversation. With some more tools for making formal statements about relational networks it may be possible to start defining their properties and observing where those properties hold in nature.
- It looks like it will also be possible to formally explore questions about relationships and prediction. Some components of relational systems are like the weather, too chaotic to predict with certainty, while others seem to follow predictable rules. It may be mathematically possible to start defining what about relationships can be predicted and what can’t.
- Even if we can’t measure the “state” of a flesh and blood human, we may be able to measure the intrinsic uncertainty in the things that we can measure. Measuring who someone text messages may tell us more about their relationships than measuring who they follow on Twitter, and we may be able to mathematically represent exactly how much more.
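That last hunch can be sketched directly (the observation counts below are invented for one hypothetical person): compare the entropy of two observable channels, and the more concentrated one plausibly says more about close relationships.

```python
import math
from collections import Counter

def entropy_bits(counts):
    """Shannon entropy of an empirical contact distribution."""
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Invented counts: whom one person texts vs. whom they follow.
texts = Counter({"partner": 120, "mom": 40, "best_friend": 30, "coworker": 10})
follows = Counter({f"account_{i}": 1 for i in range(200)})

print(entropy_bits(texts))    # ~1.53 bits: texting concentrates on close ties
print(entropy_bits(follows))  # ~7.64 bits: follows are spread thin
```

On these made-up numbers, the text-message channel carries far less uncertainty per observation, which is one way to make “texts tell us more than follows” mathematically precise.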
I’m still early in this work and would love advice from anyone with sufficient mathematical training, but wanted to share the deeper questions that it raises. If you’re interested in reading a bit more about the math (and seeing some cool visualizations), check out http://edgeless.network. It’s possible (even probable) that the particular mathematical tree I’m barking up won’t bear fruit, but the notion that some formal, universal system may exist for analyzing relationships will hopefully be interesting to those smarter and more qualified than me. And if someone can conclusively prove that no such universal system exists, well, that’s interesting too.
Connection is a thing that all of us struggle with; feelings of loneliness and confusion about our relationships have been with us since the dawn of our species. Those things won’t ever go away. But if the math works out there’s a chance that we may be headed towards a world where the tools of connection get radically better than the tools we have today, and that’s a world that I, personally, am interested in living in.