Out of all our languages, why did we only give computers 2 digits?

A brief intro to why binary exists and is likely here to stay

Bassel Ghazali
Technology Simplified
3 min read · Feb 28, 2023


Photo by Jon Tyson on Unsplash

Everything you’ve ever seen on the internet is just a combination of 0’s and 1’s. That’s the language of computers.

Using chains of these 0’s and 1’s, phones and computers can represent all kinds of things, like numbers, words, pictures, videos and even sound.

It seems strange: we have so many different ways to communicate, yet we chose just two very simple digits to power the smartest machines on the planet.

What an alternative to binary would look like

Technically, computers “speak” in electrical signals passing through their components. We use those signals to represent things like numbers. In a binary system using 0 and 1 only, a computer signal can be high (representing 1) or low (representing 0).

9 in binary is 1001: four digits to represent something we normally write with a single digit. In signal terms, that's four signals a computer has to store or send instead of one.
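If you want to see this pattern for yourself, here's a small Python sketch (my own illustration, not part of the hardware story above) that prints a few numbers in binary and counts the digits each one needs:

```python
# Print a few numbers in binary and count how many binary digits each needs.
for n in [5, 9, 42, 255]:
    bits = bin(n)[2:]  # bin(9) gives "0b1001"; strip the "0b" prefix
    print(f"{n} in decimal -> {bits} in binary ({len(bits)} digits)")
```

A number written with a handful of decimal digits needs roughly 3.3 times as many binary digits, since each extra binary digit only doubles the range of values you can cover.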

It seems like a pretty restrictive language, so why don’t we add more digits?

Instead of only looking at whether there is a signal or not, why don’t we also look at the strength of the signal? If we agree that the strongest signal coming out of the computer represents 9, and no signal represents 0, we can now vary the strength of the signal between those two values to represent 1, 2, 3 and so on.

We can now use one signal at maximum power to represent the digit 9, instead of needing one high signal, then two low signals, then one high signal again.

More unique digits to use = fewer digits needed to represent something.
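Here's a rough Python sketch of both ideas. The signal-reading function is entirely hypothetical (the 0.0–1.0 scale and the thresholds are made up for illustration; real circuits would be far messier), and the digit-counting function simply shows how many digits a number needs in different bases:

```python
# Hypothetical: read a digit 0-9 from a signal level between
# 0.0 (no signal) and 1.0 (full strength). Illustration only.
def read_decimal_digit(signal_level):
    return round(signal_level * 9)

# Count how many digits a whole number needs when written in a given base.
def digits_needed(value, base):
    digits = 0
    while value > 0:
        value //= base
        digits += 1
    return max(digits, 1)

print(read_decimal_digit(1.0))  # full-strength signal -> 9
print(read_decimal_digit(0.0))  # no signal -> 0

for base in (2, 3, 10):
    print(f"1,000,000 needs {digits_needed(1_000_000, base)} digits in base {base}")
```

Running it shows that 1,000,000 needs 20 digits in base 2, 13 in base 3 and 7 in base 10: the more distinct digits a system has, the shorter its numbers get.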

There are problems with anything that isn't binary

The core technology doesn’t exist

Existing computer processors use transistors. Transistors are simple switches that either let electricity pass through or block it. In doing so, transistors act as controllers for a binary signal: when the signal passes through, we get an output of 1, and when it is blocked, we get 0.
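As a loose analogy (this is not how chips are actually programmed, just a way to picture the switch), a transistor can be modelled in a couple of lines of Python:

```python
# Toy model of a transistor: the gate decides whether the input
# signal reaches the output. Purely illustrative.
def transistor(gate, signal):
    return signal if gate else 0

print(transistor(gate=1, signal=1))  # switch on  -> signal passes -> 1
print(transistor(gate=0, signal=1))  # switch off -> signal blocked -> 0
```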

At the time of writing, transistors can be as small as 4 nanometers. That's roughly 18,000 times thinner than a strand of hair. A computer chip the size of a fingertip contains billions of transistors, which means the chip can perform huge numbers of calculations at the same time.

A machine that handles varying signal strengths can't use transistors as simple switches; it would need devices that can not only block or pass a signal but also control its strength.

Because each of these devices can represent more digits, fewer of them are needed than in a binary computer, but they would still have to operate at very high frequencies; existing computers process signals billions of times per second.

Technology that can do all of this at the speed and scale of modern transistors doesn't exist, and developing it would be very expensive.

Will it ever happen?

A few experimental computers were built to use 3 digits instead of 2. These ternary computers never took off because there was always a trade-off: the technology used to build them was slow, which defeated the purpose.

The binary system is just too well-established at this point. There are areas of investment and development in the binary processing field which are almost guaranteed to improve computer performance, so there isn’t much incentive to invest in the uncertain tech for ternary (or greater) computers.

This doesn’t mean it will never happen; I just don’t see it happening any time soon.

Thanks for making it this far! If you enjoyed the article, consider following me to have quicker access to articles which break technology down into digestible pieces.

