Why Unary is the Best Number System
The dogmatic loyalty of modern computer scientists to binary systems is perhaps one of today’s most unfortunate lapses in judgement. Despite academic teachings often dispelling the so-called advantages of the system, few scientists have caught on to the destructive lies behind this ruse; the rest blindly follow Intel’s tune.
This should all be plainly evident through the elegant simplicity of other, much more foundational techniques such as the age-old tally mark first used to count the rises of the sun. For sure, ask any young child what the tallies III added to II are and you’re bound to get a prompt response. Ask them what 11₂ + 10₂ is and you’ll just get blank stares.
The common, dismissive response is that, true as it may be for human usage, such things do not hold for large-scale computation. This response misses the vast literature to the contrary.
The first, and to some the most surprising, fact on our list is that of time complexity. Consider integer factorization: although algorithms faster than the naïve approach exist for binary strings, the complexity of every known algorithm remains far greater than polynomial. But if, instead of passing your number as a binary string, you pass a unary one, standard mathematical analysis shows the complexity immediately dropping to linear. That’s clearly within P, with room to spare.
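As a sketch of the claim (the function here is illustrative, not from any real benchmark), even humble trial division over a unary input does work bounded by the input length:

```javascript
// Trial-division factorization over a unary input (a string of '1's).
// The input length IS the number, so the usual "exponential in input
// size" objection evaporates: the work is at most linear in the tallies.
function factor(u) {
  let n = u.length;
  const factors = [];
  for (let d = 2; d * d <= n; d++) {
    while (n % d === 0) {
      factors.push("1".repeat(d)); // each factor, in unary of course
      n /= d;
    }
  }
  if (n > 1) factors.push("1".repeat(n));
  return factors;
}

console.log(factor("1".repeat(12)).map(f => f.length)); // [ 2, 2, 3 ]
```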
This is helped in part by the efficient properties of the unary representation itself. Given a random-access machine, comparing two numbers takes constant time even for big integers: index the second number at the final position of the first, and the second is lesser if and only if you do not see a tally there.
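A minimal sketch of that constant-time comparison, assuming unary numbers stored as JavaScript strings of '1's (where out-of-bounds indexing yields `undefined`):

```javascript
// Constant-time less-than on unary numbers: one random access, no loop.
// b < a exactly when b has no tally at a's final position.
function lessThan(b, a) {
  return b[a.length - 1] === undefined;
}

console.log(lessThan("111", "11111")); // true  (3 < 5)
console.log(lessThan("11111", "111")); // false (5 < 3)
console.log(lessThan("111", "111"));   // false (3 < 3)
```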
Even multiplication requires none of the lookup tables demanded by higher-base systems, some of which border on the absurd. And that’s not all.
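For illustration (a sketch under the same strings-of-tallies assumption), multiplication is mere repetition, and addition mere concatenation:

```javascript
// No times tables, no carries: multiply by repeating one operand once
// per tally of the other; add by concatenating.
const add = (a, b) => a + b;
const multiply = (a, b) => b.repeat(a.length);

console.log(add("111", "11"));      // 11111  (3 + 2 = 5)
console.log(multiply("111", "11")); // 111111 (3 × 2 = 6)
```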
Incremental, Functional Updates
A unary number is inherently incremental. To increment a unary number, all you need do is prepend a single digit to it; to decrement, you need only discard one. LISP programmers will immediately recognize these as the cons and cdr operations from one of the most powerful language families in existence.
Haskell programmers, along with the rest of its bastardized lackeys, will emphasize the power this affords. Not only is one able to update these integers immutably in place (a feat not easily duplicated with other big-integer arithmetic), one is able to represent lazily evaluated integers and even infinities with ease:
let infinity = 1 : infinity
This is not a special case in the type system; it is a natural consequence of using a more suitable basis for our calculations.
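The same trick can be sketched outside Haskell, too. Here is a hypothetical JavaScript rendering using a generator, where comparison against infinity still terminates for any finite operand:

```javascript
// A lazy unary infinity: yields tallies forever, but only as demanded.
function* infinity() {
  while (true) yield 1;
}

// A finite number is strictly less than a lazy number if tallies remain
// after consuming one per tally of the finite operand.
function lessThanLazy(finite, lazy) {
  const it = lazy();
  for (const _ of finite) {
    if (it.next().done) return false;
  }
  return !it.next().done;
}

console.log(lessThanLazy([1, 1, 1], infinity)); // true
```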
Contrast this with the arithmetic of today’s status quo:
5 + 5 // => 10
"5" + 5 // => 55
5 + 5 + "5" // => 105
"5" + 5 + 5 // => 555
"5" < 6 // true
"5" < 4 // false
"10" < "5" // true
10 < 5 // false
It’s senseless madness, as anyone can tell.
Unary fixes this by unifying the mental models. This prevents the horrors of type errors plaguing the current users of the most webscale language in the world:
111 + 111 // 111111
"111" + 111 // 111111
111 + 111 + "111" // 111111111
"111" + 111 + 111 // 111111111
"11111" < 111111 // true
"11111" < 1111 // false
"1111111111" < 11111 // false
1111111111 < 11111 // false
Robust-First Computing
A new craze sweeping the industry is that of Robust-First Computing: putting aside the traditional values of correctness and efficiency for the more wholesome ideal of computational robustness. Under such a model, errors are gracefully accounted for and computation is resilient, making for stronger responses to bugs and attacks, even in extreme situations.
Unary computation is inherently robust to errors. Permutation errors, for instance, whilst wildly affecting the results of typical computations, make no difference at all to a unary system. Unary numbers are invariant under most problematic transformations, including translation, and suffer only minor artefacts in response to random bit swaps.
Parallelism is perhaps the most important factor of modern-day performance, and unary is perhaps the most parallel system of all. Unary operations, due largely to their uniformity, can be parallelized on as many cores as is required.
Multicore optimizations can contribute greatly to program performance. As the author of the graph says,
The biggest issue is synchronization.
Unary operations are some of the easiest to synchronize, requiring little to no coordination for general operations thanks to their cohesive and uniform design. Few other systems, numeric or not, can even approach this performance and scalability. Even better, cross-core communication can undergo exponential compression, a feat no other number system can claim, resulting in insanely fast transfer and enabling atomic operations.
Unary, as a model, is the oldest, most reliable numeric system known to man. New research even reveals it to be performant, scalable, and efficient. Perhaps it’s time for computer scientists to shake off the dogma of the new and step back into the developments of old.