Binary chips seem to do 58% more work than really necessary

Come-from-Beyond
Apr 10, 2019


Some people claim that trinary chips would be more efficient than binary ones. But how much more? Let’s do a quick estimate using multiplication as an example (my experience as a programmer tells me that multiplication is used often enough to be a good example).

If we want to multiply two 128-bit numbers we need to do ~128 basic operations ( https://www.sciencealert.com/mathematicians-just-discovered-an-astonishing-new-way-to-multiply-numbers-together says it’s N*log(N), and I set the log(N) factor to 1, which makes trinary chips look less efficient). A multiplication of two 81-trit numbers (by the way, 81 trits can store slightly more values than 128 bits, since 3^81 > 2^128) requires ~81 basic operations. These basic operations are additions and multiplications of single bits/trits, and for both numeral systems the truth tables are equally simple (assuming https://en.wikipedia.org/wiki/Balanced_ternary for trits). Actually, the addition operation for trinary is slightly more efficient because it generates a carry digit with 22% probability (2 of the 9 equally likely digit pairs, namely 1+1 and -1+-1, overflow) vs 25% for binary (only 1+1 out of 4 pairs), but let’s ignore this.
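
As a sanity check, here is a short Python sketch (mine, not from the original post) that verifies both side claims by brute force: that 81 trits cover slightly more values than 128 bits, and the single-digit carry probabilities for each numeral system.

```python
from itertools import product

# 81 trits represent slightly more values than 128 bits: 3^81 > 2^128
assert 3**81 > 2**128

# Carry chance for single-digit addition, by enumerating all digit pairs.
# Binary digits are {0, 1}; a carry is generated when the sum exceeds 1.
binary_pairs = list(product((0, 1), repeat=2))
binary_carry = sum(a + b > 1 for a, b in binary_pairs) / len(binary_pairs)

# Balanced-ternary digits are {-1, 0, 1}; a carry is generated when
# the magnitude of the sum exceeds 1.
ternary_pairs = list(product((-1, 0, 1), repeat=2))
ternary_carry = sum(abs(a + b) > 1 for a, b in ternary_pairs) / len(ternary_pairs)

print(f"binary carry chance:  {binary_carry:.0%}")   # 25%
print(f"ternary carry chance: {ternary_carry:.1%}")  # 22.2%
```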

So, we get 81 operations for trinary and 128 operations for binary; since 128/81 ≈ 1.58, binary chips seem to do 58% more work than really necessary.
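
The 58% figure is just the ratio of the two operation counts; a one-liner confirms the arithmetic:

```python
binary_ops, trinary_ops = 128, 81
print(f"{binary_ops / trinary_ops - 1:.0%}")  # 58%
```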

Is my estimation correct?
