The Exercise That Set Me In Motion: Base Conversion

Ruth Hill
Mar 1, 2018 · 4 min read

When I first set about learning to code in earnest in 2016, I had done enough to know that I was interested in programming, but I hadn’t yet had a “Eureka!” moment. My daily practice involved reading books and articles on languages and coding concepts and working through practice problems in Ruby, the first language I pursued. I explored things as time and inclination allowed. One day I came across an exercise that instructed me to “Build A Base Converter,” and I distinctly recall my eyebrows furrowing as the nagging doubt crossed my mind: was I really cut out for this? I had no idea what a “base” even was.

After reminding myself that, no, I shouldn’t just assume that an unfamiliar concept is beyond me but rather give myself a chance to try to understand it (hey, I was still pretty new to this at the time), I read through the problem. It went something like this:

“Write a method that takes two integers - the first representing a decimal number to be converted, and the second representing a numeric base between 2 and 16. The method should convert the decimal number into the new base and then return it as a string.”

I set to work Googling what a numeric base was. I learned that “decimal” and “base-ten” were synonymous, as were “binary” and “base-two.” One bit of imagery that struck a chord was that humans have ten fingers (digits, anyone?) in total. Wikipedia mentions various cultures that used different bases for their numeric systems, such as base-20 (counting all fingers and toes) and base-8 (or octal, counting the spaces between the fingers).

This made me realize how much I take the decimal number system for granted. I had learned to count using the digits 0 through 9, and that was all I knew. Without any other system to compare it against, it’s difficult to think meaningfully about the structure of the one you have. You just take it for granted: that’s how counting works, right?

I got out my notebook and started writing down example numbers, looking for the relationship between the digits and calculating by hand how to convert a decimal 4 into its binary equivalent, 100. After 4 I tried 11, 12, and 673. Then I tried writing a Ruby method on my computer that would do just that: convert decimal to binary. (I was very excited, because this was a perfect situation for my favorite mathematical operator, the modulo. I’m still not sure why I enjoy the modulo quite so much.)
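
The notebook arithmetic for converting 11, for example, goes something like this:

11 / 2 = 5, remainder 1
 5 / 2 = 2, remainder 1
 2 / 2 = 1, remainder 0
 1 is less than 2, so it becomes the final (leading) digit

Reading the remainders from last to first gives 1011, which checks out: 8 + 2 + 1 = 11.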

My process led me to divide the starting number by two until it was less than two, keeping track of the remainders along the way. Then I needed to join those remainders together, making sure their order was correct (and that the result was a string). By the time I had the decimal-to-binary converter working I was totally consumed. My method could easily be tweaked to accommodate any other base up to ten:

def num_to_str(num, base)
  # A number smaller than the base is already a single digit,
  # so just return it as a string.
  if num < base
    return num.to_s
  end

  # Repeatedly divide by the base, collecting the remainders.
  remainders = []
  until num < base
    remainders << num % base
    num /= base
    if num < base
      remainders << num
    end
  end

  # The remainders come out least-significant digit first,
  # so reverse their order before joining them into a string.
  new_number = []
  remainders.each do |digit|
    new_number.unshift(digit)
  end
  return new_number.join("")
end
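
Called on the numbers from my notebook (assuming the version above), it behaves as expected:

num_to_str(4, 2)     # => "100"
num_to_str(11, 2)    # => "1011"
num_to_str(12, 2)    # => "1100"
num_to_str(673, 8)   # => "1241"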

I still had to figure out how to handle the larger bases. Once you go above base-ten with this method, you start to repeat yourself: in base-eleven, if the only digits available to you are 0 through 9, the number ten comes out as 10, but the number eleven is also expressed as 10. Learning that I needed to express those larger digits using letters (or other symbols) was the second particularly eye-opening moment, because it made me think in a new way about how we represent numbers in writing.
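
The collision is easy to see if you try the method as written above on base-eleven:

num_to_str(10, 11)   # => "10"  (the single digit ten, which has no symbol of its own)
num_to_str(11, 11)   # => "10"  (one eleven plus zero ones)

Two different numbers, one indistinguishable result.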

Because so many cultures rely on a base-ten number system, the familiar numeric symbols run only from 0 through 9. So when we need a base greater than ten, we have to find a way to represent the additional digits. Instead of creating brand-new symbols for them, we can simply reuse symbols that already exist: the alphabet. Although we usually use it to represent human language, the alphabet offers twenty-six handy symbols (fifty-two if we count both lower and upper case), so it can take us a long way here, though numbers written with very many of them would be extremely difficult for a human to parse.

# Map each digit value to the character that represents it.
int_str_hash = Hash.new(0)
(0..9).each do |x|
  int_str_hash[x] = x.to_s
end
int_str_hash[10] = "A"
int_str_hash[11] = "B"
int_str_hash[12] = "C"
int_str_hash[13] = "D"
int_str_hash[14] = "E"
int_str_hash[15] = "F"

My solution was to store the numerals and letters together in a hash and to alter the unshift call inside my .each block so that it pulled values from that hash instead of pushing the raw remainders.
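
Put together, the whole thing looks roughly like this (a cleaned-up sketch; my original version may have differed in the details):

def num_to_str(num, base)
  # Digit-to-character lookup: 0-9 map to themselves, 10-15 map to A-F.
  int_str_hash = {}
  (0..9).each { |x| int_str_hash[x] = x.to_s }
  int_str_hash[10] = "A"
  int_str_hash[11] = "B"
  int_str_hash[12] = "C"
  int_str_hash[13] = "D"
  int_str_hash[14] = "E"
  int_str_hash[15] = "F"

  # A number smaller than the base is already a single digit.
  return int_str_hash[num] if num < base

  # Repeated division by the base, keeping the remainders.
  remainders = []
  until num < base
    remainders << num % base
    num /= base
    remainders << num if num < base
  end

  # The remainders arrive least-significant digit first, so unshift
  # reverses their order while the hash swaps each value for its character.
  new_number = []
  remainders.each do |digit|
    new_number.unshift(int_str_hash[digit])
  end
  new_number.join("")
end

num_to_str(255, 16)   # => "FF"
num_to_str(673, 16)   # => "2A1"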

I still think this is a terrific exercise and I’m overdue for a repeat attempt and some optimization, perhaps this time in JavaScript. This problem was instrumental in building my confidence as a programmer early on — not only because I learned a great deal in solving it, but also because in working through it I proved to myself that a difficult or unfamiliar problem is no match for some brain power & determination.
