A brief look into HOW!? (code and binary)

Jon SY Chan
3 min read · Feb 5, 2019


I am a person who always asks, how? I find it hard to keep learning a topic unless I know as much of the foundation as possible. When it comes to computer programming there are already so many layers of abstraction that it’s hard to get any work done if you keep looking into how something works. However, I still always take a peek at how it all works, because I find it very hard to accept “magic”.

The only way I can understand all the little interactions or errors is by knowing how things work fundamentally. So I did a bit of research into how my code, puts “Hello World”, actually comes out as Hello World on my computer.

So far I’ve been learning and coding in a high-level language, Ruby. However, our computers only read binary, 1s and 0s. I’ll get into what I learned about why we use binary later. A high-level language was made for us: it lets us easily write code that makes sense to us, in something close to the way we really communicate. Our code is actually sent to a compiler or interpreter that translates it, through tokenizing (lexing), parsing and compiling, into the machine code our computer can actually understand.

In computer technology, a parser is a program, usually part of a compiler, that receives input in the form of sequential source program instructions, interactive online commands, markup tags, or some other defined interface and breaks them up into parts (for example, the nouns (objects), verbs (methods), and their attributes or options) that can then be managed by other programming (for example, other components in a compiler). A parser may also check to see that all input has been provided that is necessary. — https://searchmicroservices.techtarget.com/definition/parser
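To make this a little more concrete, here is a minimal sketch, assuming the standard Ruby interpreter (CRuby/MRI). Ripper ships with Ruby and exposes the tokenizing and parsing steps, and RubyVM::InstructionSequence shows the bytecode the interpreter compiles my one-liner down to.

```ruby
# A sketch using CRuby's built-in tools to peek at each step.
require 'ripper'
require 'pp'

source = 'puts "Hello World"'

# Tokenizing/lexing: the source is chopped into tokens with their positions.
pp Ripper.lex(source)

# Parsing: the tokens are arranged into a tree (an s-expression here).
pp Ripper.sexp(source)

# Compiling: CRuby turns the tree into bytecode for its virtual machine.
puts RubyVM::InstructionSequence.compile(source).disasm
```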

(Image: compiler vs. interpreter, source: https://www.guru99.com/difference-compiler-vs-interpreter.html)

I still don’t know the exact details of how this works but will look into it soon. I ended up diving into why computers use binary, and from my understanding it’s because we need a physical representation of data. That physical representation is voltage: there are two states, 0 and 1, or low voltage and high voltage. Two states may not seem like they can represent much data, but if you put enough of them together you can represent practically anything, as long as you have enough wires and transistors.
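As a quick sketch of what that looks like from Ruby, Integer#to_s and String#to_i both take a base, so we can hop between the decimal numbers we’re used to and the 1s and 0s the hardware holds as low and high voltage.

```ruby
# Converting between decimal and binary in Ruby.
puts 5.to_s(2)       # => "101"      (high, low, high)
puts 255.to_s(2)     # => "11111111" (eight wires, all at high voltage)
puts "1101".to_i(2)  # => 13
```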

(Image: binary digits drawn as squares of 0s and 1s, source: https://sites.google.com/site/syhsmata/creative-projects/binary-numbers)

Each of the squares above is called a bit, and 8 of them together is called a byte. A byte can represent 256 different numbers or things. Since people wanted to represent even more things, they put more bytes together to cover a larger range of numbers.
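Here is a quick sketch of the math in Ruby: every extra bit doubles the number of values you can represent, so stacking bytes together grows the range very quickly.

```ruby
# Each added bit doubles the number of representable values.
(1..8).each { |bits| puts "#{bits} bit(s) -> #{2**bits} values" }

puts 2**8   # => 256        (one byte)
puts 2**16  # => 65536      (two bytes)
puts 2**32  # => 4294967296 (four bytes)
```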

This is evident in ASCII.

It was designed in the early 60’s, as a standard character set for computers and electronic devices. ASCII is a 7-bit character set containing 128 characters. It contains the numbers from 0–9, the upper and lower case English letters from A to Z, and some special characters. — https://www.w3schools.com/charsets/ref_html_ascii.asp

(Image: ASCII table, source: http://web.alfredstate.edu/faculty/weimandn/miscellaneous/ascii/ascii_index.html)
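Ruby makes it easy to peek at the ASCII codes hiding behind my “Hello World”, along with the 7-bit binary patterns they correspond to. A small sketch:

```ruby
# Each character maps to an ASCII code, which is just a 7-bit number.
"Hello World".each_char do |c|
  printf("%s -> %3d -> %07b\n", c, c.ord, c.ord)
end
# H ->  72 -> 1001000
# e -> 101 -> 1100101
# ...and so on
```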

From here it goes into how data is actually stored, which is done by putting together clever logic gates, and how this continues up through the layers of abstraction.

In computing, an abstraction layer or abstraction level is a way of hiding the working details of a subsystem, allowing the separation of concerns to facilitate interoperability and platform independence. — https://en.wikipedia.org/wiki/Abstraction_layer
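To make the “clever logic gates” idea a bit more concrete, here is a minimal sketch in Ruby (a simulation, not real hardware) of an SR latch, a classic circuit built from two NOR gates whose feedback loop is what lets it remember a single bit.

```ruby
# Two NOR gates wired back into each other can store one bit (an SR latch).
def nor(a, b)
  !(a || b)
end

def sr_latch(s, r, q = false, q_bar = true)
  # Let the feedback loop settle by iterating a few times.
  3.times do
    new_q     = nor(r, q_bar)
    new_q_bar = nor(s, new_q)
    q, q_bar = new_q, new_q_bar
  end
  [q, q_bar]
end

q, _ = sr_latch(true, false)          # set:   stores a 1
puts q                                # => true
q, _ = sr_latch(false, false, q, !q)  # hold:  keeps remembering the 1
puts q                                # => true
q, _ = sr_latch(false, true, q, !q)   # reset: stores a 0
puts q                                # => false
```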
