The ASCII table: the explanation I wish somebody had given me

Aidos
9 min read · Mar 12, 2023


[Image: ASCII table]

Technical articles and official documentation can be terse and dry, which is not good ground for clear and easy understanding. In this article I will give you the explanation of the ASCII table that I wish I had back when I struggled to understand what it is and how to use it.

If you are interested in ASCII, I very much doubt you need it for cooking or the gym. Most likely you are taking your first steps in computer science and programming. So why don't we explore this topic like scientists?

But please, don't be scared) This doesn't require any scientific background from you. We are just going to be a little more curious and dig a little deeper. And we won't dig only for numbers and all these scary computer words. As promised, this article is not going to be dry and boring. Instead, we will discover the beauty of the silicon world as easily and excitingly as reading manga.

The first thing you surely know already, but I want to remind you of and emphasize, is that computers don't actually understand human language. Moreover, they can't understand anything; they have no ability to understand. A computer is just a machine made of metal and silicon connected with wires. Its parts communicate with each other using electrical signals: one part sends a command as a signal to another part to perform some operation.

But as you can imagine, a language of electrical signals has a very poor choice of symbols. All a part can do is send a signal or not send it. To be more specific, the computer's language has only 2 symbols:

1. an electrical signal;
2. nothing (the absence of a signal).

It is very similar to Morse code. You have probably seen soldiers in movies send messages in Morse code by knocking or flashing a torch. Morse code also has only 2 symbols: short and long.

[Image: Morse code]

A long time ago computer scientists agreed on a notation for these two symbols of the computer language. It was decided to express them as 1 and 0.

An electrical signal is expressed as 1

The absence of a signal is expressed as 0

The system of data representation with only ones and zeros is called “binary”.

[Image: apartment number written in binary]

If we want a computer to perform some operation or work, we need to give it an instruction describing what we want it to do. And since binary is the only language it can "understand", the instruction has to be written in binary.

[Image: binary heart (Futurama)]

But for humans it is a little inconvenient to write instructions for a computer in binary. It is much easier to write in a language we are used to: a human language with human symbols.

So people invented and developed such a thing as a "compiler". A compiler is a translator from a human-readable programming language into the computer-readable language: binary, also called machine code.

[Image: "Hello" translated into binary]

We will not cover the binary system in detail as it is a topic for a separate article. For now it is enough to know that it exists and does its job.

Every human-language character has a binary representation. For example, we can start with simple decimal numbers.

[Image: decimal numbers and their binary representations]

As you can see, the picture above resembles a dictionary. Dictionaries usually have words on the left side and their translation into another language on the right side.

The table above is also a kind of dictionary: numbers written in human form on the left side and their translation into machine code on the right side.

For this translation, a compiler performs math operations that convert decimals (base 10) to binary (base 2). So with decimal numbers there is actually no need for a "dictionary": math operations do the conversion.

Looks very straightforward, doesn't it?
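If you are curious what those math operations look like, here is a small sketch in Python (the function name `to_binary` is just my own example, not a built-in): it converts a decimal number to binary by repeated division by 2.

```python
# Convert a decimal number to binary by repeated division by 2 --
# the kind of pure math conversion that needs no "dictionary".
def to_binary(n: int) -> str:
    if n == 0:
        return "0"
    bits = ""
    while n > 0:
        bits = str(n % 2) + bits  # the remainder becomes the next bit
        n //= 2                   # integer-divide and repeat
    return bits

print(to_binary(5))   # 101
print(to_binary(65))  # 1000001
# Python's built-in bin() does the same conversion:
print(bin(65))        # 0b1000001
```

You don't need to memorize this; the point is only that a formula, not a lookup table, turns decimal into binary.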

But we need something more than just numbers. How about letters? We certainly need them to write texts and send messages.

Letters are translated into machine code in the same binary system; they are also expressed as combinations of 1s and 0s.

Wait a minute! It is impossible to do math operations on letters to convert them to binary! Letters are not numbers…

A long time ago computer scientists had the same question. They decided to add one more conversion step.

Each of us has a personal ID card with a unique ID number. We also have home addresses with an apartment number.

Engineers decided to apply a similar approach. They assigned a "personal ID" number to each letter and every other non-numeric symbol. Now we can give the compiler not the non-numeric value itself but its numeric (decimal) representation.

[Image: letter → decimal → binary]

Waaaaait a minute! If the number 65 is assigned to the letter A, then how do we give the compiler the actual number 65 itself?! That is so odd and confusing. How can we tell the compiler when 65 is the letter A and when 65 is just the number 65?

To solve this problem, we pass the value 65 to the compiler differently.

In programming, each piece of data has a quality called a "data type".

To understand this better, let's discuss an example from the real world.

[Image: two tomatoes]

There are two tomatoes in the picture above. They are both tomatoes, but each has its own type and qualities.

The one on the left is a toy tomato. It is made of plastic and you cannot eat it (at least I highly recommend that you don't), but it can be used as a toy for kids.

The one on the right is a real tomato, and you can eat it.

It may seem obvious and easy. But what we don't realize is that when we look at these two tomatoes, the brain does a very quick analysis. It immediately draws several conclusions:

[Image: the brain's analysis of the two tomatoes]

In this example, what quality helps the brain decide what you can do with the given tomatoes? Their types. Depending on the type, you decide what to do with a tomato: play with it or eat it.

Computers are designed to use the same approach when dealing with different pieces of data. A computer makes a very fast analysis:

1. What data have I received?
2. What is the type of this data?
3. Can I do math operations with this type of data?

Now let's switch back to the question about the number 65 and the letter A. How can a computer understand what you want 65 to be: the letter A or just the number 65?

We clarify this for a computer the same way as with the tomatoes: the difference is provided by the data type. Depending on the type, the computer will understand what it can do with this data.

If we assign 65 the data type integer, the computer will display it to us as the number 65, and it will be possible to carry out math operations with it, since it is an integer.

On the other hand, if we assign 65 the data type character (char / rune), the computer will treat it as a symbol or letter and display it to us as the letter A. It will also be impossible to do math operations on it, since it is a letter. But we can still do math operations on its decimal representation, 65 (the kind of ID for the letter that we discussed above).

[Image: the two possible data types of 65]
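We can see this distinction in a couple of lines of Python. (Python has no separate char type; a 1-character string plays that role, and the built-ins `chr` and `ord` convert between a character and its "ID number".)

```python
number = 65        # data type: integer -> math operations work
letter = chr(65)   # data type: string  -> treated as the symbol with ID 65

print(number + 1)  # 66  (math on the integer)
print(letter)      # A   (the character assigned to 65)
print(ord("A"))    # 65  (back from the letter to its decimal "ID")
```

The same value, 65, is shown either as a number or as the letter A depending purely on the type we gave it.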

Sometimes we use numbers in the context of text, where we don't need to do calculations. For instance, when we write an address or a phone number.

In these cases the numbers are part of the text, so for simplicity, and to avoid mixing data types, we assign them the data type character. But to avoid conflicts with the letters' decimal representations (and so that digits are not converted into letters), it was decided to give each digit from 0 to 9 its own decimal representation. This way digits can safely be displayed within text, have the data type character, and be treated like letters (symbols). They became a kind of additional letters in the alphabet.

[Image: numbers represented as text]

In the picture above, the decimals 54 and 53 became not a number but the text "65", because we assigned them the data type character.

In fact, when we take characters like these and combine them to form a text of multiple characters, the combination is called an array of characters, and the resulting text is called a string, which is yet another data type. This is not really important for this article, but I mention it to prevent future confusion if you use this article while learning how to manipulate characters, arrays of characters, or strings.
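A quick Python sketch of the digits-as-characters idea: the character '6' is not the number 6, it has its own ASCII code (54), and joining character codes 54 and 53 gives the text "65".

```python
print(ord("6"))        # 54  -- the character '6' has its own ASCII code
print(ord("5"))        # 53

text = chr(54) + chr(53)
print(text)            # 65   (a string, not a number)
print(text + "!")      # 65!  (text operations work on it)
print(int(text) + 1)   # 66   (convert to an integer before doing math)
```

Notice the last line: to do arithmetic on the text "65" we must first convert it back to the integer data type.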

As mentioned above, every letter and every text representation of a digit has its own "ID number". I called them ID numbers here just for simplicity; in fact they are decimal representations that can be converted into binary (machine) code.

Where can we actually see the list of decimal representations for all symbols?

They are all specified in a special table called ASCII.

[Image: ASCII table]

ASCII stands for American Standard Code for Information Interchange.
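You can even print a tiny slice of this "dictionary" yourself. The sketch below shows, for a few characters, their decimal ID and its binary form (standard ASCII codes fit in 7 bits, hence the `07b` format).

```python
# A tiny slice of the ASCII "dictionary": character, decimal ID, binary.
for ch in "A", "B", "a", "b", "0", "9":
    print(ch, ord(ch), format(ord(ch), "07b"))
# The first line printed is: A 65 1000001
```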

Now we know that decimal representations are used to convert human symbols into binary code. But what else can be done with the numeric representation of characters?

A lot of interesting manipulations can be carried out with the ASCII table.

For example, many programming languages have special functions that change letter case. Usually we don't see how these functions are implemented under the hood, but they do use ASCII.

Most likely you will never have to do this manually. But just for clearer understanding, let's think about what algorithm could achieve a letter-case change.

As we already know, the numeric representation of the capital letter A is 65. If we want the lowercase version of 'A', which is 'a', we look at the ASCII table and find the decimal representation of lowercase 'a': it is 97.

If our text already contains 'A' with decimal representation 65, the only thing we need to do is switch to the other numeric representation.

To change one numeric value into the other, all we need is a single math operation:

65 + 32 = 97

Now 65 has changed to 97. We give the compiler 97 with the data type character attached, and the computer prints lowercase 'a' for us.

[Image: letter-case change]
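Here is a minimal sketch of that algorithm in Python (the function name `to_lower` is my own; real standard-library functions are more general, but the arithmetic idea is the same):

```python
# Lowercase a single letter via ASCII arithmetic: every lowercase
# letter's code is exactly 32 above its uppercase version.
def to_lower(ch: str) -> str:
    if "A" <= ch <= "Z":          # only touch uppercase A-Z
        return chr(ord(ch) + 32)  # 65 + 32 = 97, so 'A' -> 'a'
    return ch                     # leave everything else unchanged

print(to_lower("A"))  # a
print(to_lower("Z"))  # z
print(to_lower("!"))  # !
```

The range check matters: without it, adding 32 to a character that isn't an uppercase letter would produce a completely unrelated symbol.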

This is a very simple example of an ASCII manipulation. In your further studies you will discover other interesting cases.

The goal of this article is just to explain the general concept of ASCII to you. That's why I intentionally simplified everything. If you want to learn about all the features and the history of ASCII, you can find more detailed information on the internet (for example here). Also, I believe you will learn more about different data types when you study computer science and programming in a programming language of your choice.

Thank you so much for reading my article! I really hope it helped you understand ASCII. If you have any questions, write them in the comments and let's discuss them together) Please clap and share it if it was helpful. And subscribe for more articles on computer science ;)

I am always open to new connections and friends, so please find me on GitHub and LinkedIn )

https://www.linkedin.com/in/zhapbassov/

