TECH OVERVIEW

6 Important Programming Languages and Their Original Purpose

What were the popular languages of the past and present created for?

Published in Handmade Software · 11 min read · Jan 27, 2021

The history of programming already spans seven decades, and many diverse languages have come and gone. Each of them was created for some purpose, and all of them aimed to solve the problems of the previous generation of technology. Check out this brief history of six programming languages, their original purposes, and where development has led them.

Fortran

One of the oldest programming languages out there is Fortran (FORmula TRANslation). It was used by NASA and created by IBM, and if you are no stranger to data science, you probably know the prominent scientific packages that have been built on it for decades. The grandpa of programming, Fortran, has had its ups and downs, but since its first release in 1957, it has gone through multiple major standards, the latest one from 2018. The older standards look more like assembler than the later versions do (maybe it's just me). Check out my super old code from the uni:

Polynomial spline calculation with Fortran

Fortran 77 had some odd features, like implicit typing based on the variable name: if a variable's name starts with one of the letters I through N (the common iterator letters in math: i, j, k, and so on), it is considered an integer, and a float otherwise. This caused me quite a few headaches back when I was using it. Programming started as a purely mathematical discipline; computers were luxury devices, costing a fortune (:D) and available only to scientists and engineers in the space industry.
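Here is a tiny sketch of that implicit-typing rule in action (standard FORTRAN 77, written for illustration, not dug out of my uni archive):

```fortran
C     FORTRAN 77 implicit typing: a name starting with I-N is an
C     INTEGER, anything else is a REAL, unless declared otherwise.
      PROGRAM IMPLIC
      K = 7
      X = 1.5
      PRINT *, K + X
      END
```

K silently becomes an integer and X a float, and the program prints 8.5. Misspell a variable name, and the compiler happily invents a brand-new variable for you instead of complaining.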

At the time Fortran was created, there were not many alternatives. Unlike today, when there is a programming language for anything, Fortran dominated computer science for a very long time.

The goal in creating Fortran back then was to get rid of barely comprehensible assembler, which was a real impediment to writing the large programs needed in various areas. On the other side, mathematicians had trouble understanding it because, well, assembler is how the machine thinks. Check out one more uni program of mine for comparison:

An assembler program for reverting a string of symbols
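It went roughly along these lines; this is not the original uni listing, but a sketch of the same program in modern NASM for x86-64 Linux:

```nasm
; Reverse a string in place, then print it (x86-64 Linux, NASM syntax).
section .data
msg:    db  "hello", 10          ; the string, newline-terminated
len:    equ $ - msg - 1          ; length without the newline

section .text
global _start
_start:
    lea rsi, [msg]               ; pointer to the first byte
    lea rdi, [msg + len - 1]     ; pointer to the last byte
.swap:
    cmp rsi, rdi                 ; pointers met or crossed? then done
    jae .done
    mov al, [rsi]                ; swap the two bytes
    mov bl, [rdi]
    mov [rsi], bl
    mov [rdi], al
    inc rsi
    dec rdi
    jmp .swap
.done:
    mov rax, 1                   ; sys_write
    mov rdi, 1                   ; stdout
    lea rsi, [msg]
    mov rdx, len + 1             ; include the newline
    syscall
    mov rax, 60                  ; sys_exit
    xor rdi, rdi
    syscall
```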

Quite wordy, huh? And this is how much code you need just to invert a string. Imagine how much code you would need to send a shuttle into space! No wonder they needed something better.

Fortran introduced concepts without which programming seems impossible today: IFs instead of gotos, named variables addressing memory instead of manually moving an index within a continuous memory area, input/output routines instead of raw interrupts, and much more.
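To see what a relief that was, compare the old label-and-GOTO style with the block IF that FORTRAN 77 standardized (a constructed example showing both styles):

```fortran
      PROGRAM BRANCH
      REAL X
      X = -2.0
C     Old style: the arithmetic IF jumps to a label depending on
C     whether X is negative, zero, or positive.
      IF (X) 10, 20, 20
10    PRINT *, 'NEGATIVE'
      GO TO 30
20    PRINT *, 'NON-NEGATIVE'
30    CONTINUE
C     New style: the same branch as a structured block IF.
      IF (X .LT. 0.0) THEN
          PRINT *, 'NEGATIVE'
      ELSE
          PRINT *, 'NON-NEGATIVE'
      END IF
      END
```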

Fortran was a revolution, but its main purpose was and is the efficient programming of mathematical procedures. Some of its libraries have been out there for quite a while, like LAPACK and BLAS, which now lie at the foundation of modern computational packages like NumPy in Python. Of course, there are many outstanding alternatives, but Fortran still has its standing as a super effective tool for drilling through monstrous calculations.

Pascal

Named after the French mathematician and physicist Blaise Pascal, this is the programming language created by Niklaus Wirth, a Swiss programmer, in the early 70s. ALGOL, widespread back then and in fact the inspiration for Pascal, was not good enough, so he decided to create his own ALGOL. Of course, with his blackjack and hookers.

Historical speech of Niklaus Wirth in Zurich, 1970

Niklaus was quite salty because the overhyped ALGOL committee didn't want to merge his proposals. Those proposals became Pascal. One more example of a single developer's efforts staying on stage longer than the project they were originally meant for.

From the very beginning, Pascal was supposed to motivate young programmers to use safe programming techniques and write easily readable, comprehensible code, and it was therefore a very good fit for introducing the basic concepts of programming to students. The main goal of Pascal was to teach.

Pascal became the de facto standard for teaching programming in the 80s. Not that I was going to uni in the 80s, but Soviet Russia had a certain lag in adopting Western technologies, so my first program was indeed in Pascal.

I hated and loved Pascal, as I hated and loved MySQL.

Check out Pascal look and feel:

Checking if a number is positive or negative in Pascal
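Something along these lines (a minimal sketch of such a program, not the original screenshot):

```pascal
program SignCheck;
var
  n: integer;
begin
  write('Enter a number: ');
  readln(n);
  { begin/end blocks instead of curly braces, := for assignment }
  if n > 0 then
    writeln('positive')
  else if n < 0 then
    writeln('negative')
  else
    writeln('zero');
end.
```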

C

Ken Thompson and Dennis Ritchie (the creator). You can tell they invented a programming language.

Another ALGOL-inspired assembler fighter, created by Dennis Ritchie, an American programmer, in 1972. Back then, Linux was not even a plan, and no operating system clearly dominated the market the way we know it today. Computers were absolutely exotic to common folks and exclusive to those who understood the secret silicon-taming techniques. The computational power available to engineers back then was tiny compared to today's cheapest energy-efficient all-in-one IoT computers.

So what was needed was a language that compiled efficiently into machine code but remained understandable to human beings. Besides that, in a world of commercial processors with different instruction sets and architectures, there was no easy way to write a program for multiple platforms at once. Imagine that every person you met on the street spoke a different language; that is what the world of computing was back then. The goals in creating the C language were portability and efficiency.

While C unified the fragmented processor landscape of the time, many of its concepts became a real struggle later on. Unchecked access to the n+1st element of an n-element array made binary code injection possible, so that malicious code could be executed in the context of the attacked process. Many a privilege escalation exploits exactly this flaw to take over a remote program.
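A minimal sketch of the mechanism (what exactly gets clobbered depends on the compiler and stack layout, so treat this as an illustration of undefined behaviour, not a recipe):

```c
#include <stdio.h>

int main(void) {
    int guard = 42;
    int arr[4] = {0, 0, 0, 0};

    /* Index 4 in a 4-element array: C performs no bounds check and
     * happily writes past the end, into whatever memory lies there. */
    arr[4] = 1337;

    /* Depending on the stack layout, this may now print 1337. */
    printf("guard = %d\n", guard);
    return 0;
}
```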

Check out some code in C:

Adding two numbers, https://www.programiz.com/c-programming/examples/add-numbers
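Roughly that example:

```c
#include <stdio.h>

int main(void) {
    double a, b;

    /* Read two numbers and print their sum. */
    printf("Enter two numbers: ");
    scanf("%lf %lf", &a, &b);

    printf("%.2lf + %.2lf = %.2lf\n", a, b, a + b);
    return 0;
}
```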

Ugly, isn't it? In my very subjective opinion, the semicolon for statement termination and curly braces for code blocks make large programs particularly unreadable. Many don't seem to share that opinion, and many other languages have inherited that syntax: Java, Objective-C, C++, JavaScript, PHP. The purpose of that syntax was easy translation into machine code, not readability. Yet here we are: the ugliest syntax in programming history has become the most popular one.

C++

In 1979, Bjarne Stroustrup set out to enrich the C language with object-oriented programming features; the result got the name C++ in 1983. Pioneered in work at MIT and by Alan Kay, who coined the term, object-oriented programming differed from imperative and procedural programming in the way execution is perceived. If an imperative program reads like a cooking recipe, an object-oriented program operates with objects, their attributes, and possible actions, so it's more like a play, or a legal text with a very long preamble.
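A minimal sketch of the shift: data and the actions on it live together in a class instead of floating around as free procedures (a constructed example, not from any real codebase):

```cpp
#include <iostream>

// The object bundles its state (balance_) with its possible actions.
class Account {
public:
    explicit Account(double balance) : balance_(balance) {}
    void deposit(double amount) { balance_ += amount; }
    double balance() const { return balance_; }
private:
    double balance_;
};

int main() {
    Account acc(100.0);
    acc.deposit(50.0);                    // "object, do this"
    std::cout << acc.balance() << "\n";   // prints 150
}
```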

I invented the term object-oriented, and I can tell you I did not have C++ in mind. (Alan Kay, the apostle badass of computer science)

Regardless of Alan Kay's disapproval, C++ has been dominant in commercial programming for decades, and the amount of code written in this language is humongous. By now, any modern computational system more complex than a zip tie has some part of it written in C++. Desktop applications, games, server software, embedded systems, smartphone software, operating systems, scientific programming: the list of C++ applications is endless.

For all its outstanding performance, C could not offer the expressive power of object-oriented programming, so when such a tool arrived, it was destined to conquer all planes of programming Oblivion.

And then I took an arrow to the knee. Wait, wrong Elder Scrolls lore!

The code looks pretty much the same, but with classes. I’m out of curly braces and semicolons, so no code samples. JK! A quadrangle with C++.

Some more ugly code from the uni
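The original screenshot aside, the idea was something like this (a reconstruction of the same kind of exercise, not the actual uni code):

```cpp
#include <cmath>
#include <iostream>

struct Point {
    double x, y;
};

// Euclidean distance between two points.
double dist(const Point& a, const Point& b) {
    return std::hypot(a.x - b.x, a.y - b.y);
}

// A quadrangle knows its four corners and what can be done with them.
class Quadrangle {
public:
    Quadrangle(Point a, Point b, Point c, Point d) : p_{a, b, c, d} {}
    double perimeter() const {
        double sum = 0.0;
        for (int i = 0; i < 4; ++i)
            sum += dist(p_[i], p_[(i + 1) % 4]);
        return sum;
    }
private:
    Point p_[4];
};

int main() {
    Quadrangle q({0, 0}, {1, 0}, {1, 1}, {0, 1});
    std::cout << q.perimeter() << "\n";   // unit square: prints 4
}
```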

More ugly syntax, this time forever and everywhere.

PHP

Back in 1994, virtually the only way to host content was static HTML pages. Big players like C++ were not really favored for the web because of webpages' dynamic nature: constantly changing content and exotic execution patterns.

Compiled languages are about turning source code into an executable binary. The executable is not changeable afterward and is distributed in that form to end users, who run it on their local machines. This is how desktop applications work to this day.

After HTTP and the World Wide Web became available to end users, a new concept of spreading content was introduced. Here, execution was split into two parts: a remote part and a local part. On the remote computer, a special program called a web server read a text file and transmitted it over the network to the local machine. In turn, the local machine executed another program that read the instructions inside that text file: place text at certain spots on the screen, lay it out, and underline the connections to other text files on the network.

This concept of connecting documents and creating a web was called hyperlinking. The documents were called HyperText, and the formal description of that communication (a protocol) was called the HyperText Transfer Protocol, or HTTP. The network itself was supposed to connect the whole world, so it was called the World Wide Web, or WWW; these abbreviations are well known today.

Hosting static HTML was a pain in the ass because, unlike local executables, webpages needed constant modification. What was needed was something we would call a template language today: inserting data from a database, iteration, conditional statements, variables, and more.

In 1994, Rasmus Lerdorf wrote a set of CGI (Common Gateway Interface) programs in C. The idea was the following: if there is a browser capable of rendering static HTML files, let's pretend we have those files saved, but in fact generate them on the fly. Rasmus used them for his homepage, so here we go again: laziness lit up an absolutely new technology.
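The principle in miniature, in today's PHP syntax (the original PHP/FI looked quite different, as the next sample shows):

```php
<html>
  <body>
    <!-- The "page" does not exist on disk as-is: this part is
         computed at the moment the request comes in. -->
    <p>Generated at <?php echo date('H:i:s'); ?></p>
  </body>
</html>
```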

This is why it's called PHP: Personal Home Page interpreter. The creator never planned for it to be a programming language, so throughout its entire lifetime, PHP has suffered from a lack of conceptual design.

The language's functionality was quite rudimentary compared to modern PHP or other web-related languages. In 1995, it was enriched with a couple of necessary things, like Perl-style $variables.

Let’s take a look at the early PHP code example from Wikipedia:

Early PHP code sample: SQL right in the template caused quite a few security issues
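It went roughly like this (the PHP/FI sample from Wikipedia, reproduced approximately from memory):

```php
<!--include /text/header.html-->

<!--getenv HTTP_USER_AGENT-->
<!--ifsubstr $exec_result Mozilla-->
Hey, you are using Netscape!<p>
<!--endif-->

<!--sql database select * from table where user='$username'-->
<!--ifless $numentries 1-->
Sorry, that record does not exist<p>
<!--endif exit-->
Welcome <!--$user-->!<p>
You have <!--$index:0--> credits left in your account.<p>

<!--include /text/footer.html-->
```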

Compare how it would approximately look in Django template language nowadays:

Modern template language equivalent. You won’t find raw SQL in templates nowadays, though.
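A sketch of the equivalent, assuming the view runs the query and passes entries (and the user) into the template context:

```django
{# assuming "entries" and "user" are supplied by the view #}
{% include "header.html" %}

{% if "Mozilla" in request.META.HTTP_USER_AGENT %}
  <p>Hey, you are using Netscape!</p>
{% endif %}

{% if not entries %}
  <p>Sorry, that record does not exist</p>
{% else %}
  <p>Welcome {{ user.username }}!</p>
  <p>You have {{ entries.0.credits }} credits left in your account.</p>
{% endif %}

{% include "footer.html" %}
```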

Over the years, PHP syntax slid towards C, and that did it no favors. Remember, the point of committing to that ugly syntax was performance and quick compilation. Why you would want to adopt it for an interpreted language used for web applications is a riddle to me. I have a hypothesis, though; let Rasmus answer for himself:

I don’t know how to stop it, there was never any intent to write a programming language […] I have absolutely no idea how to write a programming language, I just kept adding the next logical step on the way. (Rasmus Lerdorf)

All of that did not prevent PHP from dominating web programming for two decades, until the amount of shitty code written in it reached critical mass. Like a crappy pop song, it was played everywhere.

PHP 5 introduced proper object-oriented programming, but the amount of legacy code was so humongous that the community struggled for years to make the transition. PHP has gradually turned into the old sad thing your grandpa used to entertain his geek fellows with. Today, the undeniable leader of the past sits at place 8 in the TIOBE index (as of 2021), and it has lost its leadership to… Python. Who could've thought?

It’s time to learn Python, bebe.

Java

One more language that picked up the questionable C syntax was Java. Although compiled languages were significantly faster because they translated directly into machine code, the resulting programs lacked portability.

In 1991, James Gosling, Patrick Naughton, and Mike Sheridan started developing the first predecessor of Java, and five years later, in 1996, its first commercial version was released. Java introduced a new concept to programming: the virtual machine.

Instead of compiling and building the source code into an executable, as was the case with C++, Java uses middleware: an executable native to the architecture it is running on. This software reads preprocessed Java code, called bytecode, and executes it, acting as a processor architecture in its own right. Besides that, the virtual machine took on several important side tasks, like memory management and security, sparing developers those headaches and showing a clear advantage over compiled languages.
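The workflow in miniature (a sketch):

```java
// Hello.java: javac compiles this into Hello.class, platform-neutral
// bytecode; any JVM on any architecture then executes that same file.
public class Hello {
    public static void main(String[] args) {
        System.out.println("Hello from the JVM");
    }
}
// $ javac Hello.java   -> produces Hello.class (bytecode)
// $ java Hello         -> the local JVM interprets/JIT-compiles it
```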

Although Java originally targeted embedded systems, and it has conquered that landscape, its applications reach far beyond that.

"Write once, run anywhere" changed everything. Java has occupied a multitude of engineering areas, first and foremost corporate software and web development. Over its long history, Java has lived through dozens of releases, has changed owners, and nowadays symbolizes the monstrous, obese world of corporate programming. The concept of a virtual machine is a standard part of interpreted programming languages like Python and JavaScript nowadays.

Finding out the ASCII value of a character. Even wordier than assembler. Source: https://www.programiz.com/java-programming/examples/ascii-value-character
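Roughly that example:

```java
public class AsciiValue {
    public static void main(String[] args) {
        char ch = 'a';
        int ascii = ch;   // implicit widening from char to int
        System.out.println("The ASCII value of " + ch + " is: " + ascii);
    }
}
```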

Java is the next step in decoupling a program from its execution environment. The virtual machine freed the code from the bothersome details of the hardware, so the actual programming can concentrate on solving the task itself rather than juggling local computational resources. In web programming, where most of the execution time is spent awaiting I/O and computational performance is not that important, this was an enormous benefit.

When Google started its own competitor to the iPhone in 2008, it chose Java as the programming language for its new Android platform. After the start, it became quite clear that the standard implementation of the Java virtual machine suffers from unpredictable performance drawdowns, caused, among other things, by spontaneous garbage collection. Similar to Facebook, but with much bigger success, Google decided to touch the sacred cow of the Java world and create its own Java machine suitable for resource-critical, real-time end-user applications, commonly known as "apps." Android apps, to be precise.

By today's standards, Java is an unmaintainable monster. Its coding concept is too inflexible and dictatorial, and Java is losing its position to more approachable interpreted languages. In an attempt to overcome the crisis, multiple languages have appeared that run on the Java virtual machine but use a different syntax and include modern dynamic features like functional programming; the best known are Scala and Kotlin. Because they translate into Java bytecode and can be executed by the Java virtual machine, existing Java libraries can be pulled in and used as if they were Scala or Kotlin native code. This is a huge benefit, given the enormous amount of code inherited from Java's glory days.
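The interop point in miniature: a sketch of Kotlin calling a plain Java class as if it were Kotlin-native:

```kotlin
import java.util.ArrayDeque

fun main() {
    // java.util.ArrayDeque is a Java library class; no wrappers needed.
    val stack = ArrayDeque<String>()
    stack.push("Java")
    stack.push("Kotlin")
    println(stack.pop())   // prints "Kotlin"
}
```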

Java is the most distressing thing to hit computing since MS-DOS. (Alan Kay, I love that guy)

The constant decline of Java's rating over the last two decades, in comparison to Python
Thanks for reading. Subscribe for more interesting content!

Also, check out my recent articles:

🍑 Fullstack my backend: why full stack developer is a myth

🕒 Why your software quality degrades with time: short story

🐍 RapidAPI: first steps with Python: a comprehensive guide
