History of Unix and Linux: From PDP-7 to Modern Operating Systems

Shreyas Matade
7 min read · Jun 29, 2023


For many engineers and software developers, the terms Unix and Linux are commonplace. Yet beneath these familiar terms lies a complex web of history, development, and variation that is often misunderstood or overlooked. A common misconception is that a software tool or utility designed for Linux will function identically across all Linux distributions or Unix variants. In practice this is not always the case, and such assumptions can lead to unexpected results or technical hitches.

Understanding the history and evolution of Unix and Linux is not just an exercise in computing archaeology. It’s a valuable tool for comprehending the differences between various Unix and Linux distributions. This knowledge can help you anticipate potential compatibility issues, make informed choices about the right systems for specific applications, and navigate the diverse ecosystem of Unix-like and Linux-based systems with greater ease.

This article delves into the history of Unix and Linux, from their inception to their modern iterations. We’ll explore the origins, the key turning points, and the distinct philosophies that have shaped these operating systems and their offshoots.

Origins of Unix (1960s-1970s)

The foundations of Unix were laid in the late 1960s and early 1970s at Bell Labs, the research arm of AT&T. The key figures in Unix’s birth were Ken Thompson and Dennis Ritchie, among others. The spark that lit the flame was Multics (Multiplexed Information and Computing Service), an ambitious joint project that aimed to create an interactive, multi-user operating system. When Bell Labs withdrew from the project in 1969, Ken Thompson began experimenting on a PDP-7 minicomputer, developing a simplified version of the system. This system, initially written in assembly language, was the precursor to Unix. The leap from assembly to a more convenient and powerful language happened in 1973, when Unix was rewritten in C, a language developed at Bell Labs by Dennis Ritchie. This was a significant move, as it made Unix portable across different hardware platforms.

Spread of Unix (1980s)

In the 1980s, Unix began to gain significant traction beyond Bell Labs. It became popular within academia, government institutions, and businesses thanks to its efficiency, adaptability, and powerful networking capabilities. One of the major milestones in Unix’s history is the Berkeley Software Distribution (BSD). Developed at the University of California, Berkeley, BSD became a popular variant of Unix, though it later became embroiled in legal disputes with AT&T over Unix copyrights, culminating in the USL v. BSDi lawsuit of the early 1990s. Meanwhile, to ensure interoperability between different Unix systems, the Portable Operating System Interface (POSIX) standard was created, defining the core interface and behavior of a Unix-style operating system.

The GNU Project and the Birth of Linux (1980s-1990s)

It’s important to note that an operating system is not just the kernel, but a whole suite of software. The kernel acts as the bridge between the hardware and software of a system, but you need many other tools to get a complete and usable operating system. This is where the GNU project and the Linux kernel come together to give us what we commonly refer to as “Linux”.

The GNU Project

Before the Linux kernel came into existence, there was the GNU project. Launched by Richard Stallman in 1983, the GNU project aimed to develop a complete Unix-like operating system composed entirely of free software. This was part of a broader “Free Software Movement” in which Stallman and others advocated for the freedom to run, study, modify, and redistribute software. “Free” here stands for freedom, not price. It’s important to differentiate it from the term “open source”, which came later and has a slightly different philosophy, although both drive the development of a large body of the software we use today.

By the early 1990s, the GNU project had successfully created or gathered most of the components necessary for their system: libraries, compilers, text editors, a shell, and many other utilities and tools. However, one crucial component was missing — the kernel.

The Birth of the Linux Kernel

In 1991, a computer science student from Finland named Linus Torvalds started a personal project to create a free and open-source Unix-like kernel. This was the birth of the Linux kernel. Torvalds made his kernel available to others and soon, a community of developers from around the world began contributing to its development.

Unlike the microkernel designs then in academic favor (such as MINIX, which Torvalds had been using), his kernel was monolithic: all core OS services run in a single kernel address space, which avoids the overhead of passing messages between separate user-space servers. Importantly, it was also compatible with the software written for the GNU system.

The Marriage of GNU and Linux

Despite their independent beginnings, the free software nature of both projects allowed them to be combined. The Linux kernel was integrated with the GNU system to form a complete and fully functional operating system. This combination is technically known as GNU/Linux, but it’s often referred to simply as “Linux”, much to the chagrin of some in the free software movement.

Today, when we talk about “Linux”, we’re typically referring to this combination of the Linux kernel with the GNU system, along with other software from various sources. This is important to remember: the kernel alone does not make an OS. Linux, as we know it, is a rich ecosystem of software, communities, and philosophies, all working in concert.

The birth of Linux marked the rise of freely available, open-source operating systems that spurred technological innovation, fostering an era of unprecedented growth and evolution in the IT world.

The Rise of Linux Distributions: A Symphony of Variation (1990s-Present)

One remarkable facet of Linux lies in its vast assortment of distributions. Although all distributions utilize the Linux kernel at their core, each varies significantly in system software, libraries, package management, and configuration tools. This diversity has allowed Linux to cater to a wide range of users and use-cases, from servers and supercomputers to smartphones, desktops, and embedded devices.

Defining Linux Distributions

The term ‘Linux distribution’ (or simply ‘distro’) refers to a complete operating system comprising the Linux kernel, GNU system software, and additional applications, packaged by various organizations or individuals. Many distros also include a graphical user interface (GUI) on top of the base system. Distributions differ in philosophy, community, ease of use, and stability.

Early Birds: Slackware and Debian

Among the earliest distributions were Slackware and Debian, both first released in 1993. Slackware, the oldest surviving distribution, prioritizes simplicity, stability, and adherence to Unix-like design principles. Debian, meanwhile, emphasizes free software, stability, and support for multiple architectures. Both serve as foundations upon which numerous other distros have been built.

The Advent of Red Hat and Commercial Distributions

Red Hat Enterprise Linux (RHEL), a commercial distribution launched in the early 2000s, brought enterprise-level support, consistency, and longevity. RHEL’s influence spread to other distros like CentOS, a community project offering RHEL’s enterprise-level stability for free, and Fedora, a community-driven distro that often serves as a testing ground for new technologies destined for RHEL.

Ubuntu: User-Friendliness Personified

Ubuntu, released in 2004 by Canonical, made Linux significantly more accessible to average users. It introduced a user-friendly installer, broad hardware support, a polished GUI, and a predictable release schedule, making Linux a viable choice for everyday computing needs.

Package Management: A Key Differentiator

Each distro has a unique package management system that simplifies the process of installing, updating, and removing software. For example, Debian and its derivatives use the Advanced Packaging Tool (APT) with .deb packages, while Red Hat and Fedora use the Yellowdog Updater, Modified (YUM) or DNF with .rpm packages.

Package management systems not only make software maintenance easier but also resolve dependencies, so that installed software functions as intended. They are also the reason a package built for one distro may not work on another.

GNU Libraries and The Toolchain: Ensuring Compatibility

GNU libraries, particularly the GNU C Library (glibc), are fundamental components of most Linux distributions. glibc provides the wrappers around system calls and basic library functions such as malloc, printf, open, and exit. These routines are essential building blocks for programs written in C and C++.
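To make this concrete, here is a minimal C sketch that exercises each of those glibc-provided routines (the file path /etc/hostname is just an illustrative choice; any readable file works):

```c
#include <fcntl.h>   /* open() and O_RDONLY */
#include <stdio.h>   /* printf() */
#include <stdlib.h>  /* malloc(), free(), exit() */
#include <unistd.h>  /* read(), close(), ssize_t */

int main(void) {
    char *buf = malloc(64);      /* heap allocation via glibc */
    if (buf == NULL)
        exit(EXIT_FAILURE);      /* exit() flushes stdio buffers, then terminates */

    int fd = open("/etc/hostname", O_RDONLY);  /* thin wrapper over the open(2) syscall */
    if (fd >= 0) {
        ssize_t n = read(fd, buf, 63);
        if (n > 0) {
            buf[n] = '\0';
            printf("hostname: %s", buf);       /* formatted output via glibc */
        }
        close(fd);
    }

    free(buf);
    exit(EXIT_SUCCESS);
}
```

Every one of these calls is resolved against glibc when the program is linked, which is exactly why the library’s version matters, as discussed next.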

However, the version of glibc is critical for binary compatibility. If a binary was built on a system with a newer glibc version, it may not run on a system with an older version of glibc. This is because the binary may reference symbols (i.e., functions or variables) not present in the older glibc.

The toolchain is another crucial component. It’s a set of programming tools, including a compiler, linker, and debugger, often used together in software development. The most common toolchain on Linux is the combination of the GNU Compiler Collection (GCC), GNU Binutils, and glibc.

These tools are typically used in concert to compile source code into runnable binaries. Differences in the versions of these tools can lead to binary incompatibilities. For instance, if a program is compiled with a feature only available in a newer version of GCC, the resulting binary might not run on systems with older GCC versions.
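As a small illustration of how the compiler version leaves its fingerprint on a build, GCC exposes its own version through predefined macros, which a program can report or use to gate version-specific code paths. A minimal sketch:

```c
#include <stdio.h>

int main(void) {
    /* GCC defines these macros at compile time; their values are baked
       into the binary, recording which compiler produced it. */
#if defined(__GNUC__)
    printf("Compiled with GCC %d.%d.%d\n",
           __GNUC__, __GNUC_MINOR__, __GNUC_PATCHLEVEL__);
#else
    printf("Not compiled with GCC\n");
#endif
    return 0;
}
```

The same idea applies at the feature level: code guarded by a version check like this is simply excluded when built with an older compiler.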

In short, the versions of glibc and the toolchain components can directly impact whether a program can run on a given system. Software developers often have to consider these factors when developing applications meant to run on various Linux distributions. As such, they might need to build their software on older systems or utilize compatibility libraries to ensure the broadest possible compatibility.
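One practical way to see which glibc a program is running against is glibc’s own version-reporting function. Below is a minimal sketch, with the caveat that <gnu/libc-version.h> is a glibc-specific extension: it won’t compile on distributions that use a different C library, such as musl-based Alpine Linux.

```c
#include <gnu/libc-version.h>  /* glibc extension, not part of POSIX */
#include <stdio.h>

int main(void) {
    /* Prints the version string of the glibc this binary is linked
       against, e.g. "2.35" (the exact value depends on the system). */
    printf("glibc version: %s\n", gnu_get_libc_version());
    return 0;
}
```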

Linux Today: Diversity and Specialization

Today, there are hundreds of Linux distributions, each catering to different user types or system requirements. Some, like Alpine Linux, are optimized for size and speed to run in resource-constrained environments. Others, like Kali Linux, come pre-packaged with tools for specific tasks such as network analysis and penetration testing.

Conclusion

I am leaving you with a pictorial view found on the internet. There are several such diagrams available; I feel this one is fairly accurate and covers most of the flavors derived from or influenced by Unix.

History of Unix — Wikipedia

References

http://www.netneurotic.de/mac/unix/images/UNIX.png
