Insanity over Innovation

The Software Professional's Quest for Historic Insignificance

Kelly Curtis
Apr 15 · 13 min read

Here we are in 2021; the World is changed. Covid has solidified our dependence on the Internet forever, and Intellectual Property around the Internet means nothing. Before the mid-1990s, telecommunication technology could not transmit data at any practical rate. The Internet itself runs on a set of standards and protocols that are accepted industry-wide. The conceptual framework for this communication is the Open Systems Interconnection (OSI) model, which describes how data flows across a network in seven layers. It is perhaps the most widely accepted reference model in the technology field.

The introduction of this new model, which allowed any two hardware devices to communicate with each other, created a whole new world of problems for Software Professionals. No longer were software systems physically constrained in the same way; machines hundreds of miles apart could exchange data in milliseconds. This created the initial need for tools that could handle higher-level systems design and implementation.

Thus begins the quest for control of industry-standard programming languages, operating systems, and tools. Things like Rapid Application Development tools become mandatory as Time to Market drives economic decisions around technology. This created opportunities for companies to develop tools to help other software engineers. Enter Microsoft, Oracle, Google, and others; the wars begin.

There are three primary camps for programming language tools for Software Professionals:

When a software engineer needs access to physical resources on a hardware device, they rely on low-level programming languages: high-performance computing, graphics, IoT devices, firmware, and custom microcontrollers. Essentially, the focus is on HOW the machine performs the algorithm in question.

When a software engineer needs to express a HUMAN-driven decision, they rely on high-level programming languages: business applications, data-driven systems, business intelligence, analytics, and general computer use cases. Essentially, the focus is on WHAT the machine needs to do for you.

When a software engineer needs to perform a highly focused Computer Science task, they rely on more specialized languages: 4th generation languages, SQL, Real-time analytics, no/low-code, ETL, and custom needs. Essentially, the focus is on the NICHE needs.

C, C++, and the low-level languages

In the beginning, programming meant physical punch cards running through the Computer. The machines were the size of buildings and required a lot of people to manage. 1st and 2nd generation languages dealt with those issues. There is something magical about the number 3, and as the pattern predicts, 3rd generation languages strike that delicate balance between machine- and human-readable instructions.

One of the most influential 3rd generation languages is C. It allows for human-readable instruction sets that still give fine-grain control of machine resources. Closely connected to the *nix community, it creates the essential standard for Software syntax: the C syntax. It is a set of symbols, patterns, expressions, and constructs that has existed for 49 years. C and its descendants maintain high popularity today and are the go-to tools for low-level programming. Most programming languages today have some semblance of C in them.

Open Source and Java

Supported by UC Berkeley and other academic ventures, the *nix community creates the Open Source Software models we have today. This comes at a time when having Software run on multiple Operating Systems starts to matter, as Linux grows in popularity and distributed computing needs emerge. Sun Microsystems develops Java to meet this need; Oracle later acquires Sun, and Java with it, in 2010.

The Java Virtual Machine (JVM) is designed to entirely abstract the hardware and the operating system at the software level. Java is easy to use and manages physical hardware resources for you. It becomes entrenched very quickly in the Software Development Community because it runs on all the major operating systems. Google adopts Java as the primary application language for its Android operating system to support the mobile devices market.

In 2021, Oracle loses a historic copyright case against Google over Java. The United States Supreme Court rules that Google’s copying of the Java API declarations for Android qualifies as “Fair Use.”

Microsoft and C#

IBM’s inability to see the value in Software over Hardware in the 1980s led to Microsoft having a stranglehold on the market. Most personal computers (PCs) were running Microsoft technology. With their foot so deep in the market, they tried to set and control industry standards. Microsoft needed a tool that competed with Java. They went to great lengths to create C# and the .NET framework, hiring Anders Hejlsberg, the architect of Delphi’s Object Pascal, to design a new general-purpose language to move the market. Like Java, it is easy to use and manages physical hardware resources for you. Due to the sheer number of machines running Windows, C# grows in popularity very quickly.

Internet Browsers and Javascript

In the late 1990s, internet browsers become the go-to for computer usage as the Internet begins to entrench itself in our lives. Netscape develops Javascript in 1995 (the most poorly named language, as Java means something entirely different). It is a loosely typed scripting language meant for simple manipulation of the Document Object Model (DOM) in a browser. As the dotcom boom happens, more Software is written in the browser, driving Javascript’s popularity into the top 10 almost overnight. Because it was not designed for anything beyond the browser, Javascript professionals started building tools and runtimes, such as Node.js, to handle a more diverse set of technical needs. Thus begins the industry’s quest for reinvention over innovation, as every major component is RECREATED in other programming languages.

Apple, Objective-C, and Swift

1990–2010 led to some of the most radical increases in hardware performance. Devices were getting smaller and faster, and we had new ways to interact with the machine, like touchscreens. Apple develops the iPod, and personal computing is never the same. iOS and the iPhone become an unbeatable force in the market. They use custom hardware, a custom operating system, and a custom development toolkit. Apple adopted Objective-C (created in the early 1980s and brought in through NeXT) and the Cocoa framework for app development. This decision follows Apple’s history of having complete control over the hardware and software development process. By demand of the hardware alone, the language and tools Apple backed became popular in the community. They continue down their custom path with Swift and new development tools.

In April 2021, Objective-C dropped below the top 20 in the TIOBE index of programming language popularity. After a decade as one of the most popular languages, it is now essentially dead, with FORTRAN ranking above it. A classic example of how new languages force REINVENTION over innovation: all Objective-C projects will need to be rewritten, reworked, refactored, and reinvested in to keep them functional.

Python and 4th Generation Languages

It is impossible to have a discussion about programming languages without Python. It is one of the most popular languages globally due to its ease of use, simple syntax, and vast libraries and frameworks. It deviates from the C syntax, and it gives up fine-grain control of the machine in favor of readability and simplicity. It enables very specialized workloads from industry leaders in fields outside of Computer Science.

Most 4th generation languages have a similar catch: they give up control of machine-level instructions in exchange for higher-level abstractions that are easier to work with. The Structured Query Language (SQL) standard comes out of the need to process, transform, and work with set-based data. Not designed to handle machine-level instructions in any capacity, it focuses on WHAT you need to do with the data.

Google, GoLang, and Rust

Google is a technology giant that owns the World’s biggest internet search engine. As more users got on the Internet, Google started to focus on more technical endeavors. It built the Android platform around Java and designed a mobile application ecosystem to compete with Apple and the iOS platform.

In 2009, Google released GoLang, a new general-purpose systems programming language. Designed to compete with Java, C++, C#, and already EXISTING technology, GoLang offers nothing new beyond its built-in concurrency mechanics. The only significant adoptions of GoLang come from Silicon Valley, Docker and Kubernetes being the biggest. Its syntax breaks a lot of the C syntax standard and forces existing professionals to relearn simple things. Peaking in popularity in 2017, GoLang has been in decline. Another classic example of a new programming language that recreated the proverbial wheel, only to lose traction after a few years.

Mozilla, the company responsible for Firefox, one of the World’s leading internet browsers, created Rust. It is a new approach to memory management in programming languages. The only thing Rust brings to the table is Memory Safety. Used to develop cryptocurrencies, low-level components, and small deployable applications, Rust has virtually no significant corporate support. Even with the one million dollars secured by a non-profit foundation, it is doubtful Rust will ever compete with Java, C#, C++, or even GoLang (each with billion-dollar backing). The syntax and expressions in Rust break conventions and standards in the community. The Memory Safety tooling should have been an extension of C++ instead of a whole new ecosystem and syntax expression model. Rust fails not for the memory safety features but on the REINVENTION of so many basic concepts.

Historical Significance

What makes something historically significant? This is far from easy to answer, not least because programming languages have only existed for about 75 years. How can one know today whether something will have long-term traction?

To answer this, let us look at other fields in the World.


Ask any PROFESSIONAL author of English literature if they can read Charles Dickens; 100% will say: “Yes, mandatory reading.”


Ask any PROFESSIONAL classical Musician if they can read Mozart; 100% will say: “Yes, mandatory reading.”


Ask any PROFESSIONAL Mathematician if they can read Pythagoras; 100% will say: “Yes, mandatory reading.”


Ask any PROFESSIONAL Chemist if they can read and understand Boyle’s law; 100% will say: “Yes, mandatory reading.”

Each of these people lived in a world before the invention of digital media. Their ideas continue to be preserved by industry leaders, academic institutions, and people around the World. They all contributed in different ways that reshaped the World. Humankind can still read and understand each one’s respective output in its original form.

The common denominator here is that HUMANS can still read the output from these people. Even if it requires some translation, the EFFORT burden is more than worth the INVESTMENT. People, HUMANS are the center of historical significance.

Only when a generation after the creator’s death can read, study, and master the techniques does historical relevance mean anything. This is what separates every other field in the World from Software. Only in the field of Software is there no importance placed on the historical preservation of ideas. Having online code repositories is NOT the same as having people who can read and work with that code in ten years.

Historical Insignificance

Let us move to Computer Science

Margaret Hamilton is the software engineer responsible for the code that sent the human race to the Moon. She led development of the onboard flight software that allowed the Apollo Guidance Computer to conserve resources and manage system overloads. A brilliant software engineer and computer scientist, her contribution should be well known, especially in the 21st century, when Software Engineering is among the most in-demand skill sets in the World.

Ask any Software Professional if they can read Margaret Hamilton’s code; 98% will say: “uh… who is Margaret Hamilton? What language was it? Fortran? Oh yeah, those Fortran people are idiots; they don’t know anything about code; [Insert Language of Choice] is the only important one.”

Here we have a scientist, engineer, and Software expert sending humans to the Moon. Her work changed the World, and yet, barely 50 years after Apollo, there are virtually NO PROFESSIONALS who can read and understand her actual work. A straight-up embarrassment to the field of Computer Science and Software. The root cause is the endless variation in basic syntax and expression.

Let us take the Pythagorean theorem:

a² + b² = c²

Pretty simple, with no variations for people to translate and understand.

Software Professionals:


Python:

import math

a = 3
b = 4
c = math.sqrt(a ** 2 + b ** 2)


C#:

int a = 3;
int b = 4;
double c = Math.Sqrt(Math.Pow(a, 2) + Math.Pow(b, 2));


Go:

var a float64 = 3
var b float64 = 4
var c = math.Sqrt(math.Pow(a, 2) + math.Pow(b, 2))


C:

int a = 3;
int b = 4;
int c = sqrt(pow(a, 2) + pow(b, 2));


Java:

int a = 3;
int b = 4;
double c = Math.sqrt(Math.pow(a, 2) + Math.pow(b, 2));


Javascript:

var a = 3
var b = 4
var c = Math.sqrt(Math.pow(a, 2) + Math.pow(b, 2))

Visual Basic:

Dim a As Integer = 3
Dim b As Integer = 4
Dim c As Double = Math.Sqrt(Math.Pow(a, 2) + Math.Pow(b, 2))

Here we are with a SIMPLE algorithm, and there is already a tremendous amount of expression variation. Software Professionals will fight you until you are blue in the face that Syntax is the WORST reason to discredit any new Programming Language. Yet these same professionals can’t read or understand programming languages from 35 years ago — more than a little ironic.

Syntax variation is the number one reason Technical Debt is so challenging. Technology stacks that were bleeding edge only a few years ago are constantly refactored in favor of new syntax expressions for the same logic. The logic does NOT change, but the Software Professional mandates that Syntax be changed and updated according to some Silicon Valley false standard. The Software practitioner will produce a million reasons why Syntax needs updating: keeping skills current, security, features, ease of use, saving money, spending money, and countless others.

In the Software field, they can NOT agree on something even as simple as a semi-colon (;). Curly braces, brackets, indexers, parameters, function declarations, namespaces, and other symbols MUST come with a variation. Even the International Organization for Standardization (ISO) can NOT enforce or keep new ideas within any standards. Fights ensue over this, and yet Professionals cannot read languages from 35 years ago — talk about embarrassing for Software Professionals. Hardware professionals and Networking professionals adhere to the OSI, ISO, and other standardizing bodies, but NOT the Software Professionals.

Here’s the catch: Wikipedia, the World’s biggest online encyclopedia, is written in about 280 languages. All those languages took the Human race at least 10,000 years to produce, refine, and preserve. Let us say they encompass 10 BILLION individual human beings throughout history, spanning every discipline in the World. That is a ratio of 10,000,000,000 / 280 ≈ 36 million people per language. Pretty impressive, actually: tens of millions of humans can COMMUNICATE with each other in each one.

In the field of Software, some 700 programming languages exist. Seven hundred in only 75 years; Wow! That is a lot. How many people in the World today can read and understand Software? This article said 18.5 million developers in 2013, so let us say 20 million people now. That is a ratio of 20,000,000 / 700 ≈ 28,571 people per programming language. Even counting only the top 20 programming languages, that is 1 million people per language, barely an eighth of New York City. With 7 billion people on the planet, about 0.28% of the World can read Software code at all, let alone across 700 languages. Pretty NOT impressive; actually, more of a major embarrassment to Humankind.

How Many Programming Languages are there in the World? | CodeLani

There are 18.5 million software developers in the World — but which country has the most? — TechRepublic

Insanity over Innovation

There are currently 62,000 technology jobs listed. The field is barren and lacking significantly in Professionals. The fact that the few of us who understand Computer Science and Software cannot agree on ANYTHING creates a HUGE drain on the industries that depend on Software Systems. There are NOT enough Professionals to go around, and it is only made worse by the sheer amount of HUMAN-level translation going on between programming languages.


In 2021, if a software professional CANNOT EXPRESS a solution using C, C++, Java, C#, Python, Javascript, or some existing tool or language, it is a 100% guarantee that the problem is a lack of Computer Science knowledge and experience. There is NO NEED for any NEW programming language syntax and expression variations.

Yet, so it goes in the Software field. A small handful of Software Professionals who lack any historic awareness create a new programming language under the pretense that it MUST be a language problem. They care nothing for the technical debt they create, the rework they cause, the waste they generate, the fracturing of professional attitudes, or the intellectual effort required to translate new languages into old ones. History be damned; IT MUST BE A NEW SYNTAX.

It is INSANITY over INNOVATION. Even today’s greatest Software minds are headed straight for Historic Insignificance for no other reason than their work will be unreadable by future practitioners. At the rate things are changing, REINVENTION of the fundamentals is overshadowing the value of the INNOVATIONS.

We are doomed for continuous FAILURE.

Ask yourself honestly: Is there a NEED for NEW programming languages? The answer: NO. We only NEED new language extensions, compilers, and frameworks.

