Six Eras of IT — Observing Patterns in Technology & Platform Cycles (Part 1)

Zorawar Biri Singh
Dec 7, 2018

--

An exchange between interviewer David Sheff and Steve Jobs in Feb 1985:

David Sheff: “We were going to say guys like you and Steve Wozniak, working out of a garage only ten years ago. Just what is this revolution you two seem to have started?”

Steve Jobs: “We’re living in the wake of the petrochemical revolution of 100 years ago. The petrochemical revolution gave us free energy — free mechanical energy, in this case. It changed the texture of society in most ways. This revolution, the information revolution, is a revolution of free energy as well, but of another kind: free intellectual energy. It’s very crude today, yet our Macintosh computer takes less power than a 100-watt light bulb to run and it can save you hours a day. What will it be able to do ten or 20 years from now, or 50 years from now? This revolution will dwarf the petrochemical revolution. We’re on the forefront.”

The last several decades of information technology (IT) evolution are indeed an ongoing revolution of ‘free intellectual energy,’ one that has unfolded in distinct, observable eras:

Figure 1: The Six Eras of IT

Illustrated above is how I currently think of the Six Eras of IT. (Note: this is a work-in-progress list; it covers notable technology and platform developments in both consumer and enterprise/B2B, and is by no means exhaustive or complete.)

The “information revolution” is generally accepted to have begun around 1969–71. At least the first five of these epochs have been well documented, especially within enterprise/B2B IT fields. I believe we are currently evolving into a sixth era; it is unclear exactly when it began, but my current estimate is around the end of 2015. Although I have labeled it here simply as “AI + Robotics + IoT + Edge,” I think this Sixth Era is significantly more meaningful than just a collection of technology trends. I’m calling it the Age of Shared Cognition (amongst humans, machines, and autonomous systems) and hope to explore what the next couple of decades will hold, especially as the dual scarcities of *attention* and *trust* pose enormous challenges for society today.

Past platform and infrastructure transitions across these different IT eras have shown us some well-recognized patterns. One pattern, in particular, is that software abstraction and automation consistently emerge as first design principles behind nearly all successful platforms.

The mainframe-to-client-server transition during the late ’70s and ’80s disaggregated strict batch-processing workflows and time-share computing down to end-user PCs and their networked, back-end servers. Similarly, web-based platforms from the ’90s saw a massive transition to mobile and on-demand cloud platforms enabled by incredible innovations in virtualization, silicon, and SoC technologies. Today, with tens of billions of devices coming online across low-latency networks and near-ubiquitous, cheap, and virtually unlimited computing power available just a few clicks away, we see incredible data-driven digital transformations occurring across all forms of industry. Recent advances in machine learning/AI technology ‘stacks’ are re-imagining our user interfaces with voice, image recognition, and augmented/virtual reality (AR/VR).

One of my favorite examples of where these abstraction and automation patterns manifest themselves is the history of accounting and business/financial software.

For several thousand years of human history, the business processes and workflows of accounting remained essentially unchanged until 1494, when Luca Pacioli codified the double-entry bookkeeping system used by Venetian merchants, and accounting ledgers and journals became the norm. From then on, the accounting profession would remain mostly unchanged and entirely paper-based until the ’70s. There was little technological advancement in the trade other than perhaps the quality of ledger paper and ballpoint pens!
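To make the double-entry mechanics concrete, here is a minimal sketch in Python (the account names and amounts are purely illustrative): every transaction is recorded as a set of debits and credits that must balance before it can be posted.

```python
from dataclasses import dataclass

@dataclass
class Line:
    account: str          # e.g. "Cash" or "Sales Revenue" (hypothetical account names)
    debit: float = 0.0
    credit: float = 0.0

def post_entry(ledger, description, lines):
    """Record a journal entry only if total debits equal total credits."""
    debits = sum(l.debit for l in lines)
    credits = sum(l.credit for l in lines)
    if round(debits - credits, 2) != 0:
        raise ValueError(f"Unbalanced entry '{description}': {debits} != {credits}")
    ledger.append({"description": description, "lines": lines})

ledger = []
# A cash sale of 500: debit Cash, credit Sales Revenue
post_entry(ledger, "Cash sale", [Line("Cash", debit=500.0),
                                 Line("Sales Revenue", credit=500.0)])
```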

When IBM and other manufacturers like Burroughs, UNIVAC, NCR, and Honeywell introduced mainframes in the ’60s as a novel way to automate bulk data processing and transaction processing, there was little appeal for businesses in using costly ‘big iron’ and remote job entry (RJE) procedures for the tedious work of ‘balancing the books.’

Around this time in the US, accounting practices and workflows were formalized by governing bodies like the FASB and AICPA, and the Generally Accepted Accounting Principles (GAAP) were introduced in the ’70s.

Paper-based processes and human-powered bookkeeping would continue to reign. Until they no longer did.

After IBM introduced the PC in the ’80s, and paper-based ledgers gave way to electronic spreadsheets in Lotus 1-2-3, the impact was as significant as that of the original double-entry system. All of a sudden, several accounting software companies appeared, and the industry underwent a massive transformation as books could be closed electronically in a matter of minutes. Early pioneers like JD Edwards, TurboCash, and Quicken gave way to robust, on-premises ERP software suites from Oracle and PeopleSoft in the ’90s. Then, as cloud infrastructure and application platforms matured in the 2000s, SaaS pioneers like NetSuite and Zuora flourished as they took business financial planning and subscription-management workflows and codified them in the cloud.

Today, the likes of Anaplan, Adaptive Insights, Intuit, Sage, and Xero are applying AI and machine learning to re-imagine what cloud-based financial planning and collaboration will look like in the future. Automating many mundane tasks with hyper-accuracy is just one of the reasons accountants are likely worried.

…software abstraction and automation consistently emerge as first design principles behind nearly all successful platforms.

A second pattern that appears across these technology epochs is the massively increased speed of adoption or technology ‘diffusion’ amongst end-users.

Figure 2: Rates of Technology Adoption

The chart above, curated by economist Max Roser [link], illustrates how quickly a range of technologies ‘diffuse,’ or get adopted, amongst US households, measured as the percentage of total US households that have access to or have adopted a particular technology over time. What stands out here is that the relative slope, or rate of adoption, increased significantly over the past sixty years compared with earlier periods.
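To make ‘slope’ concrete, here is a toy logistic diffusion model (purely illustrative, not Roser’s underlying data); the rate parameter r controls how steep the S-curve is, i.e., how quickly a technology spreads once it takes hold.

```python
import math

def adoption(t, r, t_mid, ceiling=100.0):
    """Logistic S-curve: % of households that have adopted by year t.
    r is the diffusion rate; t_mid is the year of 50% adoption."""
    return ceiling / (1.0 + math.exp(-r * (t - t_mid)))

# Two illustrative ramps: a slow, early-20th-century-style diffusion
# vs. a fast, modern one (parameters are made up for illustration).
for year in range(0, 41, 10):
    slow = adoption(year, r=0.15, t_mid=20)
    fast = adoption(year, r=0.50, t_mid=10)
    print(f"year {year:2d}: slow {slow:5.1f}%   fast {fast:5.1f}%")
```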

Why is that?

One explanation for this second pattern of relatively faster adoption is that impressive technological progress also got dramatically cheaper over this period: Moore’s Law coming of age, improved microprocessor clock speeds, greater capacity, network power laws, etc. [link]

Another more subtle reason comes from the patterns of software abstraction and automation themselves.

Disaggregating monolithic (or hard-coded) business workflows by systematically unpacking the underlying technology ‘stacks’ and re-packaging as much as possible into software code is the first step toward software ‘eating the world.’ [link]
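As a rough illustration of what that re-packaging looks like, the sketch below (in Python, with hypothetical function and account names) wraps a formerly manual ‘close the books’ workflow behind a small programmatic interface; an ERP or SaaS product would expose something like this over an HTTP API.

```python
def trial_balance(ledger):
    """Sum debits minus credits per account -- the step a bookkeeper once did by hand."""
    balances = {}
    for entry in ledger:
        for line in entry["lines"]:
            acct = line["account"]
            balances[acct] = balances.get(acct, 0.0) + line["debit"] - line["credit"]
    return balances

def close_books(ledger, period):
    """A hypothetical 'close the books' call: validate and summarize the ledger."""
    balances = trial_balance(ledger)
    if round(sum(balances.values()), 2) != 0.0:
        raise ValueError("ledger out of balance")
    return {"period": period, "balances": balances}

# Example: a single balanced cash sale, followed by a period close
ledger = [{"description": "Cash sale",
           "lines": [{"account": "Cash", "debit": 500.0, "credit": 0.0},
                     {"account": "Sales Revenue", "debit": 0.0, "credit": 500.0}]}]
print(close_books(ledger, period="2018-Q4"))
```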

When these workflows (abstractions) are also further connected or expanded into their natural networks with the additional benefit of well-thought-out APIs, then high-growth, rapidly adopted platforms emerge:

Figure 3: Adoption Ramps, by Horace Dediu [link]

As an example, the chart on the right shows incredible stats: Microsoft took about twenty years to reach 1 billion computing users, while Apple and Google each reached 1 billion mobile users in under ten. The chart on the left is even more remarkable, as it models the blistering adoption rates for ride-sharing currently being fought over by v1.0 ride-share and v2.0 ‘micro-mobility’ platforms.

Going back to my illustration of the Six Eras in Figure 1 above, we can consistently trace these two patterns, which demonstrate how well-executed, top-down abstractions of common application logic and data structures into smaller, reusable building blocks simplify infrastructure and platforms and accelerate their adoption.

In the mid-2000s, we then see a third pattern appear, which has to do with transformations in IT operating models.

Traditional IT organizations, ones with formal, centralized IT functions under CIOs, took shape after client-server/desktop computing replaced mainframes in the ’80s, and these roles grew significantly amongst F500 corporations. Before that, mainframe operators would sit in ‘data center’ facilities, running time-share computing jobs submitted via remote job entry (RJE) requests. Then, in the ’90s and 2000s, well-known IT jobs like system admins, DBAs, VM admins, and help desks emerged to support business requirements that relied heavily on IT to enable business intelligence (BI), knowledge management, and financial and end-user applications. As knowledge workers increasingly became mobile, remote workers, enabled by simpler devices and better networking (VPNs), the complexities of IT support and security-driven control overwhelmed CIOs and their teams. IT roles were now frowned upon; it became popular to dismiss central IT as an ineffective ‘cost center,’ and end users and business departments sought out simpler, self-service options.

So it was around the mid-2000s that we start to recognize a third consistent pattern, where the central IT operating model of ‘command and control’ gives way to popular cloud-enabled, ‘bring-your-own-device’ (BYOD) self-service models amongst lines-of-business (LOB) and developers. In data centers as well, the traditional, strict IT approach to managing dedicated, on-premises infrastructure evolves as ‘shadow IT’ developers using public cloud infrastructure (IaaS, PaaS, SaaS, etc.) become creators and producers themselves, with much LOB fanfare.

“…the public cloud has now established a ‘rate card’ for IT” — Paul Maritz, then CEO of VMware in 2012
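To see how concrete that ‘rate card’ is: with a public cloud SDK, a single developer can provision infrastructure in a few lines of code, with no ticket to central IT. Below is a minimal sketch using AWS’s boto3 library (the region, AMI ID, and instance type are placeholders, not recommendations):

```python
import boto3

# Assumes AWS credentials are already configured for this developer.
ec2 = boto3.client("ec2", region_name="us-west-2")

# Launch one small virtual machine; its hourly price is the 'rate card' in action.
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder AMI ID
    InstanceType="t3.micro",
    MinCount=1,
    MaxCount=1,
)
print(response["Instances"][0]["InstanceId"])
```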

It’s this non-trivial change to the IT operating (and delivery) model brought on primarily by the cloud and mobile users that is a significant development across the Six Eras. This takes us back to Steve Jobs’ notion of an ongoing ‘revolution of free intellectual energy.’

Going forward, almost all businesses are shaping clearer strategies to adopt cloud-first operating models. Increasingly, developers and LOB application owners are incorporating data science and real-time analytics across all business functions. The digital transformation strategy ‘boom’ we see everywhere reflects a business desire to benefit from faster product development cycles and to get to the data-driven real-time applications of tomorrow.

I find myself constantly curious and optimistic about technology improving our lives and the condition of our planet, so I enjoy connecting dots and learning about patterns. In sharing some of the thinking behind the Six Eras of IT, I plan to use the framework to explore additional observations and ideas. For instance, I’m quite interested in what 2019 holds for CIOs and their IT teams’ abilities to navigate the next transitions for their hybrid cloud* environments (*includes the term ‘multi-cloud’).

Check out:

Enterprise IT in the Sixth Era — Thoughts on The Future of Hybrid Cloud IT.

Up Next:

Six Eras of IT — Observing Patterns in Technology and Platform Cycles (Part 2)

Stay tuned.

Thanks for visiting, and please do share your feedback.
