Economics 101 For Engineers
Unless you are part of an engineering team at the Large Hadron Collider, the Mars Exploration Rover mission, Facebook or Google, dealing with super-scalability and machine-learning problems (in other words, unless you belong to the top 1 percentile of industrial engineering, the crème de la crème of academic research-based engineering, or the reality-distortion field of Silicon Valley), it's paramount to reflect upon the core issues put forward in this chapter. Nevertheless, these are my opinions: inevitably colored by experiences, influenced by observations and validated by a potentially limited set of data points. Like many opinions, you are welcome to reject, augment or agree with them, backed by reasonable data sets, in a constructive manner.
It all started on June 30, 1945, when John von Neumann published the First Draft of a Report on the EDVAC, the first documented discussion of the stored-program concept and the blueprint for computer architecture to this day. But the real disruption didn't start until the invention of Ethernet by Robert Metcalfe (and David Boggs; both were working at Xerox PARC) on May 22, 1973, the day he circulated a memo titled "Alto Ethernet" which contained a rough schematic of how it would work. "That is the first time Ethernet appears as a word, as does the idea of using coax as ether, where the participating stations, like in AlohaNet or ARPAnet, would inject their packets of data, they'd travel around at megabits per second, there would be collisions, and retransmissions, and back-off," Metcalfe explained.
In networking parlance it's known as CSMA/CD (Carrier Sense Multiple Access with Collision Detection). Ethernet has been widely used in offices, factories, schools and homes, increasing its bit rate over the years from 10 Mbps to 100 Mbps, 1 Gbps, 10 Gbps, and recently 40 Gbps and 100 Gbps. It was a fantastic demonstration of algorithmic innovation by an electrical engineer (who later founded 3Com), and it kickstarted a culture of technology-driven innovation, disruption and venture (and wealth) creation which would later define the DNA of Silicon Valley and, in general, of the United States of America.
Then came Douglas Carl Engelbart, an engineer and innovator who single-handedly changed the course of Human-Computer Interaction (HCI). In the 1960s, he developed the oN-Line System (NLS) at the Augmentation Research Center at SRI International in Menlo Park, California. This computer incorporated a mouse-driven cursor and multiple windows used to work on hypertext.
Several people went from SRI to Xerox PARC in the early 1970s. In 1973 (coincidentally, around the same time Metcalfe was inventing Ethernet), Xerox PARC developed the Alto personal computer. It had a bit-mapped screen and was the first computer to demonstrate the desktop metaphor and graphical user interface (GUI): another algorithmic innovation, in the form of the WIMP (Windows, Icons, Menus, Pointing device) paradigm.
With the GUI and Ethernet (later standardised as the LAN, or IEEE 802.3) invented, the stage was set for real fireworks. In 1980 IBM, which enjoyed a virtual monopoly in the mainframe-computer market, decided to enter the personal-computer market. It opened a small Entry Systems Division in Florida under the direction of Donald Estridge, who opted to build a computer from off-the-shelf, widely available components. One of the reasons for this decision was that IBM was still wary of an antitrust lawsuit brought against it by the government in 1969; the best way to avoid accusations of monopolistic practices was to make the specifications available to its competitors. IBM put William Lowe in charge of the top-secret project, code-named "Acorn". To start with, they chose the Intel 8088 microprocessor instead of a proprietary IBM microprocessor (IBM had already acquired the rights to manufacture Intel chips).

IBM did not have an operating system for Intel's processors, so it was necessary to buy one from a third party. When Intel introduced the 8086 in 1978, a young Seattle programmer, Tim Paterson, had been hired by Seattle Computer Products (SCP) to develop a CP/M-compatible operating system for it, and in December 1980 he finished work on his 86-DOS. Asked by IBM to deliver an operating system for the new machine, in 1981 Bill Gates' Microsoft bought the rights to 86-DOS from SCP and hired Paterson to port it to the first prototype of the PC provided by IBM. It was renamed MS-DOS, and Microsoft decided to retain the rights to the operating system. IBM launched its PC in August 1981. The basic version, with 16 kilobytes of RAM and a cassette unit, sold for $1,600. Another revolutionary move by IBM was to let outside distributors (initially Sears, Roebuck and ComputerLand) sell the PC. The best-selling computer in the history of IBM had sold only 25,000 units over five years; the PC would sell a million units in less than three years.
Just freeze time for a moment. Feel yourself teleported back to the legendary 1973–1982 era, surrounded by these tremendously talented people, all trying to define what we know as Information Technology today. A whole generation of engineers (many of them migrants, both domestic and international) had just started to defy the status quo and leverage their awesome creativity to solve problems that would have a pervasive impact on our lives, all clustered around the unique and unreplicable ecosystem of the Bay Area. Back then, being an engineer was a fun, thrilling and potentially lucrative career option (more so because of the proliferation of the Venture Capital industry during the same period, especially around Sand Hill Road in Menlo Park, California). Engineers working from their own little garages had started to establish the trend of disrupting a vast range of industries, and this very act would later form a core part of the DNA of Silicon Valley (and, subsequently, of many other innovation hubs). Businesses across the world were about to adopt Information Engineering as an important part of their strategic and competitive advantage. The "prosperity" phase of a new Kondratiev wave had already begun.
What the heck is a Kondratiev wave?
The Soviet economist Nikolai Kondratiev (also written as Kondratieff) was the first to bring the concept of cycle-like phenomena in the modern world economy to international attention, in his book The Major Economic Cycles (1925). The theory claimed that the world economy progresses through a series of Kondratiev waves (or K-waves), each cycle having phases of Prosperity, Recession, Depression and Improvement.
In retrospect, the world economy since 1771 does indeed seem to have followed a series of K-waves, each with a period of somewhere between 40 and 50 years:
- Steam Engine & Cotton (1780–1830)
- Railway & Steel (1830–1880)
- Electrical Engineering & Chemicals (1880–1930)
- Petrochemicals & Automobiles (1930–1970)
- Information Technology (1970–present)
It is a continuous, cyclical cascade of innovation and commercialization, one wave propelling the next, throughout 200 years (and counting) of modern economic development, with Information Technology being the latest. The technology and economic historian Carlota Perez has interpreted the phases of a K-wave slightly differently, making it a bit easier to understand:
In the first phase, Eruption, breakthrough technologies start to disrupt the old, existing industries and unemployment declines, paving the way for the second phase, Frenzy, where a massive amount of financial capital is invested in the new technologies with the expectation of a sustained advantage over competitors, often creating a bubble. Subsequently an "inflection point" arrives and a Crash happens, ushering in the Synergy phase, where coherent economic growth is observed again, along with a "re-coupling" of financial and physical capital. Last comes the prolonged Maturity phase, where the market reaches saturation, growth becomes stagnant, and the competitive advantage of the earlier technological breakthroughs erodes and gets transformed into a commodity.
Some of you might be wondering why I am telling such a long story about the nitty-gritty of an apparently obscure economic theory. Because it is highly relevant to the overall economic ramifications of the wide adoption of Information Technology and Information Engineering. If we assume the average length of a K-wave is 50 years, 2020 (1970 + 50 = 2020) should mark the beginning of a new K-wave, and, more importantly, we are currently living in the Maturity phase of the K-wave associated with Information Technology.
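The back-of-the-envelope arithmetic above can be sketched in a few lines of Python, using the wave dates listed earlier (a rough illustration, not a serious economic model):

```python
# K-wave durations, taken from the list of completed waves earlier in the chapter.
completed_waves = {
    "Steam Engine & Cotton": (1780, 1830),
    "Railway & Steel": (1830, 1880),
    "Electrical Engineering & Chemicals": (1880, 1930),
    "Petrochemicals & Automobiles": (1930, 1970),
}

# Average length of the completed waves: (50 + 50 + 50 + 40) / 4 = 47.5 years,
# consistent with the "40 to 50 years" range quoted above.
lengths = [end - start for start, end in completed_waves.values()]
avg_length = sum(lengths) / len(lengths)

# Assuming a round 50-year cycle, the IT wave that began in 1970
# would give way to a new wave around 1970 + 50 = 2020.
next_wave_start = 1970 + 50

print(avg_length)       # 47.5
print(next_wave_start)  # 2020
```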
What's wrong with living in the Maturity phase of a K-wave?
For consumers it's actually good: "perfect competition" drags commodity prices down. Great! But for engineers and creative folks (on the supply side), it has something to do with the "Fun", "Fulfilment" and "Fee" of work!
To understand how, we have to turn to another economist: Raymond Vernon, the late Clarence Dillon Professor of International Affairs Emeritus at the Kennedy School of Government, Harvard University. Vernon introduced a concept called the "Maturity Cycle", which sounds a lot like the Kondratiev wave but extends the original K-wave theory by taking a more micro-economic position around the "Product Life-cycle". The theory suggests that early in a product's life-cycle (take the Personal Computer, or PC), all the parts and labor associated with that product come from the area in which it was invented (ex. the USA). After the product becomes adopted and used in the world markets, production gradually moves away from the point of origin (ex. to Asia/China). In some situations, the product ends up being imported by its original country of invention.
When a product enters the "maturity" phase and its production moves to a country or region with a production-cost advantage, the engineering challenges pertaining to its production usually diminish considerably. Although "additional challenges" may crop up around distribution or economies of scale, the original engineering turns into something "boring" or "mundane". In economics this is called Commoditization.
Commoditization not only erodes almost all the competitive advantages of companies operating in the respective sector, it also does something more sinister to the employees of those companies. These employees (engineers in particular) get demotivated by the mundane aspects of day-to-day jobs focused solely on the survival of sinking ships. Suddenly there is no room for innovation and risky undertakings, no scope for R&D, no business value in showing creativity and almost no approved budget for new product development. The fun has suddenly disappeared, and it's boring as hell to be out there.
As Nicholas Carr observed in his controversial 2003 Harvard Business Review article "IT Doesn't Matter":
As IT's potency and ubiquity have increased, so too has its strategic value. It's a reasonable assumption, even an intuitive one. But it's mistaken. What makes a resource truly strategic — what gives it the capacity to be the basis for a sustained competitive advantage — is not ubiquity but scarcity. You only gain an edge over rivals by having or doing something that they can't have or do. By now, the core functions of IT — data storage, data processing, and data transport — have become available and affordable to all. Their very power and presence have begun to transform them from potentially strategic resources into commodity factors of production. They are becoming costs of doing business that must be paid by all but provide distinction to none.
Consider the early days of IT and software engineering, when quality information-engineering resources were scarce and hence much sought-after. Let's go back to March 1981 in the UK, to witness the launch of the ZX81, a home computer produced by Sinclair Research and manufactured in Scotland by Timex Corporation. The first 3D game for a home computer, "3D Monster Maze", has just been completed, from an idea by J.K. Greye and programmed by Malcolm Evans for the new Sinclair ZX81 platform with just a 16 KB memory expansion!
Besides introducing a new genre, "Survival Horror", the game was an awesome display of pure programming magic and a great achievement in software engineering. It was unbelievable that engineers were able to pull off perspective 3D on such a rudimentary machine. Although the machine was expected to display only two colors, the programmers were able to incorporate a third color, "grey", using a nifty trick involving a "dithering" pattern consisting of 6 pseudo-graphics elements. The game's 3D engine and the random-maze-creation code were written in Z80 machine code, produced with an assembler. Wow! A classic example where engineering (software/gaming) "firepower" was pivotal in making J.K. Greye Software (and later New Generation Software, founded by Malcolm Evans) a very successful gaming company.
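The dithering idea generalizes well beyond the ZX81. Here is a minimal Python sketch (not the original Z80 code, and not reproducing the ZX81's actual pseudo-graphics character set) of the underlying trick: alternating the only two available colors in a fine checkerboard so that, from a distance, the eye averages them into an intermediate shade.

```python
# Minimal sketch of two-color dithering: '#' stands for a black pixel,
# '.' for a white one. Viewed at small scale, the checkerboard reads
# as grey even though only two "colors" are ever drawn.
def dither_block(width, height):
    """Return rows of '#' and '.' arranged in a checkerboard."""
    return [
        "".join("#" if (x + y) % 2 == 0 else "." for x in range(width))
        for y in range(height)
    ]

for row in dither_block(8, 4):
    print(row)
# #.#.#.#.
# .#.#.#.#
# #.#.#.#.
# .#.#.#.#
```

The ZX81 version achieved the same perceptual effect at the character level, composing its "grey" out of a handful of half-toned block-graphics characters instead of plotting individual pixels.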
After the sudden death of the "golden age" of video games in 1983, and with the massive adoption of the PC, software engineering matured quickly, and high-level languages like Python (1991) and Ruby (1993) emerged to further accelerate the business adoption of Information Engineering. From there to the current state of Polyglot programming (a polyglot being a program or script written in a valid form of multiple programming languages) and fantastic frameworks (Ruby on Rails, or Django for Python, etc.), it has become ridiculously easy to build data-driven web applications. And it's not just the applications themselves; peripheral services are being commoditized at an unprecedented scale. Need monitoring for your app? Use open-source tools like Graphite and Nagios, or a paid service like New Relic. Need a database? You could run your own MySQL / Postgres / Redis, or you could pay Google or Amazon for the functional equivalent; they can handle scale for you. Amazon Web Services (AWS) offers a jaw-dropping number of services that a business can leverage without re-inventing the wheels of distributed computing, complex resource management and scalability. Distributed systems used to be a niche a few years back. Not any more. IBM has already unveiled a Cognitive Cloud powered by the superior intelligence of its Watson platform. Amazon has one too: https://aws.amazon.com/machine-learning/
Commoditization has reached its crescendo.
How can companies regain the much-needed competitive advantage in such an era? We will discuss this in detail in a later part of the book.
As the process of commoditization unfolded through the last couple of decades, business executives were busy enforcing more and more processes to tame the eccentric forces of creativity inherent in software-engineering projects. A certain amount of process is needed to control project risk and budget, but beyond that it actually stifles creative engineers, substantially lowering their overall productivity and Fulfilment. From XP to Agile, from UML to TDD/BDD, we have seen many. There are some good parts, but there is more "cargo cult" and dogmatism. For executives this creates some immediate benefits: with rigid processes, leveraging the supply-side saturation (an ill effect of living in the Maturity phase of a K-wave), it became much more cost-effective to hire "not-so-skilled" engineers, and engineers in general suddenly found themselves "replaceable" in a classical commodity market.
Fig-8: Computerworld, August 1980
Interestingly, this trend is nothing new. While researching this book, I stumbled upon an editorial from Computerworld, August 1980 (Fig-8). In retrospect, it evokes a combination of eerie and funny feelings. One redditor summarized it well:
I think software design is a creative, qualitative process to which SCRUM is trying to apply metric-driven, quantitative ideals, with great failure and pain
It's not just the "process paralysis": commoditization hits one crucial metric hard, the Fee. As the supply-side saturation starts showing its full glory (aided, by the way, by a great sales pitch from universities reinforcing the claim that there is a shortage of talented workers in the STEM fields: science, technology, engineering and maths), the other relevant economics quickly shows up on its heels.
Michael S. Teitelbaum, VP of the Sloan Foundation, opined that there are no general shortages of scientists and engineers. He went even further, stating that there is evidence suggesting surpluses: there are significantly more science and engineering graduates in the United States than attractive positions available in the workforce. This surplus supply, by classical economics, forces engineers to accept lower wages, as seen in Fig-9. The Bay Area remains an outlier in this regard.
Is there any light at the end of the tunnel?
If you love investing in stocks, you've probably heard of a book titled "The Little Book That Builds Wealth: The Knockout Formula for Finding Great Investments" by Pat Dorsey. Among the numerous books I've read on stock-market investing, this one definitely deserves a place in the front row. In it, Pat talks about how to select great companies, not just those with a great ROI but those with a sustained competitive advantage (SCA). To attain a sustained competitive advantage, he proposes, a company should possess one or more of the following attributes:
- Intangible asset (ex. brands, patents or regulatory approvals)
- Switching cost
- Network effect (ex. two-sided marketplaces, on-demand service economy, social media)
- Cost advantage
In my opinion, these attributes are not just great for screening companies for investment; they are also great for making smart career choices going forward. We are going to discuss a whole lot about customer empathy and brand perception in the coming chapters. Beyond brands, patents and regulatory approvals are pretty important as well. According to a recent paper from the investment house Allianz, the following areas or sectors are considered potential harbingers of the "6th K-wave":
- Nanotech and Nano-materials (esp. Photonics and NanoBioPhotonics)
- Environment technology, Clean-Tech and Green-Tech
- Health-care and BioPharma/BioTech
- (Some amount of) Artificial Intelligence/Robotics, Machine Learning and Computer Vision (This will eventually be commoditized)
For example, if we look at the trend of BioTech patents (CCL class 424: Drug, Bio-Affecting and Body Treating Compositions) issued by the USPTO during 2005–2014, it certainly appears there is an explosion in BioTech research (source: http://patft.uspto.gov/). Each of these sectors will require lots of research (and patents) and regulatory approvals to create the next generation of investment-worthy and career-worthy companies. So if you are a recent engineering graduate or a freshman and are passionate about any of these areas, seriously consider trying them out. However, there is a catch. Until 2020–2021, when the current K-wave completely phases out, the demand and subsequent job creation will not reach enough "scale", and as a consequence paying the bills may be a bit difficult at times, unless you can afford it through either a cushion of money or sheer perseverance!
Then what about the rest of us? Well, that's what I'll try to cover in the next few chapters.
[Shameless Promotion: This is CHAPTER 1 from my upcoming book]
Originally published at www.dbose.in on January 8, 2016.