Intel 4004 and Altair 8800: Architects of Silicon Valley

Joaquin Bas · Published in CodeX · Mar 24, 2022 · 8 min read

In mathematics (differential calculus to be exact), the term “inflection point” is used to describe the point at which a function changes concavity, or in layman’s terms, undergoes a change in its curvature. Though still a technical term, the concept of an inflection point can be applied to many areas outside of mathematics. For instance, in history, inflection points can be thought of as places where the course of history was altered by a certain event or paradigm shift. The creation of the first calendar, the invention of the Hindu-Arabic numeral system, Christopher Columbus landing in the West Indies, and the invention of the printing press by Johannes Gutenberg are all prominent examples of historical inflection points.

Yet not all historical inflection points lie in the distant past. To find the latest great one, one need look no further than 1970s Silicon Valley. It is an understatement to say that the microprocessor changed the world. Before this technological development, computing meant mainframes, which required interaction with punch cards, maintenance of cumbersome vacuum tubes, and time-sharing. Moreover, they cost around a million dollars each, so only the military, universities, and wealthy corporations could afford them.

Contrast this with the current state of computing technology. The average citizen of even modest income can walk into a Best Buy and walk out with more processing power than was available to the entire Apollo 11 mission. This seismic transformation was no happenstance, however; it is a direct consequence of the development of the Intel 4004 processor and the Altair 8800 personal computer. From these two inventions followed a slew of other developments in the 1970s, like Altair BASIC and the Apple I, changing both technology and Silicon Valley forever. The object of this paper is to outline the history and causes of the 1970s microprocessor revolution, and how that revolution in turn transformed the technology and venture capital industries of Silicon Valley. Answering these questions is integral to understanding the current state and future of the world's greatest technology hub.

In order to understand the development of the Intel 4004 microprocessor and the Altair 8800, some essential computing history is in order. While machines for aiding humans in calculation have been around for millennia, from the Greek Antikythera mechanism to the abacus, the first design for a programmable computer, the Analytical Engine, was conceived in the 1830s by English mathematician and inventor Charles Babbage. Babbage's work inspired fellow genius and polymath Ada Lovelace, who in the 1840s wrote what is widely considered the first computer program for Babbage's conceived machine, which was itself to be programmed with punched cards borrowed from the Jacquard loom. With the rise of electronics in the 20th century, Babbage and Lovelace's ideas of programmable computers were revisited with new components. An important intermediate step was the differential analyzer, an analog machine built strictly for the purpose of solving complex differential equations, which Vannevar Bush of MIT began developing in 1927 (IEEE Spectrum). Bush, who rose to prominence as President Franklin D. Roosevelt's head of the National Defense Research Committee, oversaw much of America's wartime research effort. Its most famous computing product, the Electronic Numerical Integrator and Computer (ENIAC), built to calculate artillery trajectories, was ironically not completed until after the war, in 1946. Its British counterpart, the Colossus, however, was operational during the Second World War and proved instrumental in cracking high-level German ciphers.

After the war there was no immediate commercial need for computing, so for a short while the technology stagnated. This period ended when, in 1945 and 1946, John von Neumann and Alan Turing each circulated designs for a stored-program computer, and research on such machines began in earnest on both sides of the Atlantic. Just a year later, in 1947, physicists John Bardeen and Walter Brattain, working in William Shockley's group at Bell Laboratories, invented the transistor, a device that switches and amplifies electric currents (the MOSFET variety that dominates modern chips followed at Bell Labs in 1959). Transistors are the building blocks of computers: without them, stored programs, logic gates, and everything else that makes up modern computing cannot function efficiently enough to be usable. When Shockley shared the 1956 Nobel Prize in Physics with Bardeen and Brattain for the transistor, he knew his invention had a value far greater than as a science experiment. Just a few months after winning his Nobel Prize, Shockley founded Shockley Semiconductor Laboratory, headquartered in Mountain View, California. The company was not very successful, as Shockley's domineering personality and growing paranoia alienated his employees, eight of whom left in 1957 to start a new and improved semiconductor manufacturer: Fairchild Semiconductor. Silicon Valley was born.

The Intel 4004, the processor behind the microcomputer revolution of the 1970s, has a predecessor in the integrated circuits developed at Fairchild (CPU World). In 1959, physicist and Fairchild co-founder Robert Noyce invented the monolithic integrated circuit. For context, an integrated circuit is a collection of transistors fabricated on a single piece of semiconductor and wired together to carry out the Boolean logic and binary arithmetic of a computer. Without integrated circuits, modern computers could not exist. Gordon Moore, a chemist, fellow Fairchild co-founder, and colleague of Noyce, observed in 1965 that the number of components that could be put on a chip was doubling roughly every year, a rate he later revised to a doubling every two years; this observation is known today as Moore's Law. From 1960 to 1968, Noyce and Moore perfected their integrated circuit designs, and finally, after realizing the technology's commercial potential, decided to form their own semiconductor company. On July 18, 1968, Intel was founded.

Barely three years after its founding, Intel struck gold. Its 4004 microprocessor, released in 1971, had everything necessary to lead the way into personal computing. Not only did it make use of new silicon-gate technology (which improved speed and density while keeping heat in check), but it was also a general-purpose chip that could be integrated into different computing systems alongside separate I/O, RAM, and ROM chips. This new design paradigm ushered in a wave of 4004-derived processors from other companies as well as from Intel (CPU World). In 1974, Intel released the 8080 CPU, a scaled-up successor with roughly three times as many transistors as the 4004 (freakishly close to Moore's Law if you do the math) and even better modularity. Not surprisingly, it would be the Intel 8080 that powered the first true personal computer.
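The "do the math" claim can be checked with a quick back-of-the-envelope calculation. Assuming the commonly cited approximate transistor counts of 2,300 for the 4004 (1971) and 6,000 for the 8080 (1974), and a Moore's Law doubling period of two years, a short Python sketch compares prediction with reality:

```python
# Back-of-the-envelope check of Moore's Law between the 4004 and the 8080.
# Transistor counts (~2,300 and ~6,000) are commonly cited approximations.
count_4004, year_4004 = 2300, 1971
count_8080, year_8080 = 6000, 1974

years = year_8080 - year_4004  # 3 years between the two chips

# Moore's Law with a two-year doubling period predicts this growth factor:
predicted_ratio = 2 ** (years / 2)
observed_ratio = count_8080 / count_4004

print(f"observed ratio:  {observed_ratio:.2f}x")   # ~2.61x
print(f"predicted ratio: {predicted_ratio:.2f}x")  # ~2.83x
```

The observed roughly 2.6x growth over three years sits close to the roughly 2.8x that a two-year doubling period predicts, which is the sense in which the 8080 tracked Moore's Law.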

Had it not been for the Altair 8800, Silicon Valley as we know it today would not exist. Apple, Microsoft, and many other technology companies focused on personal computing hardware and software were born from Silicon Valley's obsession with this tiny computer. The Altair 8800 wowed professional engineers and electronics hobbyists alike because it was one of the first computers that one could assemble and use at a relatively low cost. What's more, thanks to the Intel 8080's modular design, one could customize many of the computer's ports and peripherals by merely taking a trip to the local electronics store. The Altair 8800 shot to fame among hobbyists when Popular Electronics featured it on the cover of its January 1975 issue. What's more, the Altair 8800 not only engendered a hardware revolution but began Silicon Valley's transition into software as well. In early 1975, Ed Roberts, the founder of Micro Instrumentation and Telemetry Systems (MITS), the small electronics firm behind the Altair 8800, received a letter from two young programmers offering to write a BASIC interpreter for the Altair. Roberts took them up on the offer: MITS licensed their code, which shipped as Altair BASIC, and the pair formed a small software company in Albuquerque, New Mexico, to develop it and license BASIC for other machines. The programmers' names were Bill Gates and Paul Allen, and their little company was named Microsoft.

Because of the Altair 8800, not only was Microsoft formed, but so were many other Silicon Valley behemoths. Steve Wozniak and Steve Jobs got the idea for Apple, which they started in 1976, after much experience tinkering with Altair 8800 computers at the Homebrew Computer Club in Menlo Park the year before. When Apple went public in 1980, followed by Microsoft in 1986, turning their founders and employees into multimillionaires, the entire technology industry took notice, and Silicon Valley would never be the same.

If one wishes to start a company in Silicon Valley today, an essential ingredient for growth is venture capital. Modern venture capital is financing in which outside investors fund a high-risk, high-reward venture in exchange for a stake in the company. Because of its inherent risk tolerance, almost half of the modern venture capital industry in the United States is centered in Silicon Valley, and more than 60% of venture capital dollars each year go to Silicon Valley firms. The 1970s microprocessor and microcomputer revolution, which created Microsoft and Apple, made previously risk-averse private investors willing to back small but high-potential computer and software companies. Following this trend has paid off for Silicon Valley's venture capitalists, many of whom continue to make fortunes on investments in more recent technology companies like Amazon, Facebook, and Google. Unfortunately, the 1970s microcomputer revolution also inadvertently harmed innovation in Silicon Valley. By showing investors that computer technology could follow Moore's Law and scale so dramatically, it has led the Valley's venture capital industry to overinvest in software ventures and ignore worthwhile technologies in other industries. Even at a time when innovation is sorely needed in areas like clean energy, healthcare, and transportation, over 85% of venture capital funds in Silicon Valley still go to software companies (Crisis of Venture Capital). Not only has this investment paradigm slowed hardware innovation in the Valley (save for exceptions like Tesla), but it has also made the venture capital industry less and less technology-focused. In an industry that once prided itself on investing in the future, many venture capital executives now seek merely to turn a profit.

Having explored the history of the microcomputer revolution and its effects on Silicon Valley's technology and venture capital industries, it is now much easier to predict the future of this land of technology. Unless there is another major historical inflection point, that is, unless someone invents another non-digital technology that is scalable and commercially promising, Silicon Valley's venture capital firms will continue to overwhelmingly fund software companies and ignore other kinds of technology. If this trend persists, America's most innovative region risks losing its technological supremacy and sense of invention, and the next technological inflection point may occur not in the laboratories of Stanford or Intel, but in yet-to-be-discovered innovation centers across the globe.

References:

Cass, Stephen. "Chip Hall of Fame: Intel 4004 Microprocessor: The First CPU-on-a-Chip Was a Shoestring Crash Project." IEEE Spectrum, spectrum.ieee.org/chip-hall-of-fame-intel-4004-microprocessor. Accessed 27 Sept. 2021.

Funk, Jeffrey. “The Crisis of Venture Capital: Fixing America’s Broken Start-Up System.” American Affairs, vol. V, no. 1, spring 2021, americanaffairsjournal.org/2021/02/the-crisis-of-venture-capital-fixing-americas-broken-start-up-system/. Accessed 27 Sept. 2021.

“Intel 4004 Processor Family.” CPU World, www.cpu-world.com/CPUs/4004/index.html. Accessed 27 Sept. 2021.

Kaye, Glynnis Thompson, editor. “A Revolution in Progress- A History to Date of Intel.” Intel, 1984, www.intel.com/Assets/PDF/General/15yrs.pdf. Accessed 27 Sept. 2021.

Mazor, Stanley, and Anne Fitzpatrick. "Fairchild Symbol Computer." IEEE Annals of the History of Computing, vol. 30, no. 1, 2008, pp. 92–95. Project MUSE, muse.jhu.edu/article/235250.

Moore, Gordon E. “The microprocessor: engine of the technology revolution.” Communications of the ACM, vol. 40, no. 2, Feb. 1997, pp. 112+. Gale Academic OneFile, link.gale.com/apps/doc/A19238429/AONE?u=nm_p_oweb&sid=bookmark-AONE&xid=e00cdd43. Accessed 21 Sept. 2021.

Moore, Fred. “Homebrew Computer Club Newsletter Number 5(July 5, 1975).” DigiBarn Computer Museum, www.digibarn.com/collections/newsletters/homebrew/V1_05/index.html. Accessed 27 Sept. 2021.

“Popular Electronics January 1975 Issue ‘Altair 8800.’” World Radio History, worldradiohistory.com/Archive-Poptronics/70s/1975/Poptronics-1975–01.pdf. Accessed 27 Sept. 2021.
