How the Apollo Program Gave Silicon Valley a Jump-Start

The first integrated circuit chips to reach the market were so expensive that a customer with deep pockets was needed, and the Apollo project came along just in time.

In July we celebrate the 50th anniversary of one of the greatest “leaps of mankind,” the Apollo 11 lunar landing. Conquering space travel and exploring a world a quarter of a million miles away was a triumph of incalculable value. Even though the moon landing was arguably mankind’s biggest achievement of the century, if not the millennium, there have been times when NASA was asked to name tangible commercial benefits that resulted from its programs. Answers have included freeze-dried food and cordless handheld vacuum cleaners. Although it’s often overlooked, the ancillary legacy of greatest impact is the timely boost the program gave to the launch of Silicon Valley and High Technology as an economic sector.

Earthrise as captured during an Apollo mission. The Apollo moon landing was a tremendous achievement in many ways. Though often overlooked, the Apollo program was the largest customer for the first generation of integrated circuits, giving a critical boost to the nascent Silicon Valley semiconductor industry.

The bipolar transistor was invented at Bell Labs by Walter Brattain, John Bardeen, and William Shockley. Their groundbreaking experiments were performed in December 1947, the results were publicly announced in June 1948, and they won the Nobel Prize in 1956. This was the birth of solid-state electronics, a technology that would quickly prove vastly superior to the earlier vacuum-tube electronics. Shockley founded the first transistor “factory” in Mountain View, CA, in 1955. In 1957, Gordon Moore, Robert Noyce, and six others left Shockley’s company and formed Fairchild Semiconductor (Moore and Noyce would start another company in 1968, eventually named Intel). Their first commercial transistors sold at the steep price of $150 each [1]. New developments moved rapidly. Photolithography was used to manufacture multiple devices on semiconductor wafers. Jack Kilby (Texas Instruments) and Robert Noyce independently invented the concept of “integration,” in which circuits composed of transistors and other electronic components were fabricated on a wafer as a single unit. They demonstrated the first integrated circuits (ICs, commonly called “chips”) in 1958 and 1959, respectively.

Transistors and integrated circuits had performance characteristics that had been unimaginable even a year before. They were small and rugged, used a fraction of the power of tubes, and entire circuits could be manufactured at once. In 1960, Fairchild marketed several different ICs called micrologic elements. Each was a digital logic gate composed of three or four transistors plus several diodes and resistors. Several could be wired together to perform basic computing functions. Each was priced at about $120. Integrated circuits of greater complexity, suitable for wider applications, quickly followed [2, 3].
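The Apollo Guidance Computer itself was famously built from just one such micrologic element, a three-input NOR gate, replicated thousands of times [4]. As a rough illustration of how a single gate type can be wired together into basic computing functions, here is a minimal Python sketch (the gate-composition choices below are illustrative, not the AGC's actual circuit design):

```python
def nor(*inputs):
    # NOR gate: output is 1 only when every input is 0.
    return 0 if any(inputs) else 1

# NOT, OR, and AND built purely from NOR gates:
def not_(a):
    return nor(a)

def or_(a, b):
    return not_(nor(a, b))

def and_(a, b):
    return nor(not_(a), not_(b))

# A half adder (one-bit addition) from the same primitives:
def half_adder(a, b):
    carry = and_(a, b)
    total = and_(or_(a, b), not_(carry))  # XOR via OR/AND/NOT
    return total, carry

for a in (0, 1):
    for b in (0, 1):
        print(f"{a} + {b} -> sum {half_adder(a, b)[0]}, carry {half_adder(a, b)[1]}")
```

Chaining such adders, register by register, is how a few transistors per gate scale up into a machine capable of guiding a spacecraft.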

Solid-state circuits were the quintessential example of a disruptive technology. However promising they were, their high price was a tremendous obstacle to their success: no mass-market product could absorb the cost. But ICs were ideal for missile and satellite applications, and the government was willing to pay any price for the high performance it required. It was a win-win arrangement.

In 1963, a half million chips were sold to NASA and the Department of Defense for aerospace applications. Manufacturers included Fairchild, Texas Instruments, and Philco. The government was their sole customer. The first chip application in a consumer product came in 1964: a hearing aid manufactured by Zenith.

In 1962, NASA gave a contract to Fairchild to supply chips for the Apollo Guidance Computer. Between 1962 and 1965, the Apollo program was the world’s largest customer for chips [4]. By 1969, the program alone had purchased more than a million integrated circuits. These NASA and Defense Department purchases were huge contracts for the burgeoning semiconductor technology companies. The companies had the benefit of stable income and plowed profits back into research and development.

Such was the beginning of the meteoric rise of an industry broadly known as “High Technology” or “Information Technology.” The hardware companies of Silicon Valley enjoyed rapid growth. Not only did they diversify their products, but the computer and communications revolution spawned entirely new fields of software and social media. These sectors have experienced even greater success. Today, five information technology companies, Apple, Alphabet, Microsoft, Facebook, and Amazon, comprise roughly 12% of the total market capitalization of publicly held companies in the US. Software alone accounts for roughly 7% of US GDP [3].

These technologies have become an indispensable part of daily life throughout the world, and the Apollo program was instrumental in boosting this nascent industry.



[2] T.R. Reid, “The Chip: How Two Americans Invented the Microchip and Launched a Revolution” (Random House Publishing Group, New York; 1985, 2001), chapter 2.

[3] Rhys McCarney, “Inventions That Built the Information Technology Revolution” (Lulu Publishing, 2018), chapters 8 and 16.

[4] Eldon C. Hall, “Journey to the Moon: The History of the Apollo Guidance Computer” (American Institute of Aeronautics and Astronautics Inc., 1996).

Photograph courtesy of Pexels.

After completing his Ph.D. in Physics, the author had a 30 year career doing basic research at a premier industrial laboratory and a federal research facility.
