How the world’s most important industry ended up on a single island: Part II
Taiwan producing almost all advanced chips today is a geopolitical disaster. If a very specific and widely anticipated event were to happen to the island, it would not only drag the US into a war but also halt the global economy and destroy the modern standard of living. What’s most outrageous about this situation is that the reason it came about is not even purely political (which one could at least accept) but rather psychological and cultural. At least, that’s the impression I got from reading “Chip War: The Fight for the World’s Most Critical Technology.” The book tells the story of strong-willed Asian autocrats and geeky Silicon Valley CEOs who gained the power to determine American foreign policy with their gadgets.
The first part covered the establishment of the market and the industry in the US and Japan.
Consequences of the Vietnam War
When American troops withdrew from Vietnam in 1975, Asian leaders saw the event rather differently from the narrative commonly heard in the West. Rather than perceiving it as American weakness, they read it as a sign that America’s willingness to unconditionally support the region’s regimes against the communists was fading. Every American-backed government, Taiwan above all, sought to increase the stakes the US had in its stability. These governments identified semiconductor production as a perfect way to serve several goals at once:
- Semiconductors are a vital product for the American military, making territories with large production sites a matter of national security.
- Developing countries have an advantage due to their large yet unindustrialised populations. For these populations, transitioning from subsistence farming to a factory salary of a few dollars can be positively life-changing.
- Improvements in the standard of living will hopefully turn people away from communism and keep them loyal to the American-backed government.
Meanwhile, American companies sought to establish their independence from the Pentagon and its defence contracts. Although still a significant source of income, these contracts constrained both the direction of growth and expansion into the consumer market. Breaking free proved difficult, expensive, and risky.
When approached by Asian governments and government-backed companies, the Americans were surprised by the conditions on offer. American companies could keep specialising in and advancing the most sophisticated parts of the industry, while everything less exciting could be assembled in Asia for a fraction of the cost. Moreover, these companies could enjoy privileges, with no liability attached, that would be considered obscene in the US.
By the end of the 1970s, American semiconductor firms employed tens of thousands of workers internationally, mostly in Korea, Taiwan, and Southeast Asia. A new international alliance emerged between Texan and Californian chipmakers, Asian autocrats, and the often ethnic-Chinese workers who staffed many of Asia’s semiconductor assembly facilities. (…) From South Korea to Taiwan, Singapore to the Philippines, a map of semiconductor assembly facilities looked much like a map of American military bases across Asia.
The fall of American production
American industry continued to make technological advances, since that remained its primary focus. In 1970, Intel released the first commercial DRAM (dynamic random-access memory) chip. Memory is pretty important for complex calculations, and cheap DRAM gave a significant boost to the computing power of the devices of the era.
(…) five years after the 64K DRAM chip was introduced, Intel — the company that had pioneered DRAM chips a decade earlier — was left with only 1.7 percent of the global DRAM market.
It turned out that the very factors that had led individual American companies to move production to Asia were what made American industry as a whole unable to compete: cheap money, unconditional government support, medieval-like privileges for the industry, and cheap labour. Within a few years, many US companies were bankrupt, and panicked American CEOs demanded tariffs on Japanese products.
The government, which the industry had repeatedly rebuffed before, started asking questions such as “Why are semiconductors more strategic than steel, which is also suffering from Japanese competition?” and “If Japanese chips are so cheap, can’t you just buy them and make better computers?”
This period was incredibly frustrating: Americans were paranoid, and the Japanese were arrogant, indulging in nationalistic joy. People who longed for the Japanese Empire and still smarted from the humiliation of WW2 realised that American military strength now rested in the hands of Japanese companies with close ties to their government. That’s about as close to “holding someone by the balls” as a situation can get.
Trade negotiators compared negotiating with the Japanese to peeling an onion. “The whole thing is a rather zen experience,” one U.S. trade negotiator reported, with discussions ending with philosophical questions like “what is an onion, anyway.”
Meanwhile, American culture was flooded with dark cyberpunk worlds in which Japanese mega-corporations ruled over people’s lives and bodies.
American companies were in distress, crying out to Congress for more help. They requested policies similar to those enjoyed by Japanese companies: cheap money, vertical integration, and government guidance of the industry. Many believed that American companies were failing because they were not like Japanese companies (very smart, huh). The government gave the industry no financial assistance, busy as it was fighting inflation in the 1980s. And although it did start supporting vertical integration and joint R&D, the companies refused to share any technology, or even disclose to competitors what they were working on, so as not to give away good ideas for free.
In 1989, the co-founder of Sony teamed up with an extreme right-wing nationalist to write “The Japan That Can Say No”, a book arguing that it was time for Japan to stand up to the US and use semiconductors as its main leverage. The authors even claimed that American nuclear weapons were useless without Japanese chips, since the missiles couldn’t strike accurately without them. It hurt bad…
In 1986, with the threat of tariffs looming, Washington and Tokyo cut a deal. Japan’s government agreed to put quotas on its exports of DRAM chips, limiting the number that were sold to the U.S. By decreasing supply, the agreement drove up the price of DRAM chips everywhere outside of Japan, to the detriment of American computer producers, which were among the biggest buyers of Japan’s chips. Higher prices actually benefitted Japan’s producers, which continued to dominate the DRAM market.
During the US’s bizarre tariff battle with Japan, Korea inserted itself into the gap created by the quotas. As a result, the Americans lost, the Japanese maintained their dominance, and Samsung became the giant it is today.
There was one American company that treated its Asian counterparts differently. Intel held a joint venture with Samsung and sold chips manufactured by Samsung under Intel’s own brand. This made Intel one of the few US companies to benefit from the crisis, giving it the resources to reinvent Silicon Valley.
The Revival
There were two major tracks by which the US exited the crisis: the personal computer boom, and specialisation in manufacturing equipment and chip design.
IBM released its Personal Computer in 1981, and what happened next needs little explanation. It was a consumer product, like the flagship goods the Japanese were producing, and Americans finally got the hang of the business processes involved. Despite the difficulties with integration in the early ’80s, by the ’90s the industry had formed complex networks of specialisation in which companies competed only horizontally, never vertically. This became possible only after a long period of desperation and life-or-death decisions.
Except for Apple’s computers, almost every PC used Intel’s chips and Windows software, both of which had been designed to work smoothly together. Intel entered the personal computer era with a virtual monopoly on chip sales for PCs.
In Japan, the personal computer market was dominated by high-quality machines with cutting-edge technology and features, but their high price tags confined them to a narrower market segment. This allowed American companies like IBM and Compaq to gain a foothold in the Japanese market with more affordable options. Ultimately, Japan’s economic troubles of the 1990s made it irrelevant.
The other approach for American companies was to specialise in everything surrounding production: some designed the equipment for manufacturing plants, while others specialised in software for designing chips. Lynn Conway and Carver Mead developed a design methodology that allowed any electronics company to compose its own chip from a catalogue of standard elements.
By the early 2010s, the most advanced microprocessors had a billion transistors on each chip. The software capable of laying out these transistors was provided by three American firms, Cadence, Synopsys, and Mentor, which controlled around three-quarters of the market. It was impossible to design a chip without using at least one of these firms’ software. Moreover, most of the smaller firms providing chip design software were U.S.-based, too.
Coming to Taiwan
TSMC, the Taiwanese giant, is the product of this shift towards accessible customisation and Lego-like building principles. US firms design chips for their electronics and send the blueprints to TSMC’s facilities, which manufacture whatever the Americans request.
TSMC, like other Asian mega-corporations, was created as a project of the Taiwanese government, but it also received support from Silicon Valley entrepreneurs. The company’s success as a primary manufacturing facility for American designs can be attributed to two factors, both of which boil down to trust:
- In the context of American CEOs still anxious about the Cold War, Taiwan had shown unwavering loyalty to the US and had built its identity in opposition to communist China. The Japanese, by contrast, had always seemed too uninterested in the conflict and had threatened on numerous occasions to sell technology to the Soviets.
- In the context of technology theft, TSMC’s advantage was that it produced no electronics of its own. It was difficult for American companies to trust Samsung with manufacturing chips for their mobile phones: if a design got stolen, they would lose millions in R&D investment. With TSMC, no such risk existed. The company upheld its reputation flawlessly and acquired a Swiss-bank-like aura in handling customers’ secrets.
So that’s the story: Americans lost their competitiveness in manufacturing and consumer goods to the Japanese, and when they rebuilt the industry around an approach that required no in-house production site, they saw Taiwan as the only viable option.
Thoughts
Many of the mistakes the Americans made can be attributed to holding onto abstract ideas and industry culture. They did not notice Japan taking over the consumer market because they believed it was impossible for Americans to lose at marketing. They persisted in the DRAM market, convinced that a little more R&D and technology would overcome faulty business processes. The idea that a firm need not conjure up capital in a contracting economy to build yet another production site, because manufacturing can be outsourced, did not become prevalent until the ’90s. Of course, things always look simple in hindsight, but this particular story suggests that thinking faster and leaving unproductive convictions behind can save an industry.