The Coming Resurgence in US Semiconductor Manufacturing
The US is investing hundreds of billions of dollars to build new chip manufacturing facilities. Here’s what we know about them so far.
The history of semiconductor innovation in the United States dates back to the mid-20th century. In 1947, scientists at Bell Labs built the first working transistor, which paved the way for the first integrated circuit in 1958. Semiconductor chips, also known as microchips or integrated circuits, are small devices made of semiconductor materials (typically silicon) that control the flow of electricity and store and process information. They are the building blocks of modern electronics, acting as the “brains” of these devices.
Over the next few decades, the semiconductor industry in the United States grew rapidly, with companies like Intel and Texas Instruments leading the development of new technologies and manufacturing processes. For decades, the country has been a leader in semiconductor manufacturing, producing a wide range of microchips used in everything from computers to televisions to automobiles.
Today, the US remains a semiconductor powerhouse, with US firms accounting for roughly 47% of global chip sales. However, if we look…