Physics

Robert Mundinger
Published in CodeParticles
Feb 2, 2018 · 9 min read

“It was becoming increasingly difficult to separate the atomic age from the information age”

bomb (left), bombe (right)

Historian George Dyson said, “Computers led to bombs, and bombs led to computers.” During World War II, women ‘computers’ helped simulate how the atomic bomb would function by working through complex mathematical calculations by hand, and that same wartime need drove others to lay the foundations for electronic computers. Ironically, perhaps the most famous early computing machine was called the ‘bombe’: the one designed by Alan Turing to break the code used by the German Enigma machine.

“the unveiling of the two most important technologies of the 20th century — the atomic bomb and the transistor — occurred almost exactly 3 years apart”

But all of this technology doesn’t ‘just happen’. It isn’t magic. It isn’t witchcraft or sorcery. Criss Angel (Mindfreak) and David Blaine probably own computers, but theirs operate in just the same way as ours.

It happens through a combination of physics and chemistry, even if great physicists like to laugh at chemistry.

“All science is either physics or stamp collecting” — Ernest Rutherford


Elements

Our understanding of the most basic building blocks of matter is still relatively new. Our understanding of oxygen is about the same age as America (the element was identified around 1774), while the periodic table was created in 1869, around the time Ulysses S. Grant was drunkenly beginning his time as President of the United States and a decade after Charles Darwin published On the Origin of Species.

The stories of how most of these elements were discovered, and why anyone was looking for them in the first place, are quite hilarious. Many of them are chronicled in the book A Short History of Nearly Everything, which describes in great detail some of the people responsible for these discoveries. Some were trying to turn urine into gold, others killed themselves by tasting their own concoctions, and still others threw parties involving laughing gas (sounds like a fun time).

But this modeling of the structure of the atom allowed later scientists to understand the properties of the elements, and which of them would readily let electrons flow between them.

Scientists also experimented with combining elements in all sorts of interesting ways.

After the periodic table was constructed, it was full of holes where elements were ‘supposed’ to be but hadn’t yet been discovered. They have all since been found (the last, francium, atomic number 87, in 1939), but many of them remained useless until relatively recently:

Since about 1950, every metal has found a niche. Gadolinium is perfect for magnetic resonance imaging (MRI). Neodymium makes unprecedentedly powerful lasers. Scandium, now used as a tungsten-like additive in aluminium baseball bats and bike frames, helped the Soviet Union make lightweight helicopters in the 1980s and purportedly even topped Soviet ICBM missiles stored underground in the Arctic, to help the nukes punch through sheets of ice. — The Disappearing Spoon

Technology

Advances in our understanding of the elements let us go far deeper in experimenting with metallurgy, doping, purification and the mixing of elements to produce different compounds. It was no longer just Johnny Tremain blacksmithing away at hot metal; we could mix elements together in beakers and labs to create gasoline, gunpowder and drugs, and to toy with light, conductors, electricity, fiber optics and copper.

Like so much innovation, the modeling of the elements in the periodic table in the late 1800s led to a revolution in the first half of the 1900s in mixing and combining them to create and transform matter. As explained in ‘The Idea Factory: Bell Labs and the Great Age of American Innovation’:

“the process allowed the Labs’ metallurgists to fabricate the purest materials in the history of the world — germanium that had perhaps one atom of impurity among 100 million atoms. If that was too hard to envision, the Labs executives had a handy analogy to make it even more clear. The purity of the materials produced at Bell Labs, in the 1950s, was akin to a pinch of salt sprinkled amid a 38-car freight train carrying in its boxcars nothing else but sugar” — Jon Gertner, ‘The Idea Factory’

Transistors

The earliest computing devices used mechanical relays or vacuum tubes as the electronic gates on which computing was built, but at Bell Labs in New Jersey researchers were experimenting with new ways of using the elements to do the same job much more efficiently.

The advances in quantum theory came at the same time that metallurgists at Bell Labs were finding ways to create new materials using novel purification techniques, chemical tricks, and recipes for combining rare and ordinary minerals. In seeking to solve some everyday problems, like vacuum-tube filaments that burned out too quickly or telephone-speaker diaphragms that sounded too tinny, they were mixing new alloys and developing methods to heat or cool concoctions until they performed better. By trial and error, like cooks in a kitchen, they were creating a revolution in materials science that would go hand in hand with the theoretical revolution that was occurring in quantum mechanics. — The Innovators

That work paid off on December 16th, 1947, when Bell Labs researchers John Bardeen and Walter Brattain demonstrated the first working transistor, built from the semiconductor germanium.

A transistor can be thought of as a switch that can be turned on or off, which makes it a perfect component for implementing the Boolean logic gates theorized by Claude Shannon (also the inventor of information theory), described earlier in Logic.
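
To make that concrete, here is a minimal Python sketch of the idea (my own illustration, not anything from Shannon or Bell Labs): treat each transistor as a switch that is either on or off, wire two of them into a NAND gate, and every other Boolean gate can be composed from that.

```python
# A minimal sketch of how on/off switches compose into Boolean logic.
# A NAND gate can be built from two transistors in series, and every
# other gate can be built out of NANDs.

def nand(a: bool, b: bool) -> bool:
    return not (a and b)

def not_(a: bool) -> bool:
    return nand(a, a)

def and_(a: bool, b: bool) -> bool:
    return not_(nand(a, b))

def or_(a: bool, b: bool) -> bool:
    return nand(not_(a), not_(b))

# Print the truth table for the composed gates.
for a in (False, True):
    for b in (False, True):
        print(a, b, "AND:", and_(a, b), "OR:", or_(a, b))
```

In real hardware the “functions” are physical circuits rather than Python, but the composition works the same way.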

“In the transistor and the new solid-state electronics, man may hope to find a brain to match atomic energy’s muscle”

The transistor is widely thought to be one of the greatest inventions of the entire 20th century, if not the greatest.

Integrated Circuits

Transistors are all well and good, but they have to be connected in order to form more complex components that provide the logic underpinning our current devices.

In the late 1950s, Jack Kilby at Texas Instruments in Dallas began experimenting with better ways to connect transistors and came up with what was later dubbed the ‘Monolithic Idea’:

multiple transistors could be placed on the same silicon chip, and printed metal wires could connect them into a circuit.

This may not seem very interesting, but it’s the foundation of all the computing you use today…the microchip. It’s also why it’s called Silicon Valley.

The integrated circuit greatly reduced the size and greatly increased the efficiency and speed of computing devices, and the market for them exploded in the 1960s with transistor radios, calculators and all sorts of other small, portable electronic devices. Since its creation, we have only gotten better and better at cramming more transistors onto smaller chips. Moore’s law states that the number of transistors we can put on a microchip doubles roughly every two years, which has allowed chips to get smaller and cheaper to manufacture. A computer that cost millions of dollars and weighed about two tons in the 1940s is like an abacus compared to the supercomputer phone you have right now, which may cost around $500 and fits in your pocket. The microchip currently guiding Voyager into the outer reaches of our solar system is about 200,000 times slower than an iPhone and has 250,000 times less memory.

Put simply, the iPhone 6’s clock is 32,600 times faster than the best Apollo-era computers, and it could perform instructions 120,000,000 times faster. You wouldn’t be wrong in saying an iPhone could be used to guide 120,000,000 Apollo-era spacecraft to the moon, all at the same time.
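
To get a feel for how “doubling every two years” compounds into numbers like these, here is a rough back-of-the-envelope sketch in Python. The starting point is the roughly 2,300 transistors on the Intel 4004 (which we’ll meet in a moment); the figures are illustrative, not exact industry data.

```python
# Back-of-the-envelope Moore's law: transistor counts doubling roughly
# every two years, starting from the Intel 4004's ~2,300 transistors
# in 1971. Illustrative only; real chips don't follow the curve exactly.

START_YEAR = 1971
START_TRANSISTORS = 2_300
DOUBLING_PERIOD_YEARS = 2

def projected_transistors(year: int) -> float:
    doublings = (year - START_YEAR) / DOUBLING_PERIOD_YEARS
    return START_TRANSISTORS * 2 ** doublings

for year in (1971, 1981, 1991, 2001, 2011, 2021):
    print(year, f"{projected_transistors(year):,.0f}")
```

Fifty years of doubling takes you from a few thousand transistors to tens of billions, which is roughly where today’s phone chips sit.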

Most of these early chips were special purpose, designed for a single task: the kind you would find in a walkie-talkie, a calculator or a walking teddy bear. But the integrated circuit led to the microprocessor, a chip designed not for one specific task but for general-purpose computing. If such a chip could be designed, companies would no longer need to contract out for custom chips; a single chip could perform all varieties of functions and replace all those custom designs.

Such a chip was invented at Intel and introduced in November 1971: the Intel 4004.

Like so many technologies, this one invention spawned a whole new wave of innovation. It launched the software industry, which would write programs against the instruction set built into this general-purpose machine, on which any computer can mimic any other. The vision of Alan Turing had become reality.
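
As a toy illustration of why that matters (a sketch of the concept only, nothing like the real Intel 4004’s instruction set): the hardware exposes one fixed set of instructions, and software decides what job gets done.

```python
# A toy "general-purpose machine": one fixed instruction set, many
# different programs. Purely an illustration of the idea, not a model
# of any real chip.

def run(program, x):
    """Execute a list of (instruction, argument) pairs on one register."""
    for op, arg in program:
        if op == "ADD":
            x += arg
        elif op == "MUL":
            x *= arg
        else:
            raise ValueError(f"unknown instruction: {op}")
    return x

# Two completely different "products", same machine, different software.
double_then_add_one = [("MUL", 2), ("ADD", 1)]
celsius_to_fahrenheit = [("MUL", 9), ("MUL", 1 / 5), ("ADD", 32)]

print(run(double_then_add_one, 10))       # 21
print(run(celsius_to_fahrenheit, 100))    # 212.0
```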

And this is where we currently stand. Chips have gotten smaller and cheaper, we’ve networked computers together and come up with better algorithms, but in a sense we are still simply building on top of that programmable circuit created in 1971.

Quantum Computing

This could very well change in the near future with headway being made in the field of Quantum Computing. Computing is currently implemented using electricity on silicon transistors, but this is not the only theoretical implementation, not by a long shot.

A frighteningly simplified explanation is that we can store and manipulate quantum bits of information using quantum particles such as electrons and photons. Each can exist in several ‘spin’ or polarization states at once, and each of those states can represent information. I won’t delve much deeper; I’ll let Justin Trudeau explain for me:

And here are a few more good summaries. The implications would be enormous, and probably a bit scary:

If scientists could build a quantum computer, it would be able to perform calculations with such enormous speed that it would make a modern supercomputer look like a broken abacus. — The Code Book

For the moment, the big fear is that quantum computers could easily crack any and all encryption algorithms we currently have available to us. From TechCrunch:

The National Security Agency, too, has sounded the alarm on the risks to cybersecurity in the quantum computing age. The NSA’s “Commercial National Security Algorithm Suite and Quantum Computing FAQ” says that “many experts predict a quantum computer capable of effectively breaking public key cryptography” within “a few decades,” and that the time to come up with solutions is now.

These ideas are getting less theoretical and more practical:

The elements of quantum computing have been around for decades, but it’s only in the past few years that a commercial computer that could be called “quantum” has been built by a company called D-Wave. Announced in January, the D-Wave 2000Q can “solve larger problems than was previously possible, with faster performance, providing a big step toward production applications in optimization, cybersecurity, machine learning and sampling.”

And from The Next Web:

Processors for quantum devices are measured in qubits, with today’s most advanced ones coming in at around 50 qubits. At this size they’re the equivalent to a supercomputer. At just 60 qubits, it would exceed the power of every supercomputer on the planet combined, and then some.
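
The jump from 50 to 60 qubits sounds small, but the state a quantum machine works with doubles with every qubit added: an n-qubit state is described by 2^n complex amplitudes. Here is a rough Python sketch of the classical bookkeeping that would be needed to track that state (illustrative figures only, assuming 16 bytes per amplitude):

```python
# Why each extra qubit matters: an n-qubit state is described by 2**n
# complex amplitudes, so simulating it classically needs exponentially
# more memory. Rough, illustrative figures only.

BYTES_PER_AMPLITUDE = 16  # one double-precision complex number

for n in (30, 50, 60):
    amplitudes = 2 ** n
    gigabytes = amplitudes * BYTES_PER_AMPLITUDE / 1e9
    print(f"{n} qubits: 2^{n} = {amplitudes:,} amplitudes, "
          f"~{gigabytes:,.0f} GB to store them classically")
```

Ten more qubits means roughly a thousand times more state to keep track of, which is why the claims above scale so abruptly.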

I will only say it’s likely that our current definition of computing will get significantly more complex in the future. How far into the future? Not sure, but let’s find out:

Us, finding out
