Where sensory experience fails: New methods allow the discovery of nano-worlds

R Schleicher-Tappeser
8 min read · Sep 23, 2022


The history of technology shows: Nuclear power for energy supply is a hopelessly outdated technology / Episode 4/12

Electron microscope Siemens Elmiskop 1, made 1945–1955, © CC BY-NC-SA, Science Museum

The first three episodes of this series dealt with the history of nuclear energy. We saw that there have been no major technological innovations in nuclear energy generation since the development of the light water reactor. After the Second World War, however, the discovery of new worlds at the nano-scale took off in various directions. In parallel to the upheavals in physics that had led to a new understanding of matter and energy at the beginning of the last century (see episode 1), new methods began to be developed that allowed structures the size of atoms and molecules to be studied ever more closely. As always in science, the methods of investigation continued to evolve alongside new discoveries.

Electricity as the key to new worlds

A prerequisite for the development of these new methods was the intensive study of the mysterious phenomenon of electricity. With Alessandro Volta's zinc-copper battery, a source of direct voltage had been available since 1800, a tool that made it possible to investigate the laws of electricity systematically. Electrolysis was discovered in the same year. In 1820, Hans Christian Ørsted discovered that electric currents generate magnetic fields; Michael Faraday then showed the reverse, that changing magnetic fields can generate electric currents. On this basis, increasingly practical electric motors and generators were developed in the 1820s and 1830s, and the first industrially used alternating current generator was built in 1849. Driven by water wheels and steam engines, machines still based on principles we grasp from everyday macroscopic experience, a new form of energy thus became available alongside mechanical and thermal energy, allowing entirely new applications. Building on Faraday's comprehensive investigations, James Clerk Maxwell formulated a comprehensive theory of electrodynamics in 1864. With its description of "fields" and "waves", his four "Maxwell's equations" explained all electrical and magnetic phenomena known at the time, and also the nature of light: light itself is electromagnetic radiation. This laid the foundation for the diversification and triumph of classical electrical engineering.
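For reference, in the compact vector notation later introduced by Oliver Heaviside (Maxwell's own 1864 presentation used many more equations), the four equations read:

$$\nabla \cdot \mathbf{E} = \frac{\rho}{\varepsilon_0} \qquad \nabla \cdot \mathbf{B} = 0$$

$$\nabla \times \mathbf{E} = -\frac{\partial \mathbf{B}}{\partial t} \qquad \nabla \times \mathbf{B} = \mu_0 \mathbf{J} + \mu_0 \varepsilon_0 \frac{\partial \mathbf{E}}{\partial t}$$

In empty space, the last two combine into a wave equation whose propagation speed, $1/\sqrt{\mu_0 \varepsilon_0}$, equals the measured speed of light; this is what identified light as an electromagnetic wave.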

From the discovery of the electron to the mass spectrometer

In the second half of the 19th century, extensive experiments were carried out with electrical discharges in gas-filled and evacuated tubes, producing different types of rays depending on the experimental set-up. Of particular interest were the cathode rays, which travelled from a (preferably heated) cathode towards a positively charged anode and could also be detected outside the tube through suitable windows, usually with photographic plates or fluorescent screens. Around 1890, it was discovered that the beam could be deflected by applying an electrical voltage to additional electrodes in the cathode ray tube. In 1897, Joseph J. Thomson proved that these rays consist of electrons, negatively charged particles whose existence others had already predicted. He was awarded the Nobel Prize for this discovery of the first subatomic particle.

In 1897, Ferdinand Braun then used the deflection of cathode rays by an electrical voltage to develop the "Braun tube" or CRT (cathode ray tube), the basis for visual representations on electronic screens. It became a central instrument of human-machine interaction in television sets, all kinds of measuring devices and later also computers, until it was displaced by semiconductor-based flat screens from 2000 onwards. Translating measurement results at the atomic level into images that our senses can perceive was an essential prerequisite for the rapid progress of the nanosciences.

Thomson calculated the deflection paths and experimented further with positively charged particle beams ("canal rays", produced behind a perforated cathode), thus laying the foundations for the mass spectrometer. Arthur Jeffrey Dempster built the first highly sensitive device of this kind in 1918. It could measure the mass of atoms and molecules very precisely and analyse the composition of mixtures of substances: the sample is transferred into the gas phase, ionised, accelerated, and then sorted with the help of magnetic fields. With this technique, he succeeded in identifying the uranium isotope U-235 in 1935.
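A minimal sketch of this sorting principle, with illustrative rather than historical numbers: an ion accelerated through a voltage and then bent by a magnetic field follows a circular path whose radius grows with its mass, so isotopes such as U-235 and U-238 land at measurably different positions.

```python
# Sketch of a magnetic-sector mass spectrometer's sorting principle.
# All operating values (voltage, field strength) are illustrative assumptions.
import math

E_CHARGE = 1.602e-19   # elementary charge in coulombs
AMU = 1.661e-27        # atomic mass unit in kilograms

def bend_radius(mass_amu, charge=1, accel_voltage=2000.0, b_field=0.5):
    """Radius (in metres) of the ion's circular path in the field.

    Acceleration gives q*V = (1/2)*m*v^2; the magnetic field then
    bends the ion onto a circle of radius r = m*v / (q*B).
    """
    m = mass_amu * AMU
    q = charge * E_CHARGE
    v = math.sqrt(2 * q * accel_voltage / m)
    return m * v / (q * b_field)

# The two uranium isotopes separate by roughly a millimetre:
for isotope in (235.0, 238.0):
    print(f"U-{isotope:.0f}: r = {bend_radius(isotope) * 100:.2f} cm")
```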

From the light to the electron microscope

Ernst Karl Abbe had already demonstrated in 1873 that the resolution of conventional light microscopes, which had been perfected since around 1600, is limited by the wavelength of light: only objects larger than about 200 nm (a nanometre is one-millionth of a millimetre), roughly the size of the smallest bacteria, can be discerned with them. This raised the question of whether there was electromagnetic radiation with shorter wavelengths that could be used to detect smaller structures.
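Abbe's limit fits in one line. With green light of about 550 nm and a good oil-immersion objective (numerical aperture NA of about 1.4, a typical modern value used here only for illustration), it yields the familiar figure:

$$d = \frac{\lambda}{2\,\mathrm{NA}} \approx \frac{550\ \text{nm}}{2 \times 1.4} \approx 200\ \text{nm}$$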

In 1895, Wilhelm Conrad Röntgen demonstrated that when cathode rays (electrons) hit a metal anode, secondary radiation is emitted that can penetrate materials that are opaque to light. He recognised the great potential of these X-rays for investigations in medicine and solid-state physics. Today, X-ray microscopy can be used to examine structures about ten times finer than those discernible with light microscopy. Because X-rays interact with the materials under investigation differently than light does, they also provide additional insights.

A further leap became possible only after the development of quantum theory between 1900 and 1930 had shown that particles and waves are merely two different ways of describing the same entities, so that electron beams, too, can be understood as waves with very short wavelengths (see episode 1 of this series). Building on this, Ernst Ruska and Max Knoll built the first electron microscope in 1931. With a commercial device from Siemens, viruses could be photographed for the first time in 1938. Today's electron microscopes can resolve structures of 0.1 nm, the size of atoms.
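The underlying relation is de Broglie's λ = h/p: the faster the electron, the shorter its wavelength. A quick sketch, with arbitrary example voltages and ignoring the relativistic corrections that become noticeable above roughly 50 kV:

```python
# De Broglie wavelength of an electron accelerated through a voltage.
# Non-relativistic approximation; the example voltages are arbitrary.
import math

H = 6.626e-34         # Planck constant, J*s
M_E = 9.109e-31       # electron mass, kg
E_CHARGE = 1.602e-19  # elementary charge, C

def de_broglie_wavelength(accel_voltage):
    """Wavelength in metres for kinetic energy e*V, using lambda = h/p."""
    momentum = math.sqrt(2 * M_E * E_CHARGE * accel_voltage)
    return H / momentum

for volts in (100, 10_000, 100_000):
    nm = de_broglie_wavelength(volts) * 1e9
    print(f"{volts:>7} V -> {nm:.4f} nm")
```

Even at 100 V the wavelength is already around 0.12 nm, thousands of times shorter than that of visible light, which is why electron optics could leave Abbe's 200 nm barrier far behind.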

From prism to MRI

In addition to microscopes, which generate images of structures, it is above all the many variants of spectroscopy, in which rays are fanned out into a spectrum of frequencies, that make it possible to obtain information about the properties, fine structure and composition of materials. This includes not only the mass spectrometry already mentioned above, but also the spectral analysis of electromagnetic radiation. As early as the 17th century, Newton showed, by fanning sunlight out through a prism, that white light is composed of a spectrum of different colours. In 1859, Kirchhoff and Bunsen discovered that every element in the glowing gaseous state shows characteristic spectral lines, i.e. emits light of very specific wavelengths. This subsequently made it possible to identify individual elements and to measure the quantitative composition of mixtures. The phenomenon of sharp spectral lines led to the realisation that only very specific energy states are possible in atoms, and ultimately to the development of quantum theory.
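Hydrogen's visible lines are the classic example of how sharp lines betray discrete energy levels. The empirical Rydberg formula, found in 1888 and only later explained by quantum theory, predicts them to a fraction of a nanometre; a minimal sketch:

```python
# The visible (Balmer) lines of hydrogen from the Rydberg formula:
# 1/lambda = R * (1/2^2 - 1/n^2) for transitions n -> 2.
RYDBERG = 1.097e7  # Rydberg constant for hydrogen, in 1/m

def balmer_wavelength_nm(n):
    """Wavelength of the hydrogen line for the transition n -> 2."""
    inverse_wavelength = RYDBERG * (1 / 2**2 - 1 / n**2)
    return 1e9 / inverse_wavelength

for n in range(3, 7):
    print(f"n = {n} -> 2: {balmer_wavelength_nm(n):.1f} nm")
# Prints the familiar red, blue-green and violet lines:
# 656.3, 486.2, 434.1 and 410.2 nm.
```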

The understanding of fundamental principles of the structure and internal dynamics of atomic nuclei, atoms and molecules led to a veritable explosion of spectroscopic methods for analysing matter from 1930 onwards, and even more so from 1950. Gradually, it was discovered that the spectral analysis of all kinds of radiation, with different excitation methods, can provide detailed information about the composition, structure and dynamics of matter, from states in atomic nuclei to the complex structure of large molecules. Today, Wikipedia alone contains entries on around forty different methods.

In addition to the growing knowledge of the dynamics of the nano-world and of the phenomena that can be exploited for investigation, microelectronics and digital data processing brought previously unimaginable improvements in investigation methods from the 1960s onwards.

Four-chamber cardiovascular magnetic resonance imaging, © Wikimedia Commons

This can be seen in the example of nuclear magnetic resonance spectroscopy, known to almost everyone today from medicine as MRI (magnetic resonance imaging). In 1896, it was discovered that spectral lines split further in a magnetic field (the Zeeman effect), which was soon attributed to the fact that an atom behaves like a spinning top in which electrical charges circulate, generating a magnetic field. Quantum theory showed that this magnetic field, too, can only assume certain states (energy levels). At the end of the 1920s, it was discovered that atomic nuclei also exhibit such a magnetically effective rotation (nuclear spin), although its magnetic effect is about a thousand times weaker than that of the atom's electron shell. In 1946, E. M. Purcell and Felix Bloch independently showed that in a strong magnetic field, energy can be transferred by resonance between an external alternating field and the nuclear spins. The first commercial nuclear magnetic resonance spectrometer was built in Palo Alto in 1952. With ever stronger magnets, today built with superconducting coils, the accuracy with which both the nuclear spin itself and the influence of the surrounding molecular structure can be measured has increased by many orders of magnitude.
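The resonance condition itself is strikingly simple: a nucleus in a field B precesses at the Larmor frequency f = γ·B/2π and absorbs energy when the applied alternating field matches it. A sketch with the proton's value of about 42.6 MHz per tesla shows why MRI scanners operate at radio frequencies:

```python
# Proton resonance (Larmor) frequencies at common magnet strengths.
GAMMA_PROTON = 42.577  # proton gyromagnetic ratio / 2*pi, in MHz per tesla

# 1.5 T and 3 T are typical clinical MRI fields; ~11.7 T corresponds
# to a 500 MHz laboratory NMR spectrometer.
for b_field in (0.5, 1.5, 3.0, 11.7):
    print(f"B = {b_field:>4} T -> resonance at {GAMMA_PROTON * b_field:6.1f} MHz")
```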

Further significant improvements came from pulsed excitation and new computational methods, above all the Fourier transform. Thus, nuclear magnetic resonance spectroscopy became an essential tool of chemical structure analysis. In the 1970s and 1980s, the technique was further developed into magnetic resonance tomography with the help of spatially varying magnetic fields, enabling three-dimensional examinations, the calculation of pictorial representations and the investigation of flow processes. This set the stage for a breakthrough into routine applications in medicine, organic chemistry, biochemistry and materials science, which was then enabled above all by further improvements in computer technology and superconductivity.
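A toy version of the pulsed, Fourier-transform idea, with invented frequencies: instead of sweeping through frequencies one by one, a short pulse excites all nuclei at once, the decaying signal they emit together (the "free induction decay") is recorded, and a Fourier transform separates it back into individual resonance lines.

```python
# Toy free-induction-decay signal and its Fourier-transform spectrum.
# The two resonance frequencies (150 Hz, 400 Hz) are invented examples.
import numpy as np

SAMPLE_RATE = 2000.0                      # samples per second
t = np.arange(0, 1.0, 1 / SAMPLE_RATE)    # one second of recorded signal

# Two decaying oscillations, as all excited nuclei "ring down" together:
fid = np.sin(2 * np.pi * 150 * t) + 0.5 * np.sin(2 * np.pi * 400 * t)
fid *= np.exp(-t / 0.2)                    # decay with a 0.2 s time constant

# The Fourier transform turns the time signal back into a spectrum:
spectrum = np.abs(np.fft.rfft(fid))
freqs = np.fft.rfftfreq(len(fid), 1 / SAMPLE_RATE)

# Report the local maxima above a threshold: the two lines reappear.
peaks = [i for i in range(1, len(spectrum) - 1)
         if spectrum[i - 1] < spectrum[i] > spectrum[i + 1]
         and spectrum[i] > 0.1 * spectrum.max()]
for i in peaks:
    print(f"resonance line near {freqs[i]:.0f} Hz")
```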

The growing toolbox for exploring the nano-world enables new applications whose origins we are mostly unaware of

These examples show that in the wake of the revolution in the understanding of energy and matter in the first half of the last century, entirely new methods of investigation have become available since the Second World War, enabling a broad exploration of the newly opened-up nano-world.

After the discovery of nuclear fission, the military and civilian use of nuclear energy was only a first and, from today's perspective, rather crude application in the advance into this new nano-world, which our everyday understanding cannot picture at all. Since the middle of the last century, the new research methods have made many developments possible whose origins and diversity we can hardly grasp in everyday life. Only in isolated cases is the public aware of their fundamentally new potentials, and dangers, for the development of humankind.

For this series, these developments are of particular interest with regard to energy supply. In the first half of the last century, natural science was primarily concerned with the relationship between energy and matter. After the Second World War, the importance of information came more and more to the fore. Sometimes we are in danger of forgetting that all three are closely linked, and that for all the fascination of dealing with information, dealing with energy and matter remains vital.

Next episode (5/12):
Silicon-based virtual worlds: nanosciences revolutionise information technology

Previous episodes


R Schleicher-Tappeser

SUSTAINABLE STRATEGIES. Writes about Technology and Society. Based in Berlin. Five decades of experience in energy, transport, climate and innovation policies.