QT/ Quantum physics: Superconducting nanowires detect single protein ions

Quantum news biweekly vol.65, 4th December — 19th December

TL;DR

  • An international research team has achieved a breakthrough in the detection of protein ions: Due to their high energy sensitivity, superconducting nanowire detectors achieve almost 100% quantum efficiency and exceed the detection efficiency of conventional ion detectors at low energies by a factor of up to 1,000. In contrast to conventional detectors, they can also distinguish macromolecules by their impact energy. This allows for more sensitive detection of proteins and provides additional information in mass spectrometry.
  • A radical theory that consistently unifies gravity and quantum mechanics while preserving Einstein’s classical concept of spacetime was announced by physicists.
  • Two nanotechnology approaches converge by employing a new generation of fabrication technology. It combines the scalability of semiconductor technology with the atomic dimensions enabled by self-assembly.
  • Researchers have produced the first theoretical demonstration that the magnetic state of an atomically thin material, α-RuCl3, can be controlled solely by placing it into an optical cavity. Crucially, the cavity vacuum fluctuations alone are sufficient to change the material’s magnetic order from a zigzag antiferromagnet into a ferromagnet.
  • A recent study by European scientists shows that highly sensitive sensors based on color centers in a diamond can be used to record electrical activity from neurons in living brain tissue.
  • A team of physicists, computer scientists and information machine specialists has created a quantum computer with the largest-ever number of logical quantum bits. In their paper, the group describes how they built their computer and how well it performed when tested.
  • Physicists at the University of Regensburg have found a way to manipulate the quantum state of individual electrons using a microscope with atomic resolution.
  • On the highway of heat transfer, thermal energy is moved by way of quantum particles called phonons. But at the nanoscale of today’s most cutting-edge semiconductors, those phonons don’t remove enough heat. That’s why researchers are focused on opening a new nanoscale lane on the heat transfer highway by using hybrid quasiparticles called “polaritons.”
  • Physicists suggest that the Wiedemann-Franz law should approximately hold for one type of quantum material, the cuprate superconductors.
  • Researchers discover a new type of ultrafast magnetic switching by investigating fluctuations that normally tend to interfere with experiments as noise.
  • And more!

Quantum Computing Market

According to the recent market research report ‘Quantum Computing Market with COVID-19 impact by Offering (Systems and Services), Deployment (On Premises and Cloud Based), Application, Technology, End-use Industry and Region — Global Forecast to 2026’, published by MarketsandMarkets, the Quantum Computing market is expected to grow from USD 472 million in 2021 to USD 1,765 million by 2026, at a CAGR of 30.2%. The early adoption of quantum computing in the banking and finance sector is expected to fuel the growth of the market globally. Other key factors contributing to the growth of the quantum computing market include rising investments by governments of different countries to carry out research and development activities related to quantum computing technology. Several companies are focusing on the adoption of QCaaS post-COVID-19. This, in turn, is expected to contribute to the growth of the quantum computing market. However, stability and error correction issues are expected to restrain the growth of the market.

According to the ‘Quantum Computing Market Research Report: By Offering, Deployment Type, Application, Technology, Industry — Industry Share, Growth, Drivers, Trends and Demand Forecast to 2030’ report, the quantum computing market is projected to reach $64,988 million by 2030. Machine learning (ML) is expected to progress at the highest CAGR among all application categories during the forecast period, owing to the integration of quantum computing into ML to improve its use cases.

Latest Research

Highly sensitive single-molecule detection of macromolecule ion beams

by Marcel Strauß, Armin Shayeghi, Martin F. X. Mauser, Philipp Geyer, et al in Science Advances

An international research team led by quantum physicist Markus Arndt (University of Vienna) has achieved a breakthrough in the detection of protein ions: Due to their high energy sensitivity, superconducting nanowire detectors achieve almost 100% quantum efficiency and exceed the detection efficiency of conventional ion detectors at low energies by a factor of up to 1,000. In contrast to conventional detectors, they can also distinguish macromolecules by their impact energy. This allows for more sensitive detection of proteins and provides additional information in mass spectrometry.

The detection, identification, and analysis of macromolecules is of interest in many areas of the life sciences, including protein research, diagnostics, and analytics. Mass spectrometry is often used as a detection system — a method that typically separates charged particles (ions) according to their mass-to-charge ratio and measures the intensity of the signals generated by a detector. This provides information about the relative abundance of the different types of ions and therefore the composition of the sample.

However, conventional detectors have only been able to achieve high detection efficiency and spatial resolution for particles with high impact energy — a limitation that has now been overcome by an international team of researchers using superconducting nanowire detectors. In the current study, a European consortium coordinated by the University of Vienna, with partners in Delft (Single Quantum), Lausanne (EPFL), Almere (MSVision) and Basel (University), demonstrates for the first time the use of superconducting nanowires as excellent detectors for protein beams in so-called quadrupole mass spectrometry.

Quadrupole mass spectrometry with superconducting single particle detection.

Ions from the sample to be analyzed are fed into a quadrupole mass spectrometer where they are filtered.
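
To unpack “filtered”: in a quadrupole, a DC voltage U and an RF voltage of amplitude V and angular frequency Ω are applied to four rod electrodes, and an ion flies through on a stable trajectory only for certain values of the standard Mathieu parameters. As a textbook orientation (not taken from the paper; e is the elementary charge, z the charge number, m the ion mass, r0 the field radius):

```latex
a_u = \frac{8\,z\,e\,U}{m\,r_0^{2}\,\Omega^{2}}, \qquad
q_u = \frac{4\,z\,e\,V}{m\,r_0^{2}\,\Omega^{2}}
```

Scanning U and V at a fixed ratio sweeps a narrow stability window across the mass-to-charge spectrum, so only one m/z species at a time reaches the detector.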

“If we now use superconducting nanowires instead of conventional detectors, we can even identify particles that hit the detector with low kinetic energy,” explains project leader Markus Arndt from the Quantum Nanophysics Group at the Faculty of Physics at the University of Vienna.

This is made possible by a special material property (superconductivity) of the nanowire detectors. The key to this detection method is that nanowires enter a superconducting state at very low temperatures, in which they lose their electrical resistance and allow lossless current flow. Excitation of the superconducting nanowires by incoming ions causes a return to the normal conducting state (quantum transition). The change in the electrical properties of the nanowires during this transition is interpreted as a detection signal.
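
As a toy illustration of this detection logic (our sketch with arbitrary units and a made-up threshold, not the consortium’s analysis code), a nanowire “clicks” whenever an impact deposits enough energy to break superconductivity locally, and the pulse itself carries energy information:

```python
import random

# Illustrative toy model of a superconducting-nanowire ion detector.
# Assumption: any impact depositing more than E_THRESHOLD drives the wire
# into the normal-conducting state and produces a measurable voltage pulse.
E_THRESHOLD = 0.1  # arbitrary energy units, hypothetical value

def detect(impact_energy):
    """Return a detection event if the deposited energy exceeds the threshold."""
    if impact_energy > E_THRESHOLD:
        # The recorded pulse encodes the impact energy, which is what lets
        # the detector classify ions by kinetic energy, not just count them.
        return {"click": True, "energy": impact_energy}
    return {"click": False, "energy": None}

# Even slow (low-kinetic-energy) ions register, which is exactly the regime
# where conventional ion detectors lose efficiency.
random.seed(0)
for e in [random.uniform(0.05, 5.0) for _ in range(5)]:
    print(detect(e))
```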

“With the nanowire detectors we use,” says first author Marcel Strauß, “we exploit the quantum transition from the superconducting to the normal conducting state and can thus outperform conventional ion detectors by up to three orders of magnitude.”

Influence of energy/mass/momentum/structure on the detection mechanism.

Indeed, nanowire detectors have a remarkable quantum yield at exceptionally low impact energies — and redefine the possibilities of conventional detectors: “In addition, a mass spectrometer adapted with such a quantum sensor can not only distinguish molecules according to their mass-to-charge state, but also classify them according to their kinetic energy. This improves the detection and offers the possibility of better spatial resolution,” says Marcel Strauß. Nanowire detectors can find new applications in mass spectrometry, molecular spectroscopy, molecular deflectometry, or quantum interferometry of molecules, where high efficiency and good resolution are required, especially at low impact energy.

Gravitationally induced decoherence vs space-time diffusion: testing the quantum nature of gravity

by Jonathan Oppenheim, Carlo Sparaciari, Barbara Šoda, Zachary Weller-Davies in Nature Communications

A radical theory that consistently unifies gravity and quantum mechanics while preserving Einstein’s classical concept of spacetime is announced by UCL (University College London) physicists.

Modern physics is founded upon two pillars: quantum theory on the one hand, which governs the smallest particles in the universe, and Einstein’s theory of general relativity on the other, which explains gravity through the bending of spacetime. But these two theories are in contradiction with each other and a reconciliation has remained elusive for over a century.

The prevailing assumption has been that Einstein’s theory of gravity must be modified, or “quantised,” in order to fit within quantum theory. This is the approach of two leading candidates for a quantum theory of gravity, string theory and loop quantum gravity. But a new theory, developed by Professor Jonathan Oppenheim (UCL Physics & Astronomy), challenges that consensus and takes an alternative approach by suggesting that spacetime may be classical — that is, not governed by quantum theory at all.

Instead of modifying spacetime, the theory — dubbed a “postquantum theory of classical gravity” — modifies quantum theory and predicts an intrinsic breakdown in predictability that is mediated by spacetime itself. This results in random and violent fluctuations in spacetime that are larger than envisaged under quantum theory, rendering the apparent weight of objects unpredictable if measured precisely enough.

A second paper, led by Professor Oppenheim’s former PhD students, looks at some of the consequences of the theory and proposes an experiment to test it: measuring a mass very precisely to see if its weight appears to fluctuate over time. For example, the International Bureau of Weights and Measures in France routinely weighs a 1 kg mass which used to be the 1 kg standard. If the fluctuations in measurements of this 1 kg mass are smaller than required for mathematical consistency, the theory can be ruled out.
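
As a crude numerical illustration of the proposed test (a toy model added here, with entirely hypothetical noise figures), one asks whether repeated weighings show excess variance beyond the instrument’s known noise floor:

```python
import numpy as np

# Toy sketch of the "fluctuating weight" test. Assumptions: Gaussian
# instrument noise of known size, plus a hypothetical extra fluctuation
# of the kind the postquantum theory predicts.
rng = np.random.default_rng(0)

instrument_sigma = 1e-9   # assumed weighing noise (kg), hypothetical
spacetime_sigma = 5e-10   # hypothetical spacetime-induced fluctuation (kg)

n = 100_000
weighings = (1.0
             + rng.normal(0.0, instrument_sigma, n)
             + rng.normal(0.0, spacetime_sigma, n))

# Subtract the known instrument variance; what remains estimates the
# excess fluctuation. If it fell below the theory's consistency bound,
# the postquantum model could be ruled out.
excess_var = weighings.var() - instrument_sigma**2
print(f"estimated excess std: {max(excess_var, 0.0) ** 0.5:.2e} kg")
```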

The outcome of the experiment, or other evidence emerging which would confirm the quantum vs classical nature of spacetime, is the subject of a 5000:1 odds bet between Professor Oppenheim and Professor Carlo Rovelli and Dr Geoff Penington — leading proponents of quantum loop gravity and string theory respectively. For the past five years, the UCL research group has been stress-testing the theory, and exploring its consequences.

Professor Oppenheim said: “Quantum theory and Einstein’s theory of general relativity are mathematically incompatible with each other, so it’s important to understand how this contradiction is resolved. Should spacetime be quantised, or should we modify quantum theory, or is it something else entirely? Now that we have a consistent fundamental theory in which spacetime does not get quantised, it’s anybody’s guess.”

Co-author Zach Weller-Davies, who as a PhD student at UCL helped develop the experimental proposal and made key contributions to the theory itself, said: “This discovery challenges our understanding of the fundamental nature of gravity but also offers avenues to probe its potential quantum nature.

“We have shown that if spacetime doesn’t have a quantum nature, then there must be random fluctuations in the curvature of spacetime which have a particular signature that can be verified experimentally.

“In both quantum gravity and classical gravity, spacetime must be undergoing violent and random fluctuations all around us, but on a scale which we haven’t yet been able to detect. But if spacetime is classical, the fluctuations have to be larger than a certain scale, and this scale can be determined by another experiment where we test how long we can put a heavy atom in superposition of being in two different locations.”

Co-authors Dr Carlo Sparaciari and Dr Barbara Šoda, whose analytical and numerical calculations helped guide the project, expressed hope that these experiments could determine whether the pursuit of a quantum theory of gravity is the right approach.

Dr Šoda (formerly UCL Physics & Astronomy, now at the Perimeter Institute of Theoretical Physics, Canada) said: “Because gravity is made manifest through the bending of space and time, we can think of the question in terms of whether the rate at which time flows has a quantum nature, or classical nature. And testing this is almost as simple as testing whether the weight of a mass is constant, or appears to fluctuate in a particular way.”

Dr Sparaciari (UCL Physics & Astronomy) said: “While the experimental concept is simple, the weighing of the object needs to be carried out with extreme precision. But what I find exciting is that starting from very general assumptions, we can prove a clear relationship between two measurable quantities — the scale of the spacetime fluctuations, and how long objects like atoms or apples can be put in quantum superposition of two different locations. We can then determine these two quantities experimentally.”

The proposal to test whether spacetime is classical by looking for random fluctuations in mass is complementary to another experimental proposal which aims to verify the quantum nature of spacetime by looking for something called “gravitationally mediated entanglement.”

Professor Sougato Bose (UCL Physics & Astronomy), who was not involved with the announcement today, but was among those to first propose the entanglement experiment, said: “Experiments to test the nature of spacetime will take a large-scale effort, but they’re of huge importance from the perspective of understanding the fundamental laws of nature. I believe these experiments are within reach — these things are difficult to predict, but perhaps we’ll know the answer within the next 20 years.”

The postquantum theory has implications beyond gravity. The infamous and problematic “measurement postulate” of quantum theory is not needed, since quantum superpositions necessarily localise through their interaction with classical spacetime. The theory was motivated by Professor Oppenheim’s attempt to resolve the black hole information problem. According to standard quantum theory, an object going into a black hole should be radiated back out in some way, as information cannot be destroyed, but this violates general relativity, which says you can never know about objects that cross the black hole’s event horizon. The new theory allows for information to be destroyed, due to a fundamental breakdown in predictability.

Self-assembled photonic cavities with atomic-scale confinement

by Ali Nawaz Babar, Thor August Schimmell Weis, Konstantinos Tsoukalas, Shima Kadkhodazadeh, Guillermo Arregui, Babak Vosoughi Lahijani, Søren Stobbe in Nature

A central goal in quantum optics and photonics is to increase the strength of the interaction between light and matter to produce, e.g., better photodetectors or quantum light sources. The best way to do that is to use optical resonators that store light for a long time, making it interact more strongly with matter. If the resonator is also very small, such that light is squeezed into a tiny region of space, the interaction is enhanced even further. The ideal resonator would store light for a long time in a region at the size of a single atom.

Physicists and engineers have struggled for decades with the question of how small optical resonators can be made without becoming very lossy, which is equivalent to asking how small you can make a semiconductor device. The semiconductor industry’s roadmap for the next 15 years predicts that the smallest possible width of a semiconductor structure will be no less than 8 nm, which is several tens of atoms wide. The team behind the new paper, Associate Professor Søren Stobbe and his colleagues at DTU Electro, demonstrated 8 nm cavities last year, but now they propose and demonstrate a novel approach for fabricating a self-assembled cavity with an air void at the scale of a few atoms.

To briefly explain the experiment, two halves of silicon structures are suspended on springs, although in the first step, the silicon device is firmly attached to a layer of glass. The devices are made by conventional semiconductor technology, so the two halves are a few tens of nanometers apart. Upon selective etching of the glass, the structure is released and now only suspended by the springs, and because the two halves are fabricated so close to each other, they attract due to surface forces. By carefully engineering the design of the silicon structures, the result is a self-assembled resonator with bowtie-shaped gaps at the atomic scale surrounded by silicon mirrors.

“We are far from a circuit that builds itself completely. But we have succeeded in converging two approaches that have been travelling along parallel tracks so far. And it allowed us to build a silicon resonator with unprecedented miniaturization,” says Søren Stobbe.

Deterministic in-plane self-assembly of suspended silicon platforms by surface forces.

One approach — the top-down approach — is behind the spectacular development we have seen with silicon-based semiconductor technologies. Here, crudely put, you start from a silicon block and work on making nanostructures out of it. The other approach — the bottom-up approach — is where you try to have a nanotechnological system assemble itself. It aims to mimic biological systems, such as plants or animals, built through biological or chemical processes. These two approaches are at the very core of what defines nanotechnology. But the problem is that these two approaches have so far been disconnected: Semiconductors are scalable but cannot reach the atomic scale, and while self-assembled structures have long been operating at atomic scales, they offer no architecture for the interconnects to the external world.

“The interesting thing would be if we could produce an electronic circuit that built itself — just like what happens with humans as they grow but with inorganic semiconductor materials. That would be true hierarchical self-assembly. We use the new self-assembly concept for photonic resonators, which may be used in electronics, nanorobotics, sensors, quantum technologies, and much more. Then, we would really be able to harvest the full potential of nanotechnology. The research community is many breakthroughs away from realizing that vision, but I hope we have taken the first steps,” says Guillermo Arregui, who co-supervised the project.

Supposing a combination of the two approaches is possible, the team at DTU Electro set out to create nanostructures that surpass the limits of conventional lithography and etching despite using nothing more than conventional lithography and etching. Their idea was to use two surface forces, namely the Casimir force for attracting the two halves and the van der Waals force for making them stick together. These two forces are rooted in the same underlying effect: quantum fluctuations.
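
For orientation, the Casimir attraction between two ideal parallel plates at separation d has the textbook form (a standard result, not specific to this paper):

```latex
P(d) = -\frac{\pi^{2}\,\hbar c}{240\,d^{4}}
```

The steep 1/d⁴ scaling is why the force is irrelevant during lithography yet strong enough to pull the two released halves together once they sit a few tens of nanometers apart.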

The researchers made photonic cavities that confine photons to air gaps so small that determining their exact size was impossible, even with a transmission electron microscope. The smallest they built, however, are the size of just 1–3 silicon atoms.

“Even if the self-assembly takes care of reaching these extreme dimensions, the requirements for the nanofabrication are no less extreme. For example, structural imperfections are typically on the scale of several nanometers. Still, if there are defects at this scale, the two halves will only meet and touch at the three largest defects. We are really pushing the limits here, even though we make our devices in one of the very best university cleanrooms in the world,” says Ali Nawaz Babar, a PhD student at the NanoPhoton Center of Excellence at DTU Electro and first author of the new paper.

“The advantage of self-assembly is that you can make tiny things. You can build unique materials with amazing properties. But today, you can’t use it for anything you plug into a power outlet. You can’t connect it to the rest of the world. So, you need all the usual semiconductor technology for making the wires or waveguides to connect whatever you have self-assembled to the external world.”

Controlling the magnetic state of the proximate quantum spin liquid α-RuCl3 with an optical cavity

by Emil Viñas Boström, Adithya Sriram, Martin Claassen, Angel Rubio in npj Computational Materials

Researchers in Germany and the USA have produced the first theoretical demonstration that the magnetic state of an atomically thin material, α-RuCl3, can be controlled solely by placing it into an optical cavity. Crucially, the cavity vacuum fluctuations alone are sufficient to change the material’s magnetic order from a zigzag antiferromagnet into a ferromagnet.

A recent theme in material physics research has been the use of intense laser light to modify the properties of magnetic materials. By carefully engineering the laser light’s properties, researchers have been able to drastically modify the electrical conductivity and optical properties of different materials. However, this requires continuous stimulation by high-intensity lasers and is associated with some practical problems, mainly that it is difficult to stop the material from heating up. Researchers are therefore looking for ways to gain similar control over materials using light, but without employing intense lasers.

Now theoreticians at the Max Planck Institute for the Structure and Dynamics of Matter (MPSD) in Hamburg, Germany, Stanford University and the University of Pennsylvania (both in the USA) have come up with a fundamentally different approach to change a real material’s magnetic properties in a cavity — without the use of any laser light. Their collaboration shows that the cavity alone is enough to turn the zigzag antiferromagnet α-RuCl3 into a ferromagnet.

Magnetic phases of the photo ground state.

Crucially, the team demonstrates that even in an apparently dark cavity, α-RuCl3 senses modifications of the electromagnetic environment and changes its magnetic state accordingly. This is a purely quantum mechanical effect, arising from the fact that within quantum theory the empty cavity (technically called the vacuum state) is never really empty. Instead, the light field fluctuates so that light particles pop in and out of existence, which, in turn, affects the properties of the material.

“The optical cavity confines the electromagnetic field to a very small volume, thereby enhancing the effective coupling between the light and the material,” explains lead author Emil Viñas Boström, a postdoctoral researcher in the MPSD Theory Group.
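
This statement can be made quantitative with the standard expression for the root-mean-square vacuum electric field of a single cavity mode of frequency ω and effective mode volume V (a textbook formula, added here for context):

```latex
E_{\mathrm{vac}} = \sqrt{\frac{\hbar\,\omega}{2\,\varepsilon_{0}\,V}}
```

Shrinking the mode volume V therefore directly increases the vacuum fluctuations the material experiences, even though no photons are injected.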

“Our results show that carefully engineering the vacuum fluctuations of the cavity electric field can lead to drastic changes in a material’s magnetic properties.” As no light excitation is needed, the approach in principle circumvents the problems associated with continuous laser driving.

This is the first work demonstrating such cavity control over magnetism in a real material, and follows previous investigations into cavity control of ferroelectric and superconducting materials. The researchers hope that designing specific cavities will help them realize new and elusive phases of matter, and to better understand the delicate interplay between light and matter.

Microscopic-scale magnetic recording of brain neuronal electrical activity using a diamond quantum sensor

by Nikolaj Winther Hansen et al in Scientific Reports

A recent study by European scientists shows that highly sensitive sensors based on color centers in a diamond can be used to record electrical activity from neurons in living brain tissue.

Before people encounter symptoms of brain diseases such as dementia, slight changes have usually already occurred in the brain tissue. It may be that parts of the brain are swelling up or clumps of proteins are forming. These small changes might influence how nerve cells in the brain signal each other and communicate, and how information is processed and memorized.

Medical scientists want to study these minor changes that occur in the very early stages of a disease. That way, the intention is to learn more about the causes of the disease to provide new insights and more efficient treatments. Today, microscopic studies on the brain are performed with one of two strategies: optical inspection of brain tissue samples from animals or deceased patients who suffered from the disease being studied, or measurements of the signals from the nerve cells using wires, coloring, or light. These methods, however, have some limitations: They may damage the tissue or change the signals. Also, they may work differently depending on what tissue you are studying; signals from some parts of the nerve cells involved in a particular disease may be hard to measure.

Schematic of the sensor operation (not to scale), where green laser light directed to subsurface colour centres (NV) in the diamond enables recording of magnetic field arising from compound action potentials (cAP) in a brain tissue slice placed above the diamond.

Scientists from DTU, the University of Copenhagen, Copenhagen University Hospital, Université Sorbonne, and Leipzig University have found a way to measure the signals from brain tissue without touching or inserting needle probes into it. They do so by measuring weak magnetic fields produced by the nerve cells when communicating. In doing so, they made use of the fact that the magnetic field travels through the tissue unchanged.

“Overall, the idea is that sensing the magnetic field ultimately is non-invasive. You do not need to insert electrodes or probes or stain the tissue you want to analyze. Because one picks up the induced magnetic field, one obtains information about the activity without inserting a physical sensor into the system or otherwise modifying it,” says Alexander Huck, Associate Professor at DTU Physics who was supervising the project and is co-author of the study.

It is nothing fundamentally new to measure magnetic fields induced in the human body, but it usually requires special equipment that is bulky and needs cryogenic cooling. As such, traditional methods are not suitable for measuring small, living tissue samples, let alone tissue from the human brain.

In this project, the team of scientists is taking advantage of tiny, deliberate flaws in synthetic diamond crystals. These flaws are called color centers or, technically, nitrogen-vacancy (NV) centers. The term derives from the fact that in the diamond, one carbon atom is replaced with a nitrogen atom that sits next to a vacancy, i.e., a site where no atom is present. This structure allows the centers to absorb light and, upon releasing energy, emit light.

“These NV color centers also have an effective unpaired electron with a spin, and if there is a magnetic field, the spin of the electron oscillates around that field. So, if the magnetic field increases or decreases, it will oscillate a bit faster or a bit slower, and we can measure these changes via the light emission of the NV color centers,” explains Huck.
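
In practice the field is read out from the shift of the NV spin-resonance frequencies. A minimal conversion sketch (standard NV ground-state constants; the example numbers are ours, not the study’s calibration):

```python
# Convert an NV spin-resonance splitting into an axial magnetic field.
D_GS = 2.870e9     # NV zero-field splitting (Hz)
GAMMA_E = 28.0e9   # electron gyromagnetic ratio (Hz per tesla)

def field_from_resonances(f_plus, f_minus):
    """Axial magnetic field from the two ODMR resonance frequencies.

    The ms = +1 and ms = -1 levels shift symmetrically around D_GS,
    so the difference of the two resonances is 2 * GAMMA_E * B.
    """
    return (f_plus - f_minus) / (2 * GAMMA_E)

# Example: a 56 kHz total splitting corresponds to about 1 microtesla.
print(field_from_resonances(D_GS + 28e3, D_GS - 28e3))  # ~1e-06 (tesla)
```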

The experimental setup is as follows: In a centimeter-scale chamber, a slice of brain tissue is placed on insulating layers of aluminum foil. The diamond is set in a hole at the bottom of the chamber, below the insulating layers. A green laser and a microwave antenna are then aimed at the color center of the diamond, and the light emission from the diamond is recorded. When the scientists stimulate the neurons in the tissue to fire simultaneously, they can measure changes in the brightness of light emission from the color centers.

Crucially, the laser light and the microwaves never reach the brain tissue — not an actual human brain in this case, but tissue from the brain of a mouse. The changes in the magnetic field are simply tracked using the NV color centers.

“When the neurons in the brain tissue sample fire, that will induce a magnetic field that then changes the light emission and the brightness of the diamond, which we record as an optical signal,” says Huck.

In their experiments, the scientists can distinguish signals from different types of nerve cells. They checked their measurements using a proven technique that touched the tissue and measured the electricity directly. They also show how they can artificially change the neuron activity in the tissue by using a drug that blocks specific channels in the nerve cells.

“Eventually, the idea is that when you have a patient, where you suspect some kind of neurodegenerative disease, you may use methods derived from our experiments to diagnose the precise condition,” concludes Huck. He stresses, however, that a lot of work is still needed for that to be the case:

“If we compare our technique to other methods in use today, which have been around for decades, they are still better than what we can do now. We are at an early stage, and much more work has to be done before this technique can be transferred and applied in a clinical environment. Research in NV centers, and the exploration of their most suitable application areas, is still a nascent field.”

Logical quantum processor based on reconfigurable atom arrays

by Dolev Bluvstein et al in Nature

A team of physicists, computer scientists and information machine specialists at Harvard University, working with colleagues from QuEra Computing Inc., the University of Maryland and MIT, has created a quantum computer with the largest-ever number of logical quantum bits. In their paper, the group describes how they built their computer and how well it performed when tested.

In the recent past, several big names in quantum computing have built quantum computers with more than 1,000 qubits — giving such computers more computing power than ever before. Unfortunately, all of them suffer from the massive amounts of error correction they require, a problem keeping such computers from going mainstream.

The makers of such systems are working on a way to reduce the problem, but thus far, a real solution has not been found. Other players have moved into the quantum computer research world using a different approach based on logical qubits rather than hardware-based qubits.

Surface code preparation and decoding data: surface code stabilizers for the two independent d = 7 codes following state preparation.

Logical qubits are groupings of qubits connected via quantum entanglement. Instead of relying on redundant copies of information as an error-correcting protocol, logical qubit–based machines rely on the built-in redundancy of entanglement. For this new study, the research team built a quantum computer with 48 logical qubits, the most yet by any team.
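
The redundancy idea can be illustrated with the simplest classical analogue, a repetition code. This is a deliberately stripped-down stand-in (our sketch): real logical qubits, such as the distance-7 surface codes in the paper, use entangled quantum codes, but the error-suppression intuition is similar:

```python
import random
from collections import Counter

def encode(bit, n=7):
    """Store one logical bit redundantly in n physical bits."""
    return [bit] * n

def flip_noise(bits, p=0.1):
    """Flip each physical bit independently with probability p."""
    return [b ^ (random.random() < p) for b in bits]

def decode(bits):
    """Majority vote recovers the logical bit unless more than half flipped."""
    return Counter(bits).most_common(1)[0][0]

random.seed(1)
trials = 10_000
failures = sum(decode(flip_noise(encode(0))) != 0 for _ in range(trials))
# With p = 0.1 per physical bit, the logical error rate is far below 0.1.
print(f"logical error rate: {failures / trials:.4f}")
```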

The new computer was built by separating thousands of rubidium atoms in a vacuum chamber. The team then used lasers and magnets to chill the atoms to near absolute zero. They used other lasers to create qubits from 280 of the atoms and then entangle them, creating 48 logical qubits at one time. The logical qubits were made to interact using optical tweezers, avoiding the need for wires.

Preliminary testing of the machine showed that while executing calculations, their quantum computer had fewer errors than other larger machines based on physical qubits. The researchers suggest their machine represents yet another step toward the ultimate goal of creating a general-use quantum computer that can perform calculations and combinatorics that are not yet feasible using current computer technology.

Single-molecule electron spin resonance by means of atomic force microscopy

by Lisanne Sellies, Raffael Spachtholz, Sonja Bleher, Jakob Eckrich, Philipp Scheuerer, Jascha Repp in Nature

Physicists at the University of Regensburg have found a way to manipulate the quantum state of individual electrons using a microscope with atomic resolution.

We, and everything around us, consist of molecules. The molecules are so tiny that even a speck of dust contains countless numbers of them. It is now routinely possible to precisely image such molecules with an atomic force microscope, which works quite differently from an optical microscope: it is based on sensing tiny forces between a tip and the molecule under study.

Using this type of microscope, one can even image the internal structure of a molecule. Although one can watch the molecule this way, this does not imply knowing all its different properties. For instance, it is already very hard to determine which kind of atoms the molecule consists of.

Luckily, there are other tools around that can determine the composition of molecules. One of them is electron spin resonance, which is based on similar principles to an MRI scanner in medicine. In electron spin resonance, one usually needs, however, countless molecules to obtain a signal that is large enough to be detectable. With this approach, one cannot access the properties of every molecule, but only their average.

Set-up, triplet decay under resonant driving and ESR-AFM spectra.

Researchers at the University of Regensburg, led by Prof. Dr. Jascha Repp from the Institute of Experimental and Applied Physics at the UR, have now integrated electron spin resonance into atomic force microscopy. Importantly, the electron spin resonance is detected directly with the microscope’s tip, such that the signal comes from one individual molecule only. This way, they can characterize single molecules one by one, and determine which atoms the molecule they have just imaged is composed of.

“We could even discriminate molecules that do not differ in the type of atoms that they were composed of, but only in their isotopes, namely, in the composition of the atoms’ nuclei,” adds Lisanne Sellies, the first author of this study.

“Yet, we are even more intrigued by another possibility that electron spin resonance entails. This technique can be used to operate the spin-quantum state of the electrons present in the molecule,” says Prof. Dr. Repp.

Quantum computers store and process information that is encoded in a quantum state. To perform a calculation, quantum computers are required to manipulate a quantum state without losing the information by so-called decoherence. The researchers in Regensburg showed that with their new technique, they could operate the quantum state of the spin in a single molecule many times before the state decohered.

Since the microscopy technique can image the individual neighborhood of each molecule, the newly developed technique could help understand how decoherence in a quantum computer depends on the atomic-scale environment and — eventually — how to avoid it.

Material characteristics governing in-plane phonon-polariton thermal conductance

by Jacob Minyard et al in Journal of Applied Physics

On the highway of heat transfer, thermal energy is moved by way of quantum particles called phonons. But at the nanoscale of today’s most cutting-edge semiconductors, those phonons don’t remove enough heat. That’s why Purdue University researchers are focused on opening a new nanoscale lane on the heat transfer highway by using hybrid quasiparticles called “polaritons.”

Thomas Beechem loves heat transfer. He talks about it loud and proud, like a preacher at a big tent revival. “We have several ways of describing energy,” said Beechem, associate professor of mechanical engineering. “When we talk about light, we describe it in terms of particles called ‘photons.’ Heat also carries energy in predictable ways, and we describe those waves of energy as ‘phonons.’ But sometimes, depending on the material, photons and phonons will come together and make something new called a ‘polariton.’ It carries energy in its own way, distinct from both photons and phonons.”

Like photons and phonons, polaritons aren’t physical particles you can see or capture. They are more like ways of describing energy exchange as if they were particles. Still fuzzy? How about another analogy. “Phonons are like internal combustion vehicles, and photons are like electric vehicles,” Beechem said. “Polaritons are a Toyota Prius. They are a hybrid of light and heat, and retain some of the properties of both. But they are their own special thing.”
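
The hybridization has a standard mathematical form: when a photon mode of energy E_c and a phonon mode of energy E_p couple with strength g, the two resulting polariton branches are (textbook coupled-oscillator result, added for context):

```latex
E_{\pm} = \frac{E_c + E_p}{2} \pm \sqrt{\left(\frac{E_c - E_p}{2}\right)^{2} + g^{2}}
```

Neither branch is purely light nor purely lattice vibration; that mixture is the “Prius” in Beechem’s analogy.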

Polaritons have been used in optical applications — everything from stained glass to home health tests. But their ability to move heat has largely been ignored, because their impact becomes significant only when the size of materials becomes very small. “We know that phonons do a majority of the work of transferring heat,” said Jacob Minyard, a Ph.D. student in Beechem’s lab.

“The effect of polaritons is only observable at the nanoscale. But we’ve never needed to address heat transfer at that level until now, because of semiconductors.” “Semiconductors have become so incredibly small and complex,” he continued. “People who design and build these chips are discovering that phonons don’t efficiently disperse heat at these very small scales. Our paper demonstrates that at those length scales, polaritons can contribute a larger share of thermal conductivity.”

“We in the heat transfer community have been very material-specific in describing the effect of polaritons,” said Beechem. “Someone will observe it in this material or at that interface. It’s all very disparate. Jacob’s paper has established that this isn’t some random thing. Polaritons begin to dominate the heat transfer on any surface thinner than 10 nanometers. That’s twice as big as the transistors on an iPhone 15.”

Now Beechem gets really fired up. “We’ve basically opened up a whole extra lane on the highway. And the smaller the scales get, the more important this extra lane becomes. As semiconductors continue to shrink, we need to think about designing the traffic flow to take advantage of both lanes: phonons and polaritons.”

Minyard’s paper just scratches the surface of how this can happen practically. The complexity of semiconductors means that there are many opportunities to capitalize upon polariton-friendly designs.

“There are many materials involved in chipmaking, from the silicon itself to the dielectrics and metals,” Minyard said. “The way forward for our research is to understand how these materials can be used to conduct heat more efficiently, recognizing that polaritons provide a whole new lane to move energy.”

Recognizing this, Beechem and Minyard want to show chip manufacturers how to incorporate these polariton-based nanoscale heat transfer principles right into the physical design of the chip — from the physical materials involved to the shape and thickness of the layers. While this work is theoretical now, physical experimentation is very much on the horizon — which is why Beechem and Minyard are happy to be at Purdue.

“The heat transfer community here at Purdue is so robust,” Beechem said. “We can literally go upstairs and talk to Xianfan Xu, who had one of the first experimental realizations of this effect. Then, we can walk over to Flex Lab and ask Xiulin Ruan about his pioneering work in phonon scattering. And we have the facilities here at Birck Nanotechnology Center to build nanoscale experiments and use one-of-a-kind measurement tools to confirm our findings. It’s really a researcher’s dream.”

The Wiedemann-Franz law in doped Mott insulators without quasiparticles

by Wen O. Wang, Jixun K. Ding, Yoni Schattner, Edwin W. Huang, Brian Moritz, Thomas P. Devereaux in Science

Long before researchers discovered the electron and its role in generating electrical current, they knew about electricity and were exploring its potential. One thing they learned early on was that metals were great conductors of both electricity and heat.

And in 1853, two scientists showed that those two admirable properties of metals were somehow related: At any given temperature, the ratio of electronic conductivity to thermal conductivity was roughly the same in any metal they tested. This so-called Wiedemann-Franz law has held ever since — except in quantum materials, where electrons stop behaving as individual particles and glom together into a sort of electron soup. Experimental measurements have indicated that the 170-year-old law breaks down in these quantum materials, and by quite a bit.
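
Stated precisely (standard form, added for reference): at temperature T, the ratio of electronic thermal conductivity κ to electrical conductivity σ is fixed by the Lorenz number,

```latex
\frac{\kappa}{\sigma} = L\,T, \qquad
L_{0} = \frac{\pi^{2}}{3}\left(\frac{k_{B}}{e}\right)^{2} \approx 2.44\times 10^{-8}\ \mathrm{W\,\Omega\,K^{-2}}
```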

Now, a theoretical argument put forth by physicists at the Department of Energy’s SLAC National Accelerator Laboratory, Stanford University and the University of Illinois suggests that the law should, in fact, approximately hold for one type of quantum material — the copper oxide superconductors, or cuprates, which conduct electricity with no loss at relatively high temperatures.

In a paper, they propose that the Wiedemann-Franz law should still roughly hold if one considers only the electrons in cuprates. They suggest that other factors, such as vibrations in the material’s atomic latticework, must account for experimental results that make it look like the law does not apply. This surprising result is important to understanding unconventional superconductors and other quantum materials, said Wen Wang, lead author of the paper and a PhD student with the Stanford Institute for Materials and Energy Sciences (SIMES) at SLAC.

Inverse D_Q, κ/T, D, and σ. Parameters: U/t = 8 and t′/t = 0.25.

“The original law was developed for materials where electrons interact with each other weakly and behave like little balls that bounce off defects in the material’s lattice,” Wang said. “We wanted to test the law theoretically in systems where neither of these things was true.”

Superconducting materials, which carry electric current without resistance, were discovered in 1911. But they operated at such extremely low temperatures that their usefulness was quite limited. That changed in 1986, when the first family of so-called high-temperature or unconventional superconductors — the cuprates — was discovered. Although cuprates still require extremely cold conditions to work their magic, their discovery raised hopes that superconductors could someday work at much closer to room temperature — making revolutionary technologies like no-loss power lines possible.

After nearly four decades of research, that goal is still elusive, although a lot of progress has been made in understanding the conditions in which superconducting states flip in and out of existence. Theoretical studies, performed with the help of powerful supercomputers, have been essential for interpreting the results of experiments on these materials and for understanding and predicting phenomena that are out of experimental reach.

Trotter error analysis of L for (A)–(C) U/t = 6, t′/t = 0.25, and (D)–(F) U/t = 10, t′/t = 0. Dashed lines are for dτ = 0.05/t and solid lines are for dτ = 0.025/t.

For this study, the SIMES team ran simulations based on what’s known as the Hubbard model, which has become an essential tool for simulating and describing systems where electrons stop acting independently and join forces to produce unexpected phenomena. The results show that when you only take electron transport into account, the ratio of electronic conductivity to thermal conductivity approaches what the Wiedemann-Franz law predicts, Wang said. “So, the discrepancies that have been seen in experiments should be coming from other things like phonons, or lattice vibrations, that are not in the Hubbard model,” she said.
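
For reference, the single-band Hubbard Hamiltonian underlying these simulations has the standard form (with t and t′ the nearest- and next-nearest-neighbor hopping amplitudes and U the on-site repulsion, matching the U/t and t′/t values quoted in the figure captions):

```latex
H = -t \sum_{\langle i,j \rangle,\sigma} \left( c^{\dagger}_{i\sigma} c_{j\sigma} + \mathrm{h.c.} \right)
    - t' \sum_{\langle\langle i,j \rangle\rangle,\sigma} \left( c^{\dagger}_{i\sigma} c_{j\sigma} + \mathrm{h.c.} \right)
    + U \sum_{i} n_{i\uparrow} n_{i\downarrow}
```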

SIMES staff scientist and paper co-author Brian Moritz said that although the study did not investigate how vibrations cause the discrepancies, “somehow the system still knows that there is this correspondence between charge and heat transport amongst the electrons. That was the most surprising result.” From here, he added, “maybe we can peel the onion to understand a little bit more.”

Discovery of ultrafast spontaneous spin switching in an antiferromagnet by femtosecond noise correlation spectroscopy

by M. A. Weiss, A. Herbst, J. Schlegel, T. Dannegger, M. Evers, A. Donges, M. Nakajima, A. Leitenstorfer, S. T. B. Goennenwein, U. Nowak, T. Kurihara in Nature Communications

Noise on the radio when reception is poor is a typical example of how fluctuations mask a physical signal. In fact, such interference or noise occurs in every physical measurement in addition to the actual signal. “Even in the loneliest place in the universe, where there should be nothing at all, there are still fluctuations of the electromagnetic field,” says physicist Ulrich Nowak. In the Collaborative Research Centre (CRC) 1432 “Fluctuations and Nonlinearities in Classical and Quantum Matter beyond Equilibrium” at the University of Konstanz, researchers do not see this omnipresent noise as a disturbing factor that needs to be eliminated as far as possible, but as a source of information that tells us something about the signal.

This approach has now proved successful when investigating antiferromagnets. Antiferromagnets are magnetic materials in which the magnetizations of several sub-lattices cancel each other out. Nevertheless, antiferromagnetic insulators are considered promising for energy-efficient components in the field of information technology. As they have hardly any magnetic fields on the outside, they are very difficult to characterize physically. Yet, antiferromagnets are surrounded by magnetic fluctuations, which can tell us a lot about these weakly magnetic materials.

In this spirit, the groups of the two materials scientists Ulrich Nowak and Sebastian Gönnenwein analysed the fluctuations of antiferromagnetic materials in the context of the CRC. The decisive factor in their theoretical as well as experimental study was the specific frequency range. “We measure very fast fluctuations and have developed a method with which fluctuations can still be detected on the ultrashort time scale of femtoseconds,” says experimental physicist Sebastian Gönnenwein. A femtosecond is one millionth of a billionth of a second.

Schematic illustration of the experimental setup and spin system.

On slower time scales, one could use electronics that are fast enough to measure these fluctuations. On ultrafast time scales, this no longer works, which is why a new experimental approach had to be developed. It is based on an idea from the research group of Alfred Leitenstorfer, who is also a member of the Collaborative Research Centre. Employing laser technology, the researchers use pulse sequences or pulse pairs in order to obtain information about fluctuations. Initially, this measurement approach was developed to investigate quantum fluctuations, and has now been extended to fluctuations in magnetic systems. Takayuki Kurihara from the University of Tokyo played a key role in this development as the third cooperation partner. He was a member of the Leitenstorfer research group and the Zukunftskolleg at the University of Konstanz from 2018 to 2020.

In the experiment, two ultrashort light pulses are transmitted through the magnet with a time delay, each pulse probing the magnetic properties during its transit time. The light pulses are then checked for similarity using sophisticated electronics. The first pulse serves as a reference; the second contains information about how much the antiferromagnet has changed in the time between the first and second pulse. Different measurement results at the two points in time confirm the fluctuations. Ulrich Nowak’s research group also modelled the experiment in elaborate computer simulations in order to better understand its results.

One unexpected result was the discovery of what is known as telegraph noise on ultrashort time scales. This means that there is not only unsorted noise, but also fluctuations in which the system switches back and forth between two well-defined states. Such fast, purely random switching has never been observed before and could be interesting for applications such as random number generators. In any case, the new methodological possibilities for analyzing fluctuations on ultrashort time scales offer great potential for further discoveries in the field of functional materials.
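
Telegraph noise has a clean statistical signature: the signal hops between two levels, and its two-time correlation decays exponentially with the mean switching time rather than vanishing immediately as white noise would. A small simulation sketch (our illustration, not the group’s analysis code):

```python
import numpy as np

# Simulate random telegraph noise: a two-state signal that switches at
# random times, analogous to the spin switching seen in the experiment.
rng = np.random.default_rng(42)

n, p_switch = 100_000, 0.001          # samples; per-step switching probability
switches = rng.random(n) < p_switch
signal = np.where(np.cumsum(switches) % 2 == 0, 1.0, -1.0)

# The two-time correlation <s(t) s(t + lag)> decays roughly as
# exp(-2 * p_switch * lag), unlike white noise (zero at any lag > 0).
for lag in (0, 100, 500, 2000):
    c = np.mean(signal[: -lag or None] * signal[lag:])
    print(f"lag {lag:5d}: correlation {c:+.3f}")
```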

Subscribe to Paradigm!

Medium, Twitter, Telegram, Telegram Chat, LinkedIn, and Reddit.

Main sources

Research articles

Advanced Quantum Technologies

PRX Quantum

Science Daily

SciTechDaily

Quantum News

Nature
