QT/ Future sparkles for diamond-based quantum technology

Published in Paradigm · 23 min read · May 21, 2021

Quantum news biweekly vol.4, 7th May — 21st May

TL;DR

  • Two research breakthroughs are poised to accelerate the development of synthetic diamond-based quantum technology, improve scalability, and dramatically reduce manufacturing costs.
  • New experiments provide evidence for a decades-old theory that, in the quantum regime, an electron behaves as if it is made of two particles: one particle that carries its negative charge and the other that gives it a magnet-like property called spin. The team detected evidence for this theory in materials called quantum spin liquids.
  • Researchers have designed a remarkably fast engine that taps into a new kind of fuel — information. This engine converts the random jiggling of a microscopic particle into stored energy. It could lead to significant advances in the speed and cost of computers and bio-nanotechnologies.
  • A novel technique for studying vortices in quantum fluids has been developed by physicists. Turbulence in quantum systems, for example in superfluid helium 4, takes place on microscopic scales, and until now scientists have lacked tools with sufficient precision to probe eddies this small. The team, working at temperatures of a few thousandths of a degree above absolute zero, has harnessed nanoscience to enable the detection of single quantum vortices.
  • The University of Kent’s School of Physical Sciences, in collaboration with the Science and Technology Facilities Council (STFC) and the Universities of Cardiff, Durham and Leeds, has developed an algorithm to train computers to analyse signals from subatomic particles embedded in advanced electronic materials.
  • A new theorem shows that information permanently scrambled by a black hole cannot be recovered by any quantum machine learning algorithm, shedding new light on the classic Hayden-Preskill thought experiment.
  • The promise of a quantum internet depends on the complexities of harnessing light to transmit quantum information over fiber optic networks. A potential step forward was reported today by researchers in Sweden who developed integrated chips that can generate light particles on demand and without the need for extreme refrigeration.
  • Like conductors of a spooky symphony, researchers at the National Institute of Standards and Technology (NIST) have “entangled” two small mechanical drums and precisely measured their linked quantum properties. Entangled pairs like this might someday perform computations and transmit data in large-scale quantum networks.
  • And more!

Quantum Computing Market

According to the recent market research report ‘Quantum Computing Market with COVID-19 impact by Offering (Systems and Services), Deployment (On Premises and Cloud Based), Application, Technology, End-use Industry and Region — Global Forecast to 2026’, published by MarketsandMarkets, the Quantum Computing market is expected to grow from USD 472 million in 2021 to USD 1,765 million by 2026, at a CAGR of 30.2%. The early adoption of quantum computing in the banking and finance sector is expected to fuel the growth of the market globally. Other key factors contributing to the growth of the quantum computing market include rising investments by governments of different countries to carry out research and development activities related to quantum computing technology. Several companies are focusing on the adoption of QCaaS post-COVID-19. This, in turn, is expected to contribute to the growth of the quantum computing market. However, stability and error correction issues are expected to restrain the growth of the market.
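The headline figures are internally consistent: growing from USD 472 million to USD 1,765 million over the five years from 2021 to 2026 does imply the stated 30.2% CAGR. A quick check (the figures are the report's; the formula is the standard CAGR definition):

```python
# Verify the reported growth figures against the standard CAGR formula:
# CAGR = (final / initial) ** (1 / years) - 1
start, end, years = 472.0, 1765.0, 5  # USD millions, 2021 -> 2026
cagr = (end / start) ** (1 / years) - 1
print(f"implied CAGR: {cagr:.1%}")  # matches the reported 30.2%
```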

According to the ‘Quantum Computing Market Research Report: By Offering, Deployment Type, Application, Technology, Industry — Industry Share, Growth, Drivers, Trends and Demand Forecast to 2030’ report, the quantum computing market is projected to reach $64,988 million by 2030. Machine learning (ML) is expected to progress at the highest CAGR during the forecast period among all application categories, owing to the growing integration of quantum computing into ML applications.

By 2030, Europe and North America are expected to account for more than 78.0% of the quantum computing market, as Canada, the U.S., the U.K., Germany, and Russia are witnessing heavy investments in the field.

Quantum Physics

Latest Research

Nanofabrication of high Q, transferable diamond resonators

by Blake Regan, Aleksandra Trycz, Johannes E. Fröch, Otto Cranwell Schaeper, Sejeong Kim, Igor Aharonovich in Nanoscale

Marilyn Monroe famously sang that diamonds are a girl’s best friend, but they are also very popular with quantum scientists — with two new research breakthroughs poised to accelerate the development of synthetic diamond-based quantum technology, improve scalability, and dramatically reduce manufacturing costs.

While silicon is traditionally used for computer and mobile phone hardware, diamond has unique properties that make it particularly useful as a base for emerging quantum technologies such as quantum supercomputers, secure communications and sensors.

However, there are two key problems: cost, and the difficulty of fabricating the single crystal diamond layer, which is thinner than one millionth of a metre.

A research team from the ARC Centre of Excellence for Transformative Meta-Optics at the University of Technology Sydney (UTS), led by Professor Igor Aharonovich, has just published two research papers, in Nanoscale and Advanced Quantum Technologies, that address these challenges.

“For diamond to be used in quantum applications, we need to precisely engineer ‘optical defects’ in the diamond devices — cavities and waveguides — to control, manipulate and readout information in the form of qubits — the quantum version of classical computer bits,” said Professor Aharonovich.

“It’s akin to cutting holes or carving gullies in a super thin sheet of diamond, to ensure light travels and bounces in the desired direction,” he said.

To overcome the “etching” challenge, the researchers developed a new hard masking method, which uses a thin metallic tungsten layer to pattern the diamond nanostructure, enabling the creation of one-dimensional photonic crystal cavities.

“The use of tungsten as a hard mask addresses several drawbacks of diamond fabrication. It acts as a uniform restraining conductive layer to improve the viability of electron beam lithography at nanoscale resolution,” said lead author of the paper in Nanoscale, UTS PhD candidate Blake Regan.

“It also allows the post-fabrication transfer of diamond devices onto the substrate of choice under ambient conditions. And the process can be further automated, to create modular components for diamond-based quantum photonic circuitry,” he said.

The tungsten layer is only 30 nm thick, around 10,000 times thinner than a human hair, yet it enabled a diamond etch of over 300 nm, a record selectivity for diamond processing.

A further advantage is that removal of the tungsten mask does not require the use of hydrofluoric acid — one of the most dangerous acids currently in use — so this also significantly improves the safety and accessibility of the diamond nanofabrication process.

To address the issue of cost, and improve scalability, the team further developed an innovative step to grow single crystal diamond photonic structures with embedded quantum defects from a polycrystalline substrate.

“Our process relies on lower-cost large polycrystalline diamond, which is available as large wafers, unlike the traditionally used high-quality single crystal diamond, which is limited to a few mm²,” said UTS PhD candidate Milad Nonahal, lead author of the study in Advanced Quantum Technologies.

“To the best of our knowledge, we offer the first evidence of the growth of a single crystal diamond structure from a polycrystalline material using a bottom up approach — like growing flowers from seed,” he added.

“Our method eliminates the need for expensive diamond materials and the use of ion implantation, which is key to accelerating the commercialisation of diamond quantum hardware,” said UTS Dr Mehran Kianinia, a senior author on the second study.

Oscillations of the thermal conductivity in the spin-liquid state of α-RuCl3

by Peter Czajka, Tong Gao, Max Hirschberger, Paula Lampen-Kelley, Arnab Banerjee, Jiaqiang Yan, David G. Mandrus, Stephen E. Nagler, N. P. Ong in Nature Physics

A new discovery led by Princeton University could upend our understanding of how electrons behave under extreme conditions in quantum materials. The finding provides experimental evidence that this familiar building block of matter behaves as if it is made of two particles: one particle that gives the electron its negative charge and another that supplies its magnet-like property, known as spin.

“We think this is the first hard evidence of spin-charge separation,” said Nai Phuan Ong, Princeton’s Eugene Higgins Professor of Physics and senior author on the paper.

The experimental results fulfill a prediction made decades ago to explain one of the most mind-bending states of matter, the quantum spin liquid. In all materials, the spin of an electron can point either up or down. In the familiar magnet, all of the spins uniformly point in one direction throughout the sample when the temperature drops below a critical temperature.

However, in spin liquid materials, the spins are unable to establish a uniform pattern even when cooled very close to absolute zero. Instead, the spins are constantly changing in a tightly coordinated, entangled choreography. The result is one of the most entangled quantum states ever conceived, a state of great interest to researchers in the growing field of quantum computing.

To describe this behavior mathematically, Nobel prize-winning Princeton physicist Philip Anderson (1923–2020), who first predicted the existence of spin liquids in 1973, proposed an explanation: in the quantum regime an electron may be regarded as composed of two particles, one bearing the electron’s negative charge and the other containing its spin. Anderson called the spin-containing particle a spinon.

In this new study, the team searched for signs of the spinon in a spin liquid composed of ruthenium and chlorine atoms. At temperatures a fraction of a Kelvin above absolute zero (or roughly -452 degrees Fahrenheit) and in the presence of a high magnetic field, ruthenium chloride crystals enter the spin liquid state.

Graduate student Peter Czajka and Tong Gao, Ph.D. 2020, connected three highly sensitive thermometers to the crystal sitting in a bath maintained at temperatures close to absolute zero. They then applied the magnetic field and a small amount of heat to one crystal edge to measure its thermal conductivity, a quantity that expresses how well it conducts a heat current. If spinons were present, they should appear as an oscillating pattern in a graph of the thermal conductivity versus magnetic field.

The oscillating signal they were searching for was tiny — just a few hundredths of a degree change — so the measurements demanded an extraordinarily precise control of the sample temperature as well as careful calibrations of the thermometers in the strong magnetic field.

The team used the purest crystals available, ones grown at the U.S. Department of Energy’s Oak Ridge National Laboratory (ORNL) under the leadership of David Mandrus, materials science professor at the University of Tennessee-Knoxville, and Stephen Nagler, corporate research fellow in ORNL’s Neutron Scattering Division. The ORNL team has extensively studied the quantum spin liquid properties of ruthenium chloride.

In a series of experiments conducted over nearly three years, Czajka and Gao detected temperature oscillations consistent with spinons at increasingly higher resolution, providing evidence that the electron is composed of two particles, as Anderson predicted.

“People have been searching for this signature for four decades,” Ong said. “If this finding and the spinon interpretation are validated, it would significantly advance the field of quantum spin liquids.”

Czajka and Gao spent last summer confirming the experiments while under COVID restrictions that required them to wear masks and maintain social distancing.

“From the purely experimental side,” Czajka said, “it was exciting to see results that in effect break the rules that you learn in elementary physics classes.”

Maximizing power and velocity of an information engine

by Tushar K. Saha, Joseph N. E. Lucero, Jannik Ehrich, David A. Sivak, John Bechhoefer in Proceedings of the National Academy of Sciences

Simon Fraser University researchers have designed a remarkably fast engine that taps into a new kind of fuel — information.

The development of this engine, which converts the random jiggling of a microscopic particle into stored energy, is outlined in research published in Proceedings of the National Academy of Sciences and could lead to significant advances in the speed and cost of computers and bio-nanotechnologies.

SFU physics professor and senior author John Bechhoefer says researchers’ understanding of how to rapidly and efficiently convert information into “work” may inform the design and creation of real-world information engines.

“We wanted to find out how fast an information engine can go and how much energy it can extract, so we made one,” says Bechhoefer, whose experimental group collaborated with theorists led by SFU physics professor David Sivak.

Engines of this type were first proposed over 150 years ago but actually making them has only recently become possible.

“By systematically studying this engine, and choosing the right system characteristics, we have pushed its capabilities over ten times farther than other similar implementations, thus making it the current best-in-class,” says Sivak.

The information engine designed by SFU researchers consists of a microscopic particle immersed in water and attached to a spring which, itself, is fixed to a movable stage. Researchers then observe the particle bouncing up and down due to thermal motion.

“When we see an upward bounce, we move the stage up in response,” explains lead author and PhD student Tushar Saha. “When we see a downward bounce, we wait. This ends up lifting the entire system using only information about the particle’s position.”

Repeating this procedure, they raise the particle “a great height, and thus store a significant amount of gravitational energy,” without having to directly pull on the particle.
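The measure-and-ratchet loop described above can be sketched in a few lines of toy code. This is an illustrative sketch only, not the authors' model: an overdamped particle jiggles thermally in a harmonic trap, and whenever a measurement finds it above the trap centre, the centre (the "stage") is ratcheted up to meet it. All parameter values here are arbitrary.

```python
import random

# Toy sketch of the information-engine feedback loop (illustrative only;
# units and parameters are arbitrary, not the experiment's values).
random.seed(1)
k, kT, dt = 1.0, 1.0, 0.01   # trap stiffness, thermal energy, time step
stage = 0.0                  # trap/stage centre ("height" gained so far)
x = 0.0                      # particle position

for _ in range(100_000):
    # Overdamped Langevin step: spring force toward the stage + thermal kick
    x += -k * (x - stage) * dt + random.gauss(0.0, (2 * kT * dt) ** 0.5)
    if x > stage:
        stage = x            # upward bounce observed: ratchet the stage up
    # downward bounce: do nothing and wait

print(f"height gained by the stage: {stage:.1f}")
```

Because the stage only ever moves upward, purely thermal fluctuations are steadily converted into height, without the feedback ever pulling directly on the particle.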

Saha further explains that, “in the lab, we implement this engine with an instrument known as an optical trap, which uses a laser to create a force on the particle that mimics that of the spring and stage.”

Joseph Lucero, a Master of Science student, adds: “In our theoretical analysis, we find an interesting trade-off between the particle mass and the average time for the particle to bounce up. While heavier particles can store more gravitational energy, they generally also take longer to move up.”

“Guided by this insight, we picked the particle mass and other engine properties to maximize how fast the engine extracts energy, outperforming previous designs and achieving power comparable to molecular machinery in living cells, and speeds comparable to fast-swimming bacteria,” says postdoctoral fellow Jannik Ehrich.

Nanoscale real-time detection of quantum vortices at millikelvin temperatures

by A. Guthrie, S. Kafanov, M. T. Noble, Yu. A. Pashkin, G. R. Pickett, V. Tsepelin, A. A. Dorofeev, V. A. Krupenin, D. E. Presnov in Nature Communications

A novel technique for studying vortices in quantum fluids has been developed by Lancaster physicists.

Andrew Guthrie, Sergey Kafanov, Theo Noble, Yuri Pashkin, George Pickett and Viktor Tsepelin, in collaboration with scientists from Moscow State University, used tiny mechanical resonators to detect individual quantum vortices in superfluid helium.

This research concerns quantum turbulence, which is simpler than the turbulence of the everyday world, observed in phenomena such as surf, fast-flowing rivers, billowing storm clouds and chimney smoke. Despite being so commonplace, found at every level from the galactic to the subatomic, turbulence is still not fully understood.

Physicists know the fundamental Navier-Stokes equations, which govern the flow of fluids such as air and water, but despite centuries of trying, these equations still cannot be solved in general.

Quantum turbulence may provide the clues to an answer.

Turbulence in quantum fluids is much simpler than its “messy” classical counterpart, and being made up of identical singly-quantised vortices, can be thought of as providing an “atomic theory” of the phenomenon.

Unhelpfully, turbulence in quantum systems, for example in superfluid helium 4, takes place on microscopic scales, and so far scientists have not had tools with sufficient precision to probe eddies this small.

But now the Lancaster team, working at temperatures of a few thousandths of a degree above absolute zero, has harnessed nanoscience to allow the detection of single quantum vortices (with core sizes on a par with atomic diameters) by using a nanoscale “guitar string” in the superfluid.

The team traps a single vortex along the length of the “string” (a bar around 100 nanometres across). The resonant frequency of the bar changes when a vortex is trapped, so the capture and release of vortices can be followed, opening a window into the turbulent structure.
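As a rough illustration of this detection principle (not the Lancaster group's actual analysis), one can model the beam's resonant frequency as jumping between a "free" and a "trapped" value and count threshold crossings in the noisy time trace. All frequencies and rates below are invented:

```python
import random

# Toy model of vortex detection via a resonance-frequency time trace
# (illustrative only; all numbers are invented, not measured values).
random.seed(3)
f_free, f_trapped, noise = 1_000_000.0, 999_950.0, 5.0  # Hz
trapped = False
trace = []
for _ in range(2000):
    if random.random() < 0.01:          # rare capture or release event
        trapped = not trapped
    centre = f_trapped if trapped else f_free
    trace.append(random.gauss(centre, noise))

# Classify each sample by a mid-gap threshold, then count state changes
threshold = (f_free + f_trapped) / 2
states = [f < threshold for f in trace]
events = sum(a != b for a, b in zip(states, states[1:]))
print("capture/release events detected:", events)
```

Because the frequency shift (50 Hz here) is much larger than the measurement noise (5 Hz), simple thresholding recovers essentially every capture and release event.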

Dr Sergey Kafanov who initiated this research said: “The devices developed have many other uses, one of which is to ping the end of a partially trapped vortex to study the nanoscale oscillations of the vortex core. Hopefully the studies will add to our insight into turbulence and may provide clues on how to solve these stubborn equations.”

Schematic of the experimental setup. A tuning fork generates quantum turbulence, whilst a 70-μm-long nanomechanical beam, suspended 1 μm above the substrate, acts as the detector. The beam and fork are driven by vector network analysers or signal generators through several stages of attenuation at various temperatures. The beam and fork signals are amplified at room temperature by an 80-dB amplifier and an I/V converter.

Machine learning approach to muon spectroscopy analysis

by T Tula, G Möller, J Quintanilla, S R Giblin, A D Hillier, E E McCabe, S Ramos, D S Barker, S Gibson in Journal of Physics: Condensed Matter

The University of Kent’s School of Physical Sciences, in collaboration with the Science and Technology Facilities Council (STFC) and the Universities of Cardiff, Durham and Leeds, has developed an algorithm to train computers to analyse signals from subatomic particles embedded in advanced electronic materials.

The particles, called muons, are produced in large particle accelerators and are implanted inside samples of materials in order to investigate their magnetic properties. Muons are uniquely useful as they couple magnetically to individual atoms inside the material and then emit a signal detectable by researchers to obtain information on that magnetism.

This ability to examine magnetism on the atomic scale makes muon-based measurements one of the most powerful probes of magnetism in electronic materials, including “quantum materials” such as superconductors and other exotic forms of matter.

As it is not possible to deduce what is going on in the material by simple examination of the signal, researchers normally compare their data to generic models. In contrast, the present team adapted a data-science technique called Principal Component Analysis (PCA), frequently employed in face recognition.

The PCA technique involves feeding a computer many related but distinct images and then running an algorithm that identifies a small number of “archetypal” images which can be combined to reproduce, with great accuracy, any of the original images. An algorithm trained in this way can then go on to perform tasks such as recognising whether a new image matches a previously seen one.
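The idea translates directly into code. Below is a minimal sketch of PCA-based reconstruction on synthetic one-dimensional "signals" (standing in for the images or muon spectra); this is not the Kent group's code, and the archetype shapes are invented:

```python
import numpy as np

# Minimal PCA sketch: build 100 noisy signals as mixtures of two
# invented archetypes, recover two principal components, and check
# that they reconstruct every signal accurately.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 200)
archetypes = np.vstack([np.exp(-5 * t), np.cos(8 * t)])
weights = rng.random((100, 2))            # 100 distinct mixing weights
signals = weights @ archetypes + 0.01 * rng.standard_normal((100, 200))

# PCA via SVD of the mean-centred data
mean = signals.mean(axis=0)
U, s, Vt = np.linalg.svd(signals - mean, full_matrices=False)
components = Vt[:2]                       # the two "archetypal" shapes

# Reconstruct every signal from just those two components
coeffs = (signals - mean) @ components.T
recon = coeffs @ components + mean
err = np.abs(recon - signals).max()
print(f"max reconstruction error: {err:.3f}")  # small vs signal scale ~1
```

Two components suffice here because the data really were generated from two archetypes; the residual is just the added noise.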

Researchers adapted the PCA technique to analyse the signals sent out by muons embedded in complex materials, training the algorithm for a variety of quantum materials using experimental data obtained at the ISIS Neutron and Muon source of the STFC Rutherford Appleton Laboratory.

The results showed the new technique is as proficient as the standard method at detecting phase transitions, and in some cases could detect transitions beyond the capabilities of standard analyses.

Dr Jorge Quintanilla, Senior Lecturer in Condensed Matter Theory at Kent and leader of the Physics of Quantum Materials research group, said: “Our research results are exceptional, as this was achieved by an algorithm that knew nothing about the physics of the materials being investigated. This suggests that the new approach might have very broad application and, as such, we have made our algorithms available for use by the worldwide research community.”

Barren Plateaus Preclude Learning Scramblers

by Zoë Holmes, Andrew Arrasmith, Bin Yan, Patrick J. Coles, Andreas Albrecht, Andrew T. Sornborger in Physical Review Letters

A new theorem from the field of quantum machine learning has poked a major hole in the accepted understanding about information scrambling.

“Our theorem implies that we are not going to be able to use quantum machine learning to learn typical random or chaotic processes, such as black holes. In this sense, it places a fundamental limit on the learnability of unknown processes,” said Zoe Holmes, a post-doc at Los Alamos National Laboratory and coauthor of the paper.

“Thankfully, because most physically interesting processes are sufficiently simple or structured so that they do not resemble a random process, the results don’t condemn quantum machine learning, but rather highlight the importance of understanding its limits,” Holmes said.

In the classic Hayden-Preskill thought experiment, a fictitious Alice tosses information such as a book into a black hole that scrambles the text. Her companion, Bob, can still retrieve it using entanglement, a unique feature of quantum physics. However, the new work proves that fundamental constraints on Bob’s ability to learn the particulars of a given black hole’s physics mean that reconstructing the information in the book will be very difficult or even impossible.

“Any information run through an information scrambler such as a black hole will reach a point where the machine learning algorithm stalls out on a barren plateau and thus becomes untrainable. That means the algorithm can’t learn scrambling processes,” said Andrew Sornborger, a computer scientist at Los Alamos and coauthor of the paper. Sornborger is director of the Quantum Science Center at Los Alamos and leader of the Center’s algorithms and simulation thrust. The Center is a multi-institutional collaboration led by Oak Ridge National Laboratory.

Barren plateaus are regions in the mathematical space of optimization algorithms where the ability to solve the problem becomes exponentially harder as the size of the system being studied increases. This phenomenon, which severely limits the trainability of large scale quantum neural networks, was described in a recent paper by a related Los Alamos team.
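A hand-wavy way to see the exponential scaling (an analogy for intuition, not the paper's construction): the trainability signal behaves like the overlap of a random state with a fixed target, and in a 2^n-dimensional space that overlap concentrates around 1/2^n.

```python
import math
import random

# Illustration of exponential concentration (an analogy for barren
# plateaus, not the paper's construction): the squared overlap of a
# random unit vector with a fixed basis state averages 1 / 2**n.
def typical_overlap(n_qubits, samples=200, seed=0):
    rng = random.Random(seed)
    dim = 2 ** n_qubits
    total = 0.0
    for _ in range(samples):
        v = [rng.gauss(0.0, 1.0) for _ in range(dim)]  # random direction
        norm = math.sqrt(sum(c * c for c in v))
        total += (v[0] / norm) ** 2                    # overlap with a fixed state
    return total / samples

for n in (2, 4, 6, 8):
    print(n, typical_overlap(n))  # decays roughly like 1 / 2**n
```

Doubling the number of qubits squares the suppression, which is why gradient-based training "stalls out" long before system sizes of physical interest.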

“Recent work has identified the potential for quantum machine learning to be a formidable tool in our attempts to understand complex systems,” said Andreas Albrecht, a co-author of the research. Albrecht is Director of the Center for Quantum Mathematics and Physics (QMAP) and Distinguished Professor, Department of Physics and Astronomy, at UC Davis. “Our work points out fundamental considerations that limit the capabilities of this tool.”

In the Hayden-Preskill thought experiment, Alice attempts to destroy a secret, encoded in a quantum state, by throwing it into nature’s fastest scrambler, a black hole. Bob and Alice are the fictitious quantum dynamic duo typically used by physicists to represent agents in a thought experiment.

“You might think that this would make Alice’s secret pretty safe,” Holmes said, “but Hayden and Preskill argued that if Bob knows the unitary dynamics implemented by the black hole, and shares a maximally entangled state with the black hole, it is possible to decode Alice’s secret by collecting a few additional photons emitted from the black hole. But this prompts the question: how could Bob learn the dynamics implemented by the black hole? Well, not by using quantum machine learning, according to our findings.”

A key piece of the new theorem developed by Holmes and her coauthors assumes no prior knowledge of the quantum scrambler, a situation unlikely to occur in real-world science.

“Our work draws attention to the tremendous leverage even small amounts of prior information may play in our ability to extract information from complex systems and potentially reduce the power of our theorem,” Albrecht said. “Our ability to do this can vary greatly among different situations (as we scan from theoretical considerations of black holes to concrete situations controlled by humans here on Earth). Future research is likely to turn up interesting examples, both of situations where our theorem remains fully in force, and others where it can be evaded.”

Learning a scrambling unitary. Panel (a) shows the setup of the classic Hayden-Preskill thought experiment where someone attempts to retrieve information (shown as a book) thrown into a black hole (a scrambler). If the scrambling unitary U is known, then information can be retrieved. Panel (b) shows the process of attempting to learn U. This requires a time that is exponential in the number of quantum degrees of freedom (qubits) due to an exponentially vanishing cost gradient, see Letter below. This precludes the information retrieval shown in (a).

Deterministic Integration of hBN Emitter in Silicon Nitride Photonic Waveguide

by Ali W. Elshaari, Anas Skalli, Samuel Gyger, Martin Nurizzo, Lucas Schweickert, Iman Esmaeil Zadeh, Mikael Svedendahl, Stephan Steinhauer, Val Zwiller in Advanced Quantum Technologies

The promise of a quantum internet depends on the complexities of harnessing light to transmit quantum information over fiber optic networks. A potential step forward was reported today by researchers in Sweden who developed integrated chips that can generate light particles on demand and without the need for extreme refrigeration.

Quantum computing today relies on states of matter, that is, on electrons that carry qubits of information to perform multiple calculations simultaneously, in a fraction of the time classical computing takes.

The co-author of the research, Val Zwiller, Professor at KTH Royal Institute of Technology, says that in order to integrate quantum computing seamlessly with fiber-optic networks — which are used by the internet today — a more promising approach would be to harness optical photons.

“The photonic approach offers a natural link between communication and computation,” he says. “That’s important, since the end goal is to transmit the processed quantum information using light.”

But in order for photons to deliver qubits on-demand in quantum systems, they need to be emitted in a deterministic, rather than probabilistic, fashion. This can be accomplished at extremely low temperatures in artificial atoms, but today the research group at KTH reported a way to make it work in optical integrated circuits — at room temperature.

The new method enables photon emitters to be precisely positioned in integrated optical circuits that resemble copper wires for electricity, except that they carry light instead, says co-author of the research, Ali Elshaari, Associate Professor at KTH Royal Institute of Technology.

The researchers harnessed the single-photon-emitting properties of hexagonal boron nitride (hBN), a layered material. hBN is a compound commonly used in ceramics, alloys, resins, plastics and rubbers to give them self-lubricating properties. They integrated the material with silicon nitride waveguides to direct the emitted photons.

Quantum circuits with light are either operated at cryogenic temperatures, just 4 kelvin above absolute zero, using atom-like single-photon sources, or at room temperature using random single-photon sources, Elshaari says. By contrast, the technique developed at KTH enables optical circuits with on-demand emission of light particles at room temperature.

“In existing optical circuits operating at room temperature, you never know when the single photon is generated unless you do a heralding measurement,” Elshaari says. “We realized a deterministic process that precisely positions light-particle emitters operating at room temperature in an integrated photonic circuit.”

The researchers reported coupling of hBN single-photon emitters to silicon nitride waveguides, and they developed a method to image the quantum emitters. Then, in a hybrid approach, the team built the photonic circuits around the quantum sources’ locations using a series of steps involving electron beam lithography and etching, while still preserving the high quality of the quantum light.

The achievement opens a path to hybrid integration, that is, incorporating atom-like single-photon emitters into photonic platforms that cannot emit light efficiently on demand.

a) Artistic representation of a single-photon emitter in an hBN flake deterministically integrated in a SiN photonic waveguide. b) Photoluminescence of hBN emitters under excitation with a 532 nm CW laser. The hBN flakes were dispensed on thermal SiO2 (orange) and SiN (green), then annealed for 30 min at a temperature of 1100 °C. c) Emission spectrum of a single hBN quantum emitter at room temperature and at 200 mK, at different times, excited with a 532 nm CW laser. d) Time trace of the centre emission wavelength for the same emitter in (c), measured at 200 mK. e) Second-order correlation function of the emitter in (c) at room temperature; no filtering was used except for a long-pass filter at 550 nm for laser rejection. The zero-delay value with no filtering is g(2)(0) = 0.3 ± 0.06. f) Second-order correlation measurement for the same emitter at 200 mK; the zero-delay value with no filtering is g(2)(0) = 0.33 ± 0.03.

Direct observation of deterministic macroscopic entanglement

by Shlomi Kotler, Gabriel A. Peterson, Ezad Shojaee, Florent Lecocq, Katarina Cicak, Alex Kwiatkowski, Shawn Geller, Scott Glancy, Emanuel Knill, Raymond W. Simmonds, José Aumentado, John D. Teufel in Science

Like conductors of a spooky symphony, researchers at the National Institute of Standards and Technology (NIST) have “entangled” two small mechanical drums and precisely measured their linked quantum properties. Entangled pairs like this might someday perform computations and transmit data in large-scale quantum networks.

The NIST team used microwave pulses to entice the two tiny aluminum drums into a quantum version of the Lindy Hop, with one partner bopping in a cool and calm pattern while the other was jiggling a bit more. Researchers analyzed radar-like signals to verify that the two drums’ steps formed an entangled pattern — a duet that would be impossible in the everyday classical world.

What’s new is not so much the dance itself but the researchers’ ability to measure the drumbeats, rising and falling by just one-quadrillionth of a meter, and verify their fragile entanglement by detecting subtle statistical relationships between their motions.

“If you analyze the position and momentum data for the two drums independently, they each simply look hot,” NIST physicist John Teufel said. “But looking at them together, we can see that what looks like random motion of one drum is highly correlated with the other, in a way that is only possible through quantum entanglement.”

Quantum mechanics was originally conceived as the rulebook for light and matter at atomic scales. However, in recent years researchers have shown that the same rules can apply to increasingly larger objects such as the drums. Their back-and-forth motion makes them a type of system known as a mechanical oscillator. Such systems were entangled for the first time at NIST about a decade ago, and in that case the mechanical elements were single atoms.

Since then, Teufel’s research group has been demonstrating quantum control of drumlike aluminum membranes suspended above sapphire mats. By quantum standards, the NIST drums are massive, 20 micrometers wide by 14 micrometers long and 100 nanometers thick. They each weigh about 70 picograms, which corresponds to about 1 trillion atoms.

Entangling massive objects is difficult because they interact strongly with the environment, which can destroy delicate quantum states. Teufel’s group developed new methods to control and measure the motion of two drums simultaneously. The researchers adapted a technique first demonstrated in 2011 for cooling a single drum by switching from steady to pulsed microwave signals to separately optimize the steps of cooling, entangling and measuring the states. To rigorously analyze the entanglement, experimentalists also worked more closely with theorists, an increasingly important alliance in the global effort to build quantum networks.

The NIST drum set is connected to an electrical circuit and encased in a cryogenically chilled cavity. When a microwave pulse is applied, the electrical system interacts with and controls the activities of the drums, which can sustain quantum states like entanglement for approximately a millisecond, a long time in the quantum world.

For the experiments, researchers applied two simultaneous microwave pulses to cool the drums, two more simultaneous pulses to entangle the drums, and two final pulses to amplify and record the signals representing the quantum states of the two drums. The states are encoded in a reflected microwave field, similar to radar. Researchers compared the reflections to the original microwave pulse to determine the position and momentum of each drum.
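The comparison to radar can be made concrete: position and momentum quadratures are encoded in the reflected field and recovered by mixing it with in-phase and quadrature copies of the reference, as in I/Q demodulation. A toy sketch of that idea follows; the carrier frequency, sampling rate, and quadrature values here are made up for illustration, and the real processing involves amplification and careful calibration:

```python
import numpy as np

# Hypothetical parameters, chosen so the record spans whole carrier periods.
f_carrier = 6.0e9            # microwave carrier frequency (Hz)
f_sample = 50.0e9            # sampling rate (Hz)
t = np.arange(0, 2e-7, 1.0 / f_sample)

# Toy reflected signal: the drum's position (I) and momentum (Q)
# quadratures modulate the cosine and sine components of the carrier.
x_true, p_true = 0.7, -0.3
signal = (x_true * np.cos(2 * np.pi * f_carrier * t)
          + p_true * np.sin(2 * np.pi * f_carrier * t))

# I/Q demodulation: mix with reference cos/sin, then average (low-pass).
i_quad = 2 * np.mean(signal * np.cos(2 * np.pi * f_carrier * t))
q_quad = 2 * np.mean(signal * np.sin(2 * np.pi * f_carrier * t))

print(i_quad, q_quad)  # recovers x_true and p_true
```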

To cool the drums, researchers applied pulses at a frequency below the cavity’s natural vibrations. As in the 2011 experiment, the drumbeats converted applied photons to the cavity’s higher frequency. These photons leaked out of the cavity as it filled up. Each departing photon took with it one mechanical unit of energy — one phonon, or one quantum — from drum motion. This got rid of most of the heat-related drum motion.
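This kind of sideband cooling drives the mean phonon number down exponentially toward a floor set by residual heating. A minimal numeric sketch, with illustrative rates and occupancies that are not the experiment's actual values:

```python
import numpy as np

# Illustrative numbers only (not measured values from the NIST experiment).
n_th = 30.0                    # initial thermal phonon occupancy
n_min = 0.2                    # cooling floor from residual heating
gamma_opt = 2 * np.pi * 50e3   # cooling (damping) rate induced by the drive, rad/s

def phonons(t):
    """Mean phonon number during a red-detuned cooling pulse."""
    return n_min + (n_th - n_min) * np.exp(-gamma_opt * t)

t_pulse = 20e-6  # a 20-microsecond cooling pulse
print(phonons(0.0), phonons(t_pulse))  # drops from 30 to near the floor
```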

To create entanglement, researchers applied microwave pulses in between the frequencies of the two drums, higher than drum 1 and lower than drum 2. These pulses entangled drum 1 phonons with the cavity’s photons, generating correlated photon-phonon pairs. The pulses also cooled drum 2 further, as photons leaving the cavity were replaced with phonons. What was left was mostly pairs of entangled phonons shared between the two drums.

The duration of the pulses was crucial to entangling the phonon pairs: researchers discovered that the microwave pulses needed to last longer than 4 microseconds, with 16.8 microseconds ideal, to strongly entangle the phonons. During this period the entanglement became stronger and the motion of each drum increased because they were moving in unison, a kind of sympathetic reinforcement, Teufel said.

Researchers looked for patterns in the returned signals, or radar data. In the classical world the results would be random. Plotting the results on a graph revealed unusual patterns suggesting the drums were entangled. To be certain, the researchers ran the experiment 10,000 times and applied a statistical test to calculate the correlations between various sets of results, such as the positions of the two drums.

“Roughly speaking, we measured how correlated two variables are — for example, if you measured the position of one drum, how well could you predict the position of the other drum,” Teufel said. “If they have no correlations and they are both perfectly cold, you could only guess the average position of the other drum within an uncertainty of half a quantum of motion. When they are entangled, we can do better, with less uncertainty. Entanglement is the only way this is possible.”
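Teufel's point can be reproduced with toy data: two position records that individually just look noisy, but are strongly correlated, let you predict one drum from the other with far less uncertainty than guessing the average. The numbers below are illustrative, not experimental values:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000  # number of repeated runs

# Toy data: each drum's position is individually "hot" (large spread),
# but the two are strongly anti-correlated, as for an entangled pair.
common = rng.normal(0, 2.0, n)
x1 = -common + rng.normal(0, 0.3, n)
x2 = common + rng.normal(0, 0.3, n)

# Best linear prediction of x2 from x1, and the residual uncertainty.
slope = np.cov(x1, x2)[0, 1] / np.var(x1)
residual_std = np.std(x2 - slope * x1)

# Knowing x1 shrinks the uncertainty in x2 well below its raw spread.
print(np.std(x2), residual_std)
```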

“To verify that entanglement is present, we do a statistical test called an ‘entanglement witness,’’’ NIST theorist Scott Glancy said. “We observe correlations between the drums’ positions and momentums, and if those correlations are stronger than can be produced by classical physics, we know the drums must have been entangled. The radar signals measure position and momentum simultaneously, but the Heisenberg uncertainty principle says that this can’t be done with perfect accuracy. Therefore, we pay a cost of extra randomness in our measurements. We manage that uncertainty by collecting a large data set and correcting for the uncertainty during our statistical analysis.”
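A standard witness of this type is the Duan criterion: with conventions where each vacuum quadrature has variance 1/2, any separable two-mode state obeys Var(x1 + x2) + Var(p1 − p2) ≥ 2, while entangled two-mode squeezed states violate the bound. A sketch of the test on simulated data (the squeezing parameter r is hypothetical, and this is not NIST's actual analysis pipeline):

```python
import numpy as np

rng = np.random.default_rng(42)
r, n = 1.0, 10_000  # squeezing strength (hypothetical) and number of runs

# Correlated quadrature samples of a two-mode squeezed state.
# Convention: [x, p] = i, so each vacuum quadrature has variance 1/2.
v_sq, v_anti = np.exp(-2 * r) / 2, np.exp(2 * r) / 2
xs = rng.normal(0, np.sqrt(v_sq), n)    # squeezed:      (x1 + x2)/sqrt(2)
xa = rng.normal(0, np.sqrt(v_anti), n)  # anti-squeezed: (x1 - x2)/sqrt(2)
ps = rng.normal(0, np.sqrt(v_sq), n)    # squeezed:      (p1 - p2)/sqrt(2)
pa = rng.normal(0, np.sqrt(v_anti), n)  # anti-squeezed: (p1 + p2)/sqrt(2)
x1, x2 = (xs + xa) / np.sqrt(2), (xs - xa) / np.sqrt(2)
p1, p2 = (pa + ps) / np.sqrt(2), (pa - ps) / np.sqrt(2)

# Individually, each drum just looks hot: variance well above vacuum's 1/2.
print(np.var(x1), np.var(p2))

# But the witness on joint quadratures falls below the separable bound of 2,
# certifying entanglement (expected value ~ 2*exp(-2r)).
witness = np.var(x1 + x2) + np.var(p1 - p2)
print(witness)
```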

Highly entangled, massive quantum systems like this might serve as long-lived nodes of quantum networks. The high-efficiency radar measurements used in this work could be helpful in applications such as quantum teleportation — data transfer without a physical link — or swapping entanglement between nodes of a quantum network, because these applications require decisions to be made based on measurements of entanglement outcomes. Entangled systems could also be used in fundamental tests of quantum mechanics and force sensing beyond standard quantum limits.

MISC

Subscribe to Paradigm!

Medium. Twitter. Telegram. Telegram Chat. Reddit. LinkedIn.

Main sources

Research articles

Advanced Quantum Technologies

PRX Quantum

Science Daily

SciTechDaily

Quantum News

Nature
