Topological Quantum Computing

WOMANIUM Global Quantum Media Initiative — Winner of Global Quantum Media Project

FEROZ AHMAD فيروز أحمد
Quantum Engineering
19 min read · Aug 7, 2023


Table of Contents

01) Vortex Theory of Atoms
02) Nuanced Atomization through Vortex Rings
03) Peter Tait’s Knot Theory
04) Fundamentals of Knot Theory
05) The Significance of Knot Invariants
06) Exploration and Formulation of Kauffman Invariant
07) The Kauffman Invariant
08) Topological Equivalence
09) Complexities of Knot Invariants
10) Topological Quantum Field Theory
11) Topological Equivalence and Amplitudes in TQFT
12) Knot Invariants and Ed Witten’s Contributions
13) Proposed TQFT Computer
14) Flashback of Prehistory of Topological Quantum Computing
15) Topological Phases of Matter
16) Multiple Quasiparticles
17) Preparing Quasiparticle “Ket” and “Bra” States
18) Topological Phases of Matter Rules
19) Statistics and Non-Abelian Phenomena
20) Non-Abelian Statistics
21) Implications for Quantum Computation
22) Quantized Hall Effect
23) Exemplar Quantum Hall Sample
24) Fractional Quantum Hall Fluid
25) Topological Quantum Computation
26) Initialization and Measurement of States
27) Advantages

Introduction

Topological Quantum Computing (TQC) arises from a historical journey blending 19th-century physics with modern quantum insights. Rooted in Lord Kelvin’s fluid dynamics and vortex rings, TQC explores topological phases of matter for quantum information processing. From Kelvin’s notion of smoke-ring atoms to Peter Tait’s knot theory and its later link to quantum fields, TQC offers resilient qubits via topological protection and braided quasiparticles for computation. The convergence of history, mathematics, and quantum mechanics underpins TQC. This article explores the profound role of knot theory in shaping TQC’s development while providing a foundational understanding of TQC principles.

Photo by Armand Khoury on Unsplash

Historical Overview

The genesis of topological quantum computing can be traced back to the eminent physicist Lord Kelvin, whose collaborative efforts with Peter Tait laid the groundwork for their further explorations. In 1867, the duo’s fascination with fluid dynamics spurred Tait to create the “smoke ring cannon,” a device that produced remarkably stable vortex rings. This novel apparatus piqued Kelvin’s interest in fluid stability, leading to the formulation of the Kelvin circulation theorem. The theorem concerns the behavior of frictionless fluids, of which modern superfluids are the closest realization, and Kelvin conjectured that vortex rings in such a fluid could represent atoms, thus laying the foundation for the Kelvin vortex theory of atoms.

Smoke rings. Image Credits [1]

Vortex Theory of Atoms

Kelvin’s intellectual odyssey underwent a convergence of profound revelations, notably stemming from the stability of the vortex rings generated by Tait’s ingenious apparatus. This pivotal insight crystallized into the Kelvin circulation theorem, a foundational principle within fluid dynamics, which posited the perpetual persistence of vortex rings in an environment devoid of friction. Notably, this idea connects to the much later discovery of superfluids, specifically superfluid helium, renowned for its non-dissipative behavior. Within this context, Kelvin ventured to speculate on the conceivable role of a medium called the ether in upholding electromagnetic waves. In an intriguing proposition, he suggested that vortex rings circulating within this hypothetical ether could represent atoms. This conceptual leap marked the inception of the Kelvin vortex theory of atoms, encapsulating a transformative moment in his intellectual trajectory.

Nuanced Atomization through Vortex Rings

Expanding upon the foundational principles of the vortex theory, Kelvin embarked on a conceptual journey to depict hydrogen’s elemental structure as a simple ring of fluid circulation. This foundational idea served as a launching pad for further exploration, wherein Kelvin extended his insights to imagine more intricate fluid patterns consisting of imperceptible rings, potentially representing distinct atoms. This innovative perspective resonated with the stability highlighted by the Kelvin Circulation theorem.

Kelvin’s Vortex Theory of Atomic Structure. Image Credits [2]

Moreover, Kelvin’s inquiry extended into the domain of molecular chemistry, proposing that the interweaving of two circulation rings might yield helium molecules. While these suppositions may appear speculative through a modern lens, they were earnestly examined within the scientific dialogues of the 19th century. Prominent luminaries such as Maxwell and Kirchhoff were captivated by these notions, underscoring their significance in shaping the scientific discourse of that era.

Peter Tait’s Knot Theory

In the context of Kelvin’s vortex theory, a divergent perspective emerged through the lens of Peter Tait. Tait articulated reservations concerning the theory’s perceived inability to furnish predictive insights into elemental attributes such as mass, electrical traits, and spectroscopic properties. Rooted in the absence of pragmatic predictions, Tait’s skepticism prompted him to withhold his endorsement of the vortex theory. Nevertheless, despite Tait’s reservations, Kelvin and Maxwell steadfastly championed the theory. Their unwavering commitment to extracting predictive potential from its intricate concepts marked a pivotal phase.

Peter Tait Classification Scheme for Atoms. Image Credits [4]

Tait’s temporary departure from these debates, and his subsequent work alongside Kelvin and Maxwell, marked a turning point after which their intellectual trajectories diverged. Tait’s scholarly journey eventually led him toward an unforeseen pursuit: an exhaustive exploration and tabulation of knot configurations within the hypothetical ether. This unintended exploration laid the foundation for knot theory, a mathematically enduring endeavor, distinct from atomic theory yet leaving an indelible imprint on the landscape of mathematical discourse.

Fundamentals of Knot Theory

Tait’s venture into knot theory brought forth a central and enduring question: the determination of topological equivalence between distinct knot configurations. This intricate inquiry revolved around discerning whether two different knots could be transformed into one another through deformation without breaking any strands. Mathematicians grappled with this quandary over the next 150 years. In response, a range of tools and methodologies was developed to address this perplexing challenge, with a pivotal protagonist emerging: the knot invariant.

Original Image Variant: [4]

The Significance of Knot Invariants

Within the domain of knot theory, the knot invariant assumes a pivotal role. It constitutes a well-defined set of principles that translate the visual depiction of a knot into a discrete mathematical construct, often materializing as a numerical value or polynomial expression. Of utmost importance, knots demonstrating topological equivalence yield identical results under these rules, thereby establishing a discerning criterion: distinct outcomes signify non-equivalence between knots. Illustratively, the Kauffman Invariant exemplifies a compelling instance of a knot invariant, meticulously fashioned to encapsulate the essence of knot equivalence. This transformation imbues the abstract concept of a knot with a quantifiable mathematical dimension, providing a structured analytical perspective that holds the promise of delving into the inherent nature of knots.

Exploration and Formulation of Kauffman Invariant

In the panorama of strategies devised to untangle the enigma of knots, the Kauffman Invariant emerges as a prominent exemplar. Intertwined intricately with the Jones Polynomial, this construct unfolds through a meticulously orchestrated sequence of operations. A pivotal juncture in this process involves the selection of a numerical parameter denoted as “A,” initially variable before assuming a definite value. The ensuing rules emanate from this foundational choice, leading to the substitution of specific knot configurations with meticulously crafted combinations. This method captures the intricate dynamics inherent in knot structures. The harmonious resonance between the Kauffman Invariant and the Jones Polynomial lends profound significance to the former, culminating in a precise formalism that unveils the underlying mathematical essence of knot equivalence.

The Kauffman Invariant

The Kauffman Invariant, an inventive approach for distinguishing knot equivalence, assigns unique polynomials to different knots. It builds upon the Jones Polynomial, initially conceived by Vaughan Jones in 1985 and later expanded by L. Kauffman in 1987. This framework is pivotal for knot theory, offering clear guidelines for knot transformation. It employs a parameter ‘A,’ initially variable and subsequently assigned a value.

The main rules, written out symbolically after the figure below, are:

  1. When one strand crosses another, the crossing is replaced by a sum of two diagrams: a vertical uncrossing carrying a coefficient A and a horizontal uncrossing carrying a coefficient 1/A. The same replacement, rotated by 90 degrees, handles the opposite type of crossing.
  2. A simple loop without crossings is replaced by the factor d.

Original Image Variant: [4]
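In the compact notation often used for the Kauffman bracket, the two rules read roughly as follows, where L_x denotes a diagram containing a given crossing, and the two terms on the right are the same diagram with that crossing replaced by the vertical and horizontal uncrossings (which smoothing carries A versus 1/A is a convention set by the sense of the crossing):

```latex
\langle L_{\times} \rangle \;=\; A \,\langle L_{)(} \rangle \;+\; A^{-1} \langle L_{\asymp} \rangle ,
\qquad
\langle L \sqcup \bigcirc \rangle \;=\; d \,\langle L \rangle ,
\qquad
\langle \bigcirc \rangle \;=\; d .
```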

Example: Evaluating Kauffman Invariant

In the context of topological quantum computing, we illustrate the evaluation of the Kauffman Invariant through a straightforward example involving a simple knot. Initially, a seemingly intricate configuration reveals itself to be a basic loop subjected to folding. The evaluation begins by identifying the crossings within the loop, leading us to focus on the lower crossing. This crossing conforms precisely to the first rule of the Kauffman Invariant. Consequently, employing a pictorial representation, we replace the crossing with two distinct diagrams, carrying coefficients A and 1/A respectively. Further iterations apply the same rules to the newly generated diagrams, using the 90-degree-rotated form of the first rule where required.

Original Image Variant: [4]

The subsequent steps involve analogous diagram replacements, generating a sequence of configurations associated with specific coefficients. Through successive transformations, isolated loops free of crossings appear, each of which can be substituted with a designated factor: d for a single loop, d² for two, and so forth. Terms are then grouped for simplification based on the following relationships:
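(The figure listing these relationships is not reproduced here; in the standard Kauffman bracket formalism the grouping step presumably relies on the identities below, the second of which is exactly what makes the bracket insensitive to the second Reidemeister move.)

```latex
A \cdot A^{-1} \;=\; 1 ,
\qquad
d \;=\; -A^{2} - A^{-2} .
```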

A key insight emerges after grouping: the terminal expression features the factor d. This outcome carries profound implications, notably that the Kauffman Invariant of a fundamental loop adheres to the value d as per the second initial rule. Importantly, even as the loop undergoes varying degrees of complexity, the Kauffman Invariant remarkably retains its knowledge of the fundamental value d, thus illuminating the underlying principles of Knot Invariant functionality.
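As a small sanity check of this “memory” of the value d, here is a minimal symbolic sketch (using sympy; the variable names are ours, not from the original lecture) for the simplest folded loop, an unknot drawn with a single crossing, for one choice of its over/under sense: the A-smoothing leaves two disjoint loops, and the 1/A-smoothing leaves one.

```python
import sympy as sp

A = sp.symbols('A')
d = -A**2 - A**-2                     # standard Kauffman loop value

# Unknot drawn with one extra fold (a single crossing):
#   A-smoothing   -> two disjoint loops -> contributes A * d**2
#   1/A-smoothing -> one loop           -> contributes (1/A) * d
bracket = A * d**2 + (1 / A) * d

print(sp.expand(bracket))                  # A**5 + A
print(sp.expand(bracket - (-A**3) * d))    # 0: up to the factor -A**3, the answer is d
```

So, up to an overall framing factor, the folded loop evaluates to the same d as a plain loop, which is the point the example is making.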

Topological Equivalence

Applying the Kauffman rules offers a methodological avenue for assessing whether two knots are equivalent. If subjecting two knots to the rules yields distinct polynomials, the disparity immediately signifies that the knots are not topologically equivalent. (Matching polynomials, by contrast, do not on their own guarantee equivalence.)

Original Image Variant: [4]

Notably, this principle traces its origin to the pioneering efforts of Vaughan Jones in 1985, wherein a variant manifestation known as the Jones Polynomial was conceived. The notable impact of this contribution is underscored by Jones’s receipt of the prestigious Fields Medal, a testament to his groundbreaking work in the realm of knot theory. Subsequent to its inception, L. Kauffman further refined and simplified this concept in 1987, a line of work culminating in the seminal volume “Knots and Physics,” an intellectually intriguing exploration of the profound connections between knot theory and the physical sciences.

Complexities of Knot Invariants

Analyzing a given knot and determining its Kauffman invariant raises a critical question: how hard is the computation? Unfortunately, the evaluation of most knot invariants is exponentially complex, a challenge that is well recognized. The essence of this challenge becomes clear upon closer inspection. In the prior example, a seemingly complex loop with only two crossings yielded a manageable number of diagrams. For a knot with three crossings, however, the diagram count jumps to eight, escalating to 16 for four crossings, and so forth. For a diagram with 100 crossings, the tally surges to an overwhelming 2¹⁰⁰. Even the swiftest computational systems would require eons to resolve such complexity, so evaluating the Kauffman invariant for a knot of this intricacy stands as an exceptionally demanding computational task.

Original Image Variant: [4]
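To make the growth concrete, here is a rough back-of-the-envelope sketch; the throughput figure of a billion diagrams per second is an arbitrary assumption, chosen only to set a scale.

```python
# Resolving every crossing both ways expands an n-crossing diagram into 2**n terms.
diagrams_per_second = 1e9      # assumed throughput, purely illustrative

for n in (2, 3, 4, 10, 100):
    diagrams = 2 ** n
    years = diagrams / diagrams_per_second / (3600 * 24 * 365)
    print(f"{n:>3} crossings: {diagrams:.2e} diagrams, ~{years:.2e} years at 1e9 diagrams/s")
```

At 100 crossings the brute-force state sum would take on the order of 10¹³ years, which is the “eons” referred to above.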

Topological Quantum Field Theory

The formulation of Quantum Mechanics, as expounded by Richard Feynman, emphasizes the calculation of amplitudes through the path-integral approach. In this method, the amplitude of transitioning between initial and final configurations, or in the context of the partition function, involves summing over all conceivable space-time histories of the process. Notably, the discussion pertains to a two-dimensional system where time is depicted vertically. The focal point of this exploration lies in the realm of Topological Quantum Field Theory (TQFT), wherein amplitudes exclusively hinge upon the topological properties of the underlying process.
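Schematically, Feynman’s prescription assigns to a process an amplitude that sums a phase over every admissible space-time history:

```latex
\langle \text{final} \mid \text{initial} \rangle
\;=\; \sum_{\text{histories}} e^{\, i S[\text{history}] / \hbar}
\;=\; \int \mathcal{D}\phi \; e^{\, i S[\phi] / \hbar} ,
```

where S is the action of the theory; in a TQFT this amplitude depends only on the topology of the history, not on metric details such as durations or distances.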

Topological Equivalence and Amplitudes in TQFT

In the visual representation, time is portrayed vertically within a disk, encapsulating a space-time process. Within this framework, distinct points in time reveal the emergence, movement, and recombination of two particles and two holes, originating from the vacuum. This complex process necessitates inclusion in the path integral, adhering to Feynman’s dictum of accounting for all potential occurrences. Crucially, a salient aspect of TQFT becomes evident as we compare the amplitudes of this process and an alternative representation, wherein particles navigate each other’s paths in a different yet topologically equivalent manner. This phenomenon, while unconventional, demonstrates that solely the topological arrangement matters — how the entities traverse one another — rather than traditional factors like their velocities or interaction strengths.

Original Image Variant: [4]

Knot Invariants and Ed Witten’s Contributions

A profound implication of TQFT is its intimate connection to knot theory, as masterfully illuminated by Ed Witten, whose pioneering insights earned him the Fields Medal in the same year as Vaughan Jones. Within the framework of knot theory, knot invariants are mappings that depend solely on the topological configuration. Analogously, in TQFT, amplitudes exhibit a comparable behavior, stemming entirely from the topology of the input configuration. Notably, Ed Witten unveiled a remarkable association between certain TQFTs, such as the Chern-Simons variant, and the Jones Polynomial, a renowned mathematical construct in knot theory. Concretely, Witten’s work asserts that the amplitude of a process in Chern-Simons TQFT finds its expression through the Jones Polynomial of a corresponding knot (Witten, 1989). This striking linkage underscores the deep interplay between seemingly disparate mathematical realms.
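For reference, the action of the Chern-Simons theory Witten studied, and the schematic form of his result for an SU(2) theory at level k (normalization conventions vary between sources), can be written as:

```latex
S_{\mathrm{CS}}[A] \;=\; \frac{k}{4\pi} \int_{M} \mathrm{Tr}\!\left( A \wedge \mathrm{d}A + \tfrac{2}{3}\, A \wedge A \wedge A \right) ,
\qquad
\big\langle W(K) \big\rangle \;\propto\; V_{K}(q) , \quad q = e^{2\pi i/(k+2)} ,
```

where W(K) is a Wilson loop traced along the knot K in the fundamental representation and V_K is its Jones Polynomial.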

Proposed TQFT Computer

Building on Ed Witten’s insights, Mike Freedman proposed a novel avenue for exploiting Topological Quantum Field Theory. Freedman envisioned a laboratory in possession of a physical system realizing a topological field theory, enabling the measurement of amplitudes and, consequently, the determination of knot invariants by observing specific processes. The significance of this proposition lies in its ability to resolve an exponentially challenging problem in polynomial time: calculating Kauffman invariants, a task notorious for its computational complexity, becomes amenable through this methodology. The ability to measure amplitudes to discern the Jones Polynomial or the Kauffman Invariant of a knot introduces parallels to the early prehistory of quantum computation.

Flashback of Prehistory of Topological Quantum Computing

Yuri Manin’s 1980 insight recognized quantum mechanics’ computational superiority over classical mechanics. Complex quantum systems like the Hubbard model defy classical simulation, prompting the idea of constructing and measuring such systems in the lab rather than simulating them. This raised questions about the capabilities of quantum systems and the impact of errors. Freedman’s proposal added a new dimension: building a TQFT-based computer, with promising potential for universal quantum computation, robust error suppression, and empirical support for the existence of topological systems. Thus, Alexei Kitaev and Mike Freedman pioneered this field.

Topological Phases of Matter

A topological phase of matter is a physical system that realizes a topological quantum field theory: a system is termed a topological phase of matter when its long-distance, low-energy behavior is described by such a field theory.

Multiple Quasiparticles

A crucial aspect of nontrivial topological field theories is the degeneracy of the ground state in the presence of quasiparticle and quasihole excitations. Consider a physical system represented as a disk containing two particles and two holes. Despite fixing the positions of the particles, there exist two distinct wave functions describing the same energy state. This degeneracy arises due to distinct topologically inequivalent space-time histories.

Original Image Variant: [4]

For instance, arranging the two particles apart in a certain manner yields a space-time history distinct from another configuration. Consequently, ground states 1 and 2 of the system can be inequivalent, dependent on the topology of their space-time histories. Importantly, the dynamics within the bulk, disregarding edges, dictate that braiding quasiparticles, or manipulating their positions, leads to transitions between these degenerate ground states.

Original Image Variant: [4]

An intriguing question arises: How can we discern the distinctiveness of these two ground states when compared to conventional systems like superconductors or insulators, where ground state wave functions are solely determined by particle positions, regardless of their history?

Preparing Quasiparticle “Ket” and “Bra” States

The preparation of quasiparticle “ket” states encompasses various methods, involving intricate modifications to space-time histories to obtain particles in distinct ground states.

Original Image Variant: [4]

The subsequent creation of “bra” states, achieved through time reversal of kets, effectively returns particles to the vacuum. This process involves extracting particles from the vacuum to form kets and subsequently returning particles to the vacuum to yield bras.

Original Image Variant: [4]

Determining the overlap between ground states in a topological theory involves combining kets and bras to create knots. Applying the Kauffman rules to these knots, and noting that the combination yields two loops in this case, reveals a value of d². This observation holds for a TQFT of Jones type, where a single loop corresponds to the factor d, with d representing a parameter of the TQFT.

Original Image Variant: [4]

Consequently, the ground states are linearly independent, and hence genuinely distinct, unless d equals 1. Calculating overlaps between states, with or without braiding, amounts to evaluating the Kauffman invariant of the corresponding knots; this encapsulates the rules governing topological phases of matter.

Original Image Variant: [4]
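Concretely, writing |1⟩ and |2⟩ for the two pairings sketched above, the loop counting just described gives d² for overlaps of matching pairings and d for mismatched ones, so the Gram matrix of the two candidate ground states is (under these conventions):

```latex
M \;=\;
\begin{pmatrix}
\langle 1 | 1 \rangle & \langle 1 | 2 \rangle \\
\langle 2 | 1 \rangle & \langle 2 | 2 \rangle
\end{pmatrix}
\;=\;
\begin{pmatrix}
d^{2} & d \\
d & d^{2}
\end{pmatrix} ,
\qquad
\det M \;=\; d^{2}\left(d^{2} - 1\right) ,
```

which is nonsingular, and hence the two states are linearly independent, whenever d² differs from 1.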

Topological Phases of Matter Rules

Topological phases of matter are systems that, at long distances and low energies, are described by topological quantum field theories. The core principles governing these phases encompass the degeneracy of ground states in the presence of quasiparticles and/or quasiholes, the exclusive influence of braiding quasiparticles on transitions between these degenerate states, and the absence of local operators capable of mixing the ground states. These principles collectively imply that the quasiparticles exhibit non-Abelian statistics, underscoring the intricate nature of topological phases of matter.

Statistics and Non-Abelian Phenomena

To elucidate non-Abelian statistics, a foundational understanding of particle statistics is necessary. In standard quantum mechanics in three dimensions, exchanging two identical particles multiplies the wavefunction by a phase that squares to one: +1 (bosons) or -1 (fermions). However, this notion changes in two dimensions, where exchanges trace out knot-like space-time histories. The counterintuitive outcome is that exchanging particles twice need not yield the identity. This peculiarity, absent in three dimensions, fosters the emergence of non-Abelian statistics.

Exchange statistics illustrated for bosons (phase +1) and fermions (phase -1).


Original Image Variant: [4]
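In formulas, the familiar three-dimensional statement is that a single exchange of identical particles at positions r1 and r2 multiplies the wavefunction by a phase whose square is one:

```latex
\psi(\mathbf{r}_2, \mathbf{r}_1) \;=\; e^{i\theta}\, \psi(\mathbf{r}_1, \mathbf{r}_2) ,
\qquad
e^{2 i \theta} = 1 \;\Rightarrow\; e^{i\theta} = \pm 1 ,
```

with +1 for bosons and -1 for fermions. In two dimensions exchanges are governed by the braid group rather than the permutation group, so a double exchange need not act as the identity, and the “phase” can even become a matrix acting on degenerate ground states.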

Non-Abelian Statistics

Consider four fixed, identical particles on a plane. Assume a ground state degeneracy with two possible states.

Original Image Variant: [4]

When the particle positions are fixed, the wavefunction is a linear combination of these two states, with amplitudes a and b. Adiabatically dragging the particles around one another produces a new superposition, with final amplitudes a_f and b_f related to the initial ones by a 2x2 unitary matrix.

Non-Abelian statistics asserts that this matrix depends solely on the topology of the braid, irrespective of speed or smoothness. The non-commutativity of matrices underpins the term “non-Abelian.” Exchanging particles, thus altering the braid, leads to distinct unitary operators, an essential phenomenon with relevance to quantum computation.
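To make “non-Abelian” concrete, here is a minimal numerical sketch using one particular two-dimensional braid representation often quoted for Fibonacci-type anyons; the specific matrices and phase conventions are an assumption for illustration, not something fixed by the discussion above.

```python
import numpy as np

phi = (1 + np.sqrt(5)) / 2                     # golden ratio

# One commonly quoted 2x2 braid representation (Fibonacci-type anyons);
# the exact phases are convention-dependent and assumed here for illustration.
sigma1 = np.diag([np.exp(-4j * np.pi / 5), np.exp(3j * np.pi / 5)])
F = np.array([[1 / phi, 1 / np.sqrt(phi)],
              [1 / np.sqrt(phi), -1 / phi]])   # basis change; F @ F = identity
sigma2 = F @ sigma1 @ F

# Both exchange operators are unitary...
print(np.allclose(sigma1 @ sigma1.conj().T, np.eye(2)))   # True
print(np.allclose(sigma2 @ sigma2.conj().T, np.eye(2)))   # True

# ...but they do not commute, which is what "non-Abelian" means here:
print(np.allclose(sigma1 @ sigma2, sigma2 @ sigma1))       # False

# They do satisfy the braid-group relation s1 s2 s1 = s2 s1 s2:
print(np.allclose(sigma1 @ sigma2 @ sigma1, sigma2 @ sigma1 @ sigma2))  # True
```

The braid relation holds because exchanging particles 1-2 and 2-3 in either order, three times over, traces out the same braid; the non-commutativity of the two matrices is what makes the order of exchanges matter.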

Implications for Quantum Computation

Non-Abelian statistics holds implications for quantum computation. Viewing the two degenerate ground states as the logical 0 and 1 suggests a natural qubit. This approach offers a unique advantage: local operators cannot mix the ground states, rendering the qubit resilient against noise-induced errors. Such local noise processes remain confined, preserving qubit superpositions and safeguarding against decoherence. The existence of topological phases of matter is substantiated experimentally, notably by the quantized Hall effect. This behavior holds promise for quantum computing applications, motivating exploration into harnessing non-Abelian statistics to create robust, error-resistant qubits.

Quantized Hall Effect

The initial discovery of the Hall effect in 1879 by Edwin Hall established that a magnetic field applied to a metal induces a Hall voltage perpendicular to the current flow, growing linearly with the field. A century later, a closely related experiment was conducted with significant modifications: operating at extremely low temperatures, utilizing a two-dimensional electron system with substantial disorder, and employing strong magnetic fields. Despite these alterations, the fundamental principle remained the same. However, distinct observations emerged: the longitudinal resistance displayed intricate fluctuations, reaching zero at multiple points, while the Hall resistance exhibited plateaus.

Image Credits [5]

Notably, each plateau in the Hall resistance corresponded to a minimum in the longitudinal resistance, signifying different phases of matter within the experiment. Remarkably, these phases constitute topological phases of matter, some of which possess nontrivial topological attributes, suggesting the potential for quantum computation. Our focus centers on a specific region of intriguing physics within this context.

Exemplar Quantum Hall Sample

One of the most exceptional physical samples for investigating the quantum Hall effect boasts a remarkable electron mobility of 31 million cm²/V-sec, enabling electrons to traverse nearly a millimeter without scattering. This sample is meticulously cooled down to 9 mK, unveiling a rich tapestry of phases of matter. These phases are believed to belong to the Kauffman invariant class of topological phases, where the distinctions between plateaus lie in slight variations of the constant A within the Kauffman Invariant.

Image Credits [6]

While considerable numerical analysis has affirmed these properties in the realm of the quantum Hall effect, experimental demonstration has proven more challenging than anticipated due to the demanding low-temperature environment. Ongoing efforts extend the pursuit of similar physics to various systems, encompassing quantum Hall states, chiral p-wave superconductors, superfluid He-3 films, cold atomic gases, atomic lattices, Josephson junction arrays, and bismuth-antimonide junctions, revealing the interdisciplinary nature of this captivating phenomenon.

Fractional Quantum Hall Fluid

Quasiparticles within dissipationless environments, such as the Fractional Quantum Hall Fluid, exhibit intriguing behavior that involves the braiding of vortices within a superfluid backdrop. This concept harks back to Kelvin’s visionary ideas from over a century ago, wherein a pristine and undisturbed fluid, akin to a superfluid, is present. In the context of the Quantum Hall effect, which unfolds in a two-dimensional space, these vortices persist as stable entities due to the absence of dissipation within the fluid medium. This realization aligns with Kelvin’s groundbreaking insights, underscoring the enduring significance of his contributions in this contemporary realm of study.

Original Image Variant: [4]

Topological Quantum Computation

The path toward achieving Topological Quantum Computation involves harnessing the degenerate ground states resulting from the presence of quasiparticles as qubits. The execution of unitary operations, or gates, on these ground states is realized through the intricate process of braiding quasiparticles around one another. Each distinct braid configuration corresponds to specific computational functions.

Original Image Variant: [4]

A conceptual quantum circuit — akin to a search or factoring circuit — can be reoriented, with time represented vertically, and mapped onto a 2-dimensional topological quantum field theory populated with particles. This mapping allows the translation of the quantum circuit’s operations into braids, where the manipulation of particle positions replicates the quantum computation.

The appeal of this approach is rooted in its robustness against noise-induced perturbations. When noise interacts with particles, causing fluctuations and perturbations, the qubit space remains unchanged as long as the same braid structure is maintained. A critical point arises: the topology of the braid must remain intact, whereas minor perturbations have negligible impact. This safeguarding mechanism, termed topological protection, underscores the appeal of this avenue for quantum computation.
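Continuing with the same assumed two-state braid representation used in the earlier sketch, composing a braid word into a single gate is just matrix multiplication; the resulting unitary depends only on the sequence of crossings, not on how quickly or smoothly each exchange is carried out. This is a toy illustration of the circuit-to-braid mapping, not a compiler for any real device.

```python
import numpy as np
from functools import reduce

phi = (1 + np.sqrt(5)) / 2

# Same assumed two-dimensional braid representation as in the earlier sketch.
sigma1 = np.diag([np.exp(-4j * np.pi / 5), np.exp(3j * np.pi / 5)])
F = np.array([[1 / phi, 1 / np.sqrt(phi)],
              [1 / np.sqrt(phi), -1 / phi]])
sigma2 = F @ sigma1 @ F
gens = {"s1": sigma1, "s2": sigma2,
        "s1'": sigma1.conj().T, "s2'": sigma2.conj().T}   # primed = inverse exchange

def braid_unitary(word):
    """Compose a braid word into one gate; earlier crossings act first."""
    return reduce(lambda U, g: gens[g] @ U, word, np.eye(2, dtype=complex))

# A qubit encoded in the two degenerate ground states, acted on by a short braid.
ket0 = np.array([1.0, 0.0], dtype=complex)
word = ["s1", "s2", "s1'", "s2", "s1"]
print(braid_unitary(word) @ ket0)
```

Any continuous deformation of the world lines that leaves the braid word unchanged produces exactly the same matrix, which is the algebraic face of topological protection.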

Initialization and Measurement of States

In this paradigm, qubit states can be initialized by “pulling” particle pairs from the vacuum, setting the stage for computation. The measurement of states entails attempting to return particle pairs to the vacuum. The ability to manipulate qubits through braiding while minimizing vulnerability to noise-induced errors holds substantial promise. This distinctive approach, centered on topological protection, resonates with researchers exploring quantum computation and fuels the enthusiasm surrounding the pursuit of Topological Quantum Computation.

Artist conception of the device

In this artist’s conception, a 2D system is employed, featuring numerous electrodes atop the structure.

Original Image Variant: [4]

By charging up the electrodes, particle-hole pairs are extracted from the vacuum, which is essentially a superfluid.

Original Image Variant: [4]

Employing a scanning tip device, one can selectively manipulate individual particles, entangling them through intricate braids before returning them to their original positions.

Original Image Variant: [4]

Subsequent annihilation of the particles serves as the readout.

Original Image Variant: [4]

Interestingly, only one particle needs to be moved to execute computations, a remarkable result supported by a fundamental theorem. This property is a defining characteristic of all topological quantum computers, whereby all computations can be accomplished by moving a single quasiparticle. The potential implications of this feature are substantial and may lead to significant advancements in the field of quantum computing.

Advantages

1. Noteworthy Noise Resilience: Topological quantum memory demonstrates a notable capacity to withstand the disruptive effects of noise, enhancing its suitability for robust quantum information storage.

2. Inherent Topological Robustness of Operations: The executed operations, or gates, in the realm of topological quantum computing exhibit a natural and inherent resilience due to their topological nature, reinforcing the stability and reliability of computational processes.

These distinctive features highlight the inherent advantages of topological quantum computing, offering a promising avenue for addressing critical challenges in quantum information processing and paving the way for advancements in quantum technology.

Conclusion

The concept of Topological Quantum Computing (TQC) emerges from a fascinating historical journey that intertwines the visionary ideas of 19th-century physicists with the cutting-edge insights of modern quantum physics. Rooted in Lord Kelvin’s intricate exploration of fluid dynamics and vortex rings, TQC delves into the realm of topological phases of matter and their potential applications in quantum information processing. This captivating narrative spans Kelvin’s intriguing conjecture that smoke rings could correspond to atoms, to the development of knot theory by Peter Tait, and the subsequent linkage between knot invariants and quantum field theory. TQC offers the promise of resilient qubits, protected against decoherence by their topological nature, while computational operations are performed by braiding quasiparticles to transform quantum states. This convergence of historical curiosity, mathematical rigor, and quantum mechanics forms the foundation of Topological Quantum Computing, a field poised to reshape the landscape of quantum technology.

References


[1] A. Boyd, “№3184: The Knots of Peter Guthrie Tait,” Engines of Our Ingenuity. [Online]. Available: https://uh.edu/engines/epi3184.html

[2] A. Sossinsky, Knots: Mathematics with a Twist, Harvard University Press, 2002; W. Thomson, Proc. Roy. Soc. Edinburgh, vol. 6, pp. 94–105, 1867.

[3] D. S. Richeson, “Why Mathematicians Study Knots,” Quanta Magazine, Oct. 31, 2022. [Online]. Available: https://www.quantamagazine.org/why-mathematicians-study-knots-20221031/

[4] S. Simon, “Topological Quantum Computing (Part 1) — CSSQI 2012,” Institute for Quantum Computing, Nov. 23, 2012. [Online]. Available: https://www.youtube.com/watch?v=FAiiXp9IoBk

[5] B. Roostaei, Novel Phenomena in Confined Electronic Systems, 2007.

[6] J. S. Xia et al., “Electron Correlation in the Second Landau Level: A Competition Between Many Nearly Degenerate Quantum Phases,” Phys. Rev. Lett., vol. 93, no. 17, p. 176809, Oct. 2004. [Online]. Available: https://link.aps.org/doi/10.1103/PhysRevLett.93.176809

