“And yet it moves not…” - Reflections on the Quantum Zeno ‘Paradox’

— To collapse or to decohere

(by Swagato Saha) (Cogito — Issue 5, July ‘23)

‘Unitary, Immovable, Unchanging Being’ — Zeno’s Arrow

Foreword: “And yet it moves” (Italian: Eppur si muove) is a phrase attributed to the Italian mathematician, physicist and philosopher Galileo Galilei (1564–1642) in 1633 after being forced to recant his claims that the Earth moves around the Sun, rather than the converse.
Zeno, a pre-Socratic Greek philosopher of the 5th century BCE, in his attempts to prove the Parmenidean metaphysics of a ‘Unitary, Immovable, Unchanging Being’, is believed to have set up the Arrow Paradox — ‘motion is impossible’: at no instant in time can an arrow occupy more than one place; therefore, at no instant does it leave its initial position, and it must remain there eternally.
While the mathematics of infinitesimal calculus proves such a position problematic, one can imagine, in the light of developments of 20th century Quantum Theory — a partisan of old Copenhagen valiantly protesting — ‘And yet it moves not.’

Introduction: Originally conceived in its complete scope by Sudarshan and Misra, the aptly named Quantum Zeno Effect refers to the (seemingly, at least) paradoxical phenomenon of continuous observation of a quantum system indefinitely slowing down its decay (or, more generally, its time evolution). There are, however, important precursors to this development which, for the purposes of a more meaningful engagement, must be sketched out. Sudarshan and Misra’s seminal work [1] comes in the more or less immediate aftermath of Degasperis, Fonda and Ghirardi’s “Does the lifetime of an unstable system depend on the measuring apparatus?” [2], to which the response of the former is, somewhat surprisingly, that it does — the observed lifetime depends on the details of the observation (more on this later). Equally insightful are Turing’s intimations in ’54, reported by his student and colleague Robin Gandy, which not only mathematically demonstrate the strange implications of the Zeno Effect as entailed by Quantum Theory, but may also be seen as indicative of Turing’s attempts to conceptualise Quantum Computing in his final years (more on this later).
The original paper, insightful though it is, makes no note of the indeterminacy principle (time-energy indeterminacy), which was brought to attention in Ghirardi’s ’79 paper [3] — where it is argued that properly including energy-time uncertainty relations in the context of ‘measurement’ resolves the paradox, proving it invalid as unphysical, and revealing instead that the decay does depend on the intrinsic lifetime of the system under study. To put this in context, it appears consistent with arguments that insist on a fundamental distinction between what is called the Quantum Zeno Effect (a well-established, experimentally demonstrable slowing of decay) and the Quantum Zeno Paradox (perhaps an incomplete representation of the former which, on account of its incompleteness, leads to absurd ends — ‘complete freezing’). One is, however, quite in line here to argue that these distinctions are themselves ad hoc and questionable, both from an experimental perspective (Itano [4] sees no essential difference between the process of transition, understood as the Q.Z. Effect, and that of decay, understood as the Q.Z. Paradox), and from a theoretical one, where of course there is no formal difference between the mathematical description of a decay process and that of any other time evolution. In addition, there also exist arguments, appealing to this Quantum Zeno phenomenon among others, which see the time-energy uncertainty relation as less universally valid, in some sense secondary to the more fundamental, canonical position-momentum uncertainty.
Finally, the fact that the original paper is often seen as punctuating the shift in foundational quantum theory from the Copenhagen ‘black-box’ interpretation of ‘Measurement’/’Collapse’ to a more mechanistically motivated theory of ‘Decoherence’ suggests that we are dealing here with fundamental questions of foundation, which cannot be resolved by brute experimentation (since all interpretations of Quantum Theory are mathematically and predictively equivalent), and instead require a certain philosophical engagement. In other words, the legitimacy of the Quantum Zeno Paradox depends largely on what interpretation of Quantum Mechanics one deems proper. In a way, this also indicates how, for the sake of a completeness of Quantum Theory in which such paradoxical conundrums may be unambiguously resolved, one is required to take the oft-neglected aspect [“Shut up and calculate!”] of interpretation as equally constitutive, if not more so.
This section, focusing more on the theoretical end of things, will address in greater detail the aforementioned arguments.

‘Sudarshan and Misra’ — As is the case in Classical Physics — where an analysis of possible symmetry relations, or the lack thereof, may reveal the legitimacy of, say, the system’s Hamiltonian — or in Thermodynamics — where the change in Free Energy is a check for the feasibility of a process (whether it can or cannot occur) — so it is in Quantum Theory that the existence of certain information (an observable) pertaining to the quantum system depends on the definability of the corresponding Operator. As such, the first few sections of the paper deal with whether it is permissible within Quantum Theory to define an operator for continuous measurement — whereas usual instantaneous measurement may be understood as discrete points over a probability density function, continuous measurement, by extension, lacked a ‘quantum-theoretic algorithm’ and prima facie could not be ascribed any physical-operational meaning.
The authors denote the latter as P(t) and the former as p(t), and much of the mathematics that follows is an analysis of associated holomorphisms and existence lemmas. Yet this should not be understood as a feature of general Quantum Theory that allows for the easy definition and malleability of Operators — one may not arbitrarily go on defining any number of Operators without the least consideration of how (un)physical they may be. In fact, the authors note that, from the standpoint of contemporary physics, since there is no limiting concept of ‘an elementary interval of time’, the idea of continuous measurement may not, in principle, be ruled out a priori as unphysical.

At this level, one need not even explicitly invoke the Quantum Zeno Paradox as motivating this line of thought. Rather, as the authors note, it is motivated solely by concerns of ‘completeness’ — in order to be a complete theory, quantum mechanics must either rule out the possibility of continuous measurement by positing an elementary unit of time (the ‘chronon’ — but this is part of a later hypothesis and does not feature in the paper), or provide precise algorithms for cases of continuous measurement. And it is by following the prescription of the mathematics, which is relatively unambiguous, that we arrive at the paradoxical juncture of the so-called Zeno Effect.
“The quantum Zeno effect is a feature of quantum-mechanical systems allowing a particle’s time evolution to be slowed down by measuring it frequently enough with respect to some chosen measurement setting.”
(We will present a simplified, but faithful, version of the mathematical description of the phenomenon, sketched just below. This version is often taken as a point of reference in the standard literature, as properly distilling the content the authors wished to convey.)
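What follows is a minimal sketch of that standard textbook rendering, under the usual idealisations (instantaneous, ideal projective measurements; the notation τ_Z for the so-called Zeno time is ours here), rather than the operator-theoretic treatment of the original paper. Consider a system prepared in a state |ψ⟩ and evolving under a Hamiltonian H. For short times, the survival probability is quadratic rather than exponential:

p(τ) = |⟨ψ| e^(−iHτ/ħ) |ψ⟩|² ≈ 1 − τ²/τ_Z², with 1/τ_Z² = (⟨ψ|H²|ψ⟩ − ⟨ψ|H|ψ⟩²)/ħ².

If N ideal measurements are performed at equal intervals over a total time t, each projecting the system back onto |ψ⟩, the probability of finding it still undecayed is

P_N(t) = [p(t/N)]^N ≈ [1 − t²/(N²τ_Z²)]^N ≈ exp(−t²/(N τ_Z²)) → 1 as N → ∞.

For finite N this is approximately an exponential decay with an effective lifetime of order τ_Z²/τ, where τ = t/N is the interval between measurements; the more frequent the measurements, the longer the observed lifetime. In the limit of continuous observation the system never leaves its initial state at all: the arrow, watched closely enough, does not move.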

With this, we turn our attention to the concluding discussion of Sudarshan and Misra’s entry. It is unsurprising that this section presents a series of highly non-trivial observations and arguments — not the least of which is the paradoxical discovery, as the mathematics demonstrates, that the “observed lifetime” of an (unstable) particle is relatively independent of its intrinsic ‘lifetime’ or ‘Hamiltonian’, depending instead on ‘the details of the observation process’, such as measurement frequency. At the time, at least, there were no empirical indicators that this is so.

To the extent that one is not opposed to imputing to ‘Operators of continuous measurement’ a definite physical-operational meaning, one is required to account for the deviations from predicted behaviour as observed, for example, in the bubble chamber experiments that the authors refer to. While it may be possible to argue that all observations are strictly ‘non-ideal’ and do not really approach the theoretical limit (continuum) in which the Zeno Paradox is said to be manifest, there should nonetheless be a tendential increase in the observed lifetime of the particle with increasing measurement frequency. (Here, the ’90 experiments on ⁹Be⁺ ions by Itano et al. may be cited as later experimental verification of this prediction. Although this, too, is not without controversy.)

While the authors show interest in conceiving of possible experimental set-ups capable of falsifying the set of lifetime predictions, they additionally consider the more theoretical-foundational question —
“Natural though it seems, it is wrong to assume that the temporal evolution of a quantum system under continuing observation can be described by a linear operator of time-evolution such as T(t). It can be described only in terms of a persistent interaction between the quantum system and the classical measuring apparatus. When this is done the quantum Zeno’s paradox will either disappear or if it survives, at least, it will be understandable as the drastic change in the behaviour of the quantum system caused by its continuous interaction with a classical measuring apparatus. This point of view is at present only a program since there is no standard and detailed theory for the actual coupling between quantum systems with classical measuring apparatus. A beginning in this direction is made in a forthcoming paper.”

In other words, what is at stake here is the very interpretation of the wave-function and the associated collapse — known generally as the ‘Measurement Problem’. The Copenhagen Interpretation, championed by Bohr and also Heisenberg, takes at face value the actuality of the ‘wave-function’ and its ‘collapse’: the fundamental description of reality takes the form of a wave-function whose unitary evolution describes superposed states — superposition being a purely quantum mechanical effect with no classical correlate — while wave-function collapse is non-unitary; and unlike Classical Physics, where Laplace’s Demon demonstrates the in-principle deterministic closure of reality, it in principle even sets basic limits to what can be known. Interpretations thereafter, manifold as they are, try to get around this impenetrability of the collapse process — there must be a mechanism for collapse. [Suffice it to recall the famed polemic between Bohr and Einstein.]
‘Decoherence’ encompasses the class of interpretations which understand so-called ‘collapse’ as mediated by the ‘interaction’ of an ‘open quantum system’ with its classical environment in the form of the measuring apparatus. That is, ‘measurement’ of a quantum system means it is no longer ‘closed’, and ‘collapse’ is the Copenhagen name for this process by which it is assimilated within, or gives rise to, the Classical domain. Though the status of Decoherence itself as decisively resolving the ‘measurement problem’ is very much up for debate, if viewed through the lens of the Zeno Paradox, it does appear that Decoherence is generally able to avoid such paradoxical limits — where continuous observation is not to be treated in the form of a linear operator, as is suggested by Sudarshan and Misra.
(As per Decoherence, since any act of ‘measurement’ minimally couples the quantum system to its classical environment, it is not ‘closed’, the Zeno limit does not arise as the system under continuous observation is classically decohered.)
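In the standard density-matrix language (a schematic sketch, not tied to any particular decoherence model), the mechanism can be stated as follows. Suppose the system S couples to its environment E such that |s_i⟩|E₀⟩ → |s_i⟩|E_i⟩. Then an initial superposition Σ_i c_i |s_i⟩ yields the reduced state

ρ_S = Tr_E |Ψ⟩⟨Ψ| = Σ_{i,j} c_i c_j* |s_i⟩⟨s_j| ⟨E_j|E_i⟩,

and as the environmental records become effectively orthogonal (⟨E_j|E_i⟩ → 0 for i ≠ j), the off-diagonal terms are suppressed and ρ_S approaches the classical-looking mixture Σ_i |c_i|² |s_i⟩⟨s_i|, without any non-unitary ‘collapse’ being invoked for the system-plus-environment whole.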

Finally, certain interesting detours are considered by the authors as shedding light on some of the uncomfortable prospects opened up. ‘Close parallels’ between the Zeno Paradox and ‘findings of sensory deprivation in the study of consciousness’ may raise some concern as moving towards some kind of ‘Quantum Mysticism’ even, but this need not be the case if such studies are systematically undertaken. After all, if one remains within the ‘collapse’ picture, then, as von Neumann notes, the mathematics of quantum mechanics allows the collapse of the wave function to be placed at any position in the causal chain from the measurement device to the “subjective perception” of the human observer. To somewhat demystify things, let us bear in mind findings about the Quantum Zeno Effect playing a crucial role in avian navigation. [5]

What is interesting from a historical perspective, is the often-neglected role paradoxes play in the formation of scientific theory. While it is typically conveniently forgotten and scientific progress is held as mediated chiefly by experimental findings, there are indeed important exceptions. [Galileo’s apparently paradoxical discovery of uniform acceleration of a falling body, independent of its mass, also follows from a logical contradiction.] The Quantum Zeno Effect may be regarded as one such focal point, where a moment of speculative theoretical contemplation, opens up new scientific questions across domains of foundation-interpretation, theory-building, and experiments.

‘What of Uncertainty?’ — “It is shown that, if the uncertainty relations are properly taken into account, the arguments leading to the paradox are not valid. Moreover, by the same kind of arguments, it is shown that the dependence of the measured lifetime on the frequency of the measurement processes, even though present in principle, is practically not detectable.”

Taken from Ghirardi et al’s ’79 paper, where they primarily investigate the small-time behaviour of decay/non-decay probabilities, one finds here a rather pointed refutation of Sudarshan and Misra’s original thesis. We shall once again follow a simplified, but hopefully equally faithful, rendition of their mathematical arguments.

Therefore, what the time–energy indeterminacy relation (part of the indeterminacy principle) entails is that an increased frequency of the measurement process must correspondingly decrease the time duration of the measurement itself. But this decrease in the ‘measurement time’ implies that the energy spread (dispersion) of the state in which reduction occurs becomes increasingly large. The deviations from the exponential decay law at small times being crucially related to the inverse of the energy spread, the region in which the deviations are appreciable shrinks as one makes the measurement process shorter. The limiting case of an infinitesimal time interval hence corresponds to an infinite energy spread, and an explicit evaluation of these two competing tendencies shows that it is contentious, without taking this basic fact into account, to assert the actuality of the Quantum Zeno Effect. As such, the paradox is dissolved as unphysically violating Quantum Uncertainty. Similar conclusions are reached in other works too. [6]
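Schematically, and as a hedged reconstruction rather than the authors’ own notation: a measurement act of duration T can only reduce the system to a state whose energy spread satisfies ΔE ≳ ħ/T, by the time–energy relation. But the non-exponential (quadratic) regime of the decay law, on which the Zeno argument relies, extends only up to times of order ħ/ΔE ≲ T. So each time one shortens the measurement in order to repeat it more frequently, the window in which the freezing mechanism could operate shrinks at least as fast, and the limit of ‘complete freezing’ is never physically attained; what survives is a dependence of the measured lifetime on the measurement frequency that is, as the quotation above puts it, ‘present in principle’ but ‘practically not detectable’.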

Unfortunately, this does not provide a conclusive verdict and things get complicated here. In his 2019 publication [7], Urbanowski argues that “[…] within the Quantum Mechanics of Schrödinger and von Neumann, the status of these relations cannot be considered as the same as the status of the position–momentum uncertainty relations, which are rigorous. The conclusion is that the time–energy uncertainty relations cannot be considered as universally valid.”

This means that there are ways to remain within the Copenhagen interpretation and maintain the actuality of the paradox without falling into inconsistencies. It is all the more interesting that Urbanowski, who systematically refers to the Zeno Paradox, restricts his intervention to the Quantum Mechanics of von Neumann and Schrödinger. This draws further attention to the sensitivity of interpretation — it can no longer be construed as a safe domain for philosophical contemplation, disjoint from theory; it has actual consequences for what is to be held as theoretically sound.
A full engagement in these more recent developments is beyond our current scope.

It also seems relevant here to criticize the series of attempts that superficially distinguish the Zeno Effect from the Zeno Paradox — Itano discusses at length several such distinctions. In light of how such distinctions fail to arise in the domain of experiment as well as in that of theory, and given their plurality and mutual inconsistencies, it indeed seems reasonable, as Itano too expresses, to abandon the prospect of drawing such a line. The Zeno Paradox is but the limiting case of the Zeno Effect.

‘Quantum Zeno Effect, ‘Anti-Zeno’, and Adiabaticity’ — It is often insightful, when encountering some seemingly absurd result in Quantum Theory, to look for possible classical analogues. Classical analogues of the Zeno Effect, though obviously not equivalent, have been variously identified — from LC circuits to coupled pendulums. [8]
In contrast, the ‘anti-Zeno’ effect denotes accelerated decay rates of a quantum system subject to slow measurement. Kitano pursues possible similarities between Quantum Adiabaticity and the anti-Zeno Effect, in terms of their associated time evolutions in rotating frames. [9]
“It is very interesting to view the quantum Zeno effect from a rotating frame in which the free dynamics of the system is cancelled. In this frame, a series of projections is performed toward slightly different states one after another and the system evolves after these states. This effect was discussed by Aharonov and Vardi and was called the inverse quantum Zeno effect by Altenmuller and Schenzle.”
As such, the anti-Zeno Effect (the ‘inverse’ quantum Zeno effect of the quotation above) denotes the time evolution of a system induced by its continuous coupling to its environment. It may not always be possible to clearly tell apart Zeno and anti-Zeno effects in an experimental setting. Since the anti-Zeno Effect primarily induces time evolution — or, in cases where the Zeno Effect entails a slowing of spontaneous decay, the anti-Zeno Effect accelerates decay instead — it holds great promise of application in the guided transformation and control of quantum systems, as demonstrated in a subsequent experiment undertaken by Kitano.
Indeed, the Quantum Adiabatic Theorem too allows for similar transformations. “Gradually changing conditions allow the system to adapt its configuration, hence the probability density is modified by the process. If the system starts in an eigenstate of the initial Hamiltonian, it will end in the corresponding eigenstate of the final Hamiltonian.” [10]

Quantum Adiabatic Theorem

Despite the conceptual parallels, Kitano notes — “[…]the long-time contributions to the transition probability have different forms; the former shows a 1/(T^2) behaviour while the latter scales 1/T.”

Let us briefly consider the long-time behaviour of the Quantum Zeno Effect. While Ghirardi’s ’79 paper focuses primarily on small-time behaviour, studies to this end were undertaken by Paolo Facchi and Marilena Ligabò [11], who analyse the mathematical behaviour of the survival probability (p(t))^N when both the time t and the number of measurements N go to infinity.
What is observed is the dependence of the limiting probability on a real parameter α, which, roughly speaking, governs how fast t grows relative to N:
1) for 0 ≤ α ≤ 1/2, the system is frozen in its initial state and the QZE takes place;
2) for 1/2 < α < 1, the system exhibits classical behaviour;
3) for α = 1/2, the probability has a Gaussian behaviour;
4) for α ≥ 1, the limit probability becomes sensitive to the spectral properties of the state ψ.

Coming much later (2017) than the original discovery, this analysis does resolve some of the controversies — note how, for lower values of the parameter, the decay process evolves as per the QZE, driven by the measurement frequency; yet for higher values of the same parameter (α ≥ 1), the lifetime of the quantum system becomes the driving factor, as the toy numerical sketch below illustrates.
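The sketch below is purely illustrative and is not the Facchi–Ligabò analysis itself: it takes a two-level system whose survival probability between ideal projective measurements is p(τ) = cos²(ωτ/2), and assumes, for illustration only, that the parameter α fixes the scaling t = N^α (their precise definition may differ). All names and values in the code are our own choices.

```python
import numpy as np

# Toy model: a two-level system undergoing Rabi-type oscillation (hbar = 1).
# Between ideal projective measurements the survival probability over an
# interval tau is p(tau) = cos^2(omega * tau / 2).
omega = 1.0

def survival_after_measurements(n, alpha):
    """Probability of still finding the initial state after n equally spaced
    measurements, with the total time scaled (for illustration) as t = n**alpha."""
    t = n ** alpha            # total observation time
    tau = t / n               # interval between successive measurements
    p_single = np.cos(omega * tau / 2.0) ** 2
    return p_single ** n

for alpha in (0.25, 0.5, 0.75):
    for n in (10**2, 10**4, 10**6):
        p = survival_after_measurements(n, alpha)
        print(f"alpha = {alpha:4.2f}   N = {n:>9,d}   survival = {p:.4f}")
```

In this toy, the survival probability tends to 1 for α < 1/2 (freezing), settles to a constant of Gaussian form, exp(−ω²/4), at α = 1/2, and decays away for larger α, in qualitative agreement with the first three regimes listed above; for α ≥ 1 the interval between measurements no longer shrinks, the short-time expansion breaks down, and the outcome depends on the detailed dynamics, which is where the spectral sensitivity of the fourth regime enters.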

‘Speaking of Paradoxes…’ — Home and Whitaker [12] argue for the legitimacy of the QZE and its properly paradoxical status irrespective of ambivalences about the ‘collapse’ picture. In particular, they draw parallels between the Anti-Zeno Effect and Bell’s Theorems, in light of how both warrant a general abandoning of ‘local realism’ in physics — a measurement may very well be ‘non-local’ and yet have a lasting impact on the dynamics of the quantum system under study. This means that the QZE may not be merely ‘explained away’ by appealing to decoherentist interpretations. However, in the scope of our current study, we would like to argue from within the Copenhagen-Collapse interpretive paradigm — in the sense that the Copenhagen interpretation still seems to be an appropriate ‘first order’ of interpretation, to which modifications may be considered. Such is the position undertaken by David Chalmers [13] — although the QZE does force some changes.
A paradox, in the strict formal-logical sense as approached by Bertrand Russell, is necessarily self-referential [the ‘Barber paradox’]. Chalmers, therefore, projects onto the Zeno Paradox a similar self-referential form — ‘Is Measurement itself an observable?’
While it is close to commonplace today to argue for Decoherence, even more so in light of the Zeno Paradox [14], Chalmers insists on the old Copenhagen premise — with modifications in the form of Penrose’s objective collapse, Pearle’s interpretation, etc. Moreover, it seems, following Sudarshan and Misra’s insights, that for the sake of its completeness Quantum Theory cannot presuppose the existence of the Classical domain — in other words, the Classical domain must one way or another ‘emerge’ out of it. To the extent that such a presupposition is operative in Decoherence, this is one reason to still advocate the reality of ‘collapse’.

‘Conclusion: Turing Paradox’ — Based on some of his late interviews and on the letters of his friend and student Robin Gandy [15], Turing’s own development of the Quantum Zeno Paradox may be traced to as early as 1950.

Having formalised the concept of ‘computability’ qua ‘effective calculability’ in 1936, Turing thereafter variously turned his theoretical interests towards the ‘Uncomputable’. What sets Turing’s work apart from the bulk of 20th century Logic is his simultaneous engagement in both the logical and the physical domains. It seems fitting, therefore, that he sees in Quantum Theory a potential refutation of the Church-Turing thesis — understood in its stronger sense as: all naturally occurring mechanisms are computable (‘effectively calculable’) by a Universal Turing Machine.
Elsewhere, too, he considers the possibility of natural processes that are potentially ‘undecidable’ (uncomputable by a UTM), and is thus motivated to pursue a more universal theory of Computation.
Yet, as Hodges notes, Turing is attentive to the Copenhagen conundrum — the old dualism of unitary superposition and non-unitary ‘black-box’ collapse — and it is the latter that draws his attention as properly ‘uncomputable’ (a position taken up by Penrose as well). Interesting here is the fact that Quantum Computing, which has so far restricted itself to the former, unitary evolution — while introducing polynomial reductions for classically exponentially scaling algorithms — does nothing whatsoever to the Computability hierarchy [16]. In other words, as David Deutsch also observes somewhere, what is unsolvable for a classical Church-Turing type machine is unsolvable for the quantum computer as well. The prospect of tapping into the latter, non-unitary collapse, as considered by Turing, is indeed enticing. Thus, here one finds somewhat unexpected support for the ‘collapse’ camp.

In conclusion, there are, to our understanding, at least two well-founded reasons to continue to advocate some form of ‘wave-function collapse’. Besides the reproach against Decoherence [17] as not adequately solving the ‘measurement problem’, the question of the ‘completeness’ and ‘fundamentality’ of Quantum Theory is itself a major catalyst. Tied to this is the quasi-formal nature of paradoxes (such as the QZE) that arise within the interpretive domain of ‘wave-function collapse’ ([18] utilizes a version of the ‘Wigner’s friend’ paradox). Against this, it appears insincere to appeal to some pre-given sense of a classical world to which quantum mechanics is later coupled.

Secondly, referring to the paradigm hinted at by Turing, one may argue that the ‘right’ way to think of the classical phenomenological world as ‘emerging’ out of the more fundamental quantum order of reality is still best rendered within the dual categories of ‘unitary superposition’ and ‘non-unitary collapse’. Furthermore, the ‘irreversibility’ of ‘collapse’ allows for the consideration of a fundamental lack of time-reversal symmetry within the laws of physics (in general, the laws of physics work the same way forwards and backwards in time; ‘wave-function collapse’ would constitute a singular violation of this at the microscopic level).

Besides the aforementioned, more ‘foundationally engaged’ readings, it is indeed also possible to adopt a more modest, ‘experimentally oriented’ stance — questions of interpretation aside, the QZE (as distinct from Zeno’s Paradox) is a theoretically justifiable, empirically tested phenomenon that may improve our understanding of Quantum Theory and also enable the development of new technology. In fact, much of the renewed interest in the QZE is driven by its promise for Quantum Computing.

REFERENCES:

1. Misra, B.; Sudarshan, E. C. G. (1977). “The Zeno’s Paradox in quantum theory”. J. Math. Phys.
2. Degasperis, A.; Fonda, L.; Ghirardi, G. C. (1974). “Does the lifetime of an unstable system depend on the measuring apparatus?”
3. Ghirardi, Omero, Weber, Rimini (1979). “Small-Time Behaviour of Quantum Nondecay Probability and Zeno’s Paradox in Quantum Mechanics”.
4. Itano (2009). “Perspectives on the quantum Zeno paradox”.
5. Dellis, Kominis (2011). “The quantum Zeno effect immunizes the avian compass against the deleterious effects of exchange and dipolar interactions”.
6. Venugopalan, Anu; Ghosh, R. (1995). “Decoherence and the Quantum Zeno Effect”.
7. Urbanowski, K. (2019). “Critical look at the time–energy uncertainty relations”.
8. Pankovic, Vladan (2009). “LC Oscillating Circuit as the Simple Classical Analog of the Quantum Zeno Effect”.
9. Kitano, M. (1996). “Quantum Zeno effect and adiabatic change”.
10. Kato, T. (1950). “On the Adiabatic Theorem of Quantum Mechanics”.
11. Facchi, Paolo; Ligabò, Marilena (2017). “Large-time limit of the quantum Zeno effect”.
12. Home, Whitaker (1997). “A Conceptual Analysis of Quantum Zeno; Paradox, Measurement, and Experiment”.
13. Chalmers, McQueen (2022). “Zeno Goes to Copenhagen: A Dilemma for Measurement-Collapse Interpretations of Quantum Mechanics”.
14. Ballentine, L. E. (1990). Comment on “Quantum Zeno effect”.
15. Hodges, Andrew (2002). “Alan Turing” — plato.stanford.edu.
16. Hodges, Andrew (2005). “Can quantum computing solve classically unsolvable problems?”
17. Leggett, A. J. (2001). “Probing quantum mechanics towards the everyday world: where do we stand”.
18. Frauchiger, Renner (2018). “Quantum theory cannot consistently describe the use of itself”.
