Heisenberg’s principle, Shannon’s information, and nuclear (research) reactors

The information-theoretic formulation of Heisenberg’s Uncertainty Principle that Michael J.W. Hall (Griffith U, Brisbane), Masanao Ozawa (Nagoya U), Mark M. Wilde (Louisiana State U, Baton Rouge), and I proposed a while ago has been experimentally tested and verified by Georg Sulyok, Stephan Sponar, Bülent Demirel, and Yuji Hasegawa (all at the Vienna Atominstitut), using very precise measurements on neutrons emitted by their research nuclear reactor (a TRIGA Mark II). The results have been published in Physical Review Letters as an Editors’ Suggestion (the paper is also freely available on the arXiv).

Heisenberg (left), the father of the uncertainty principle, meets Shannon (right), the father of information theory, above the core of a TRIGA research reactor. The blue glow is caused by Cherenkov radiation. (Source: Wikipedia)

Heisenberg’s Uncertainty Principles

Heisenberg’s Uncertainty Principle (HUP) is often summarized as the statement that any act of measurement inevitably causes an uncontrollable disturbance to the measured system. Put in a more spectacular way, HUP would dictate that we can learn about the present, but at the cost of being unable to fully predict the future. In fact, Heisenberg, in his original paper, never claimed such a generally valid, all-encompassing statement. Instead, his intention was to construct a physically plausible (for the scientific community of that time, 1927) scenario in which the mathematical property of non-commutativity of quantum observables would have measurable consequences.

We can learn about the present, but at the cost of being unable to fully predict the future.

I think it is fair to put Heisenberg’s original work into perspective: though rigorous (at least by the standards of that time), it undoubtedly relies on over-idealized measurement models, like the famous and much-debated gamma-ray microscope thought experiment, in which the position of an electron is measured by photon scattering. This, of course, can hardly lead to any statement of general validity, and I believe that neither Heisenberg nor his contemporaries would have thought otherwise.

How to tame the general case then? Starting from the axioms of quantum theory (those about states, observables, and the ‘Born rule’) and proceeding in a purely geometric way, Robertson derived, in 1929, a relation that is usually presented as the mathematical formulation of HUP, namely,

\Delta {A} \Delta {B} \ge \frac12|\left<\psi|(AB-BA)|\psi\right>|

where \Delta {A}^2:=\left<\psi|(A-\left<\psi|A|\psi\right>)^2|\psi\right> and \Delta {B}^2:=\left<\psi|(B-\left<\psi|B|\psi\right>)^2|\psi\right> are the mean-square deviations of the two observables in state \psi.
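
As a quick textbook illustration (not part of Robertson’s original paper, but standard): take a spin-1/2 particle with A=\sigma_x and B=\sigma_z. Since \sigma_x\sigma_z-\sigma_z\sigma_x=-2i\sigma_y, Robertson’s relation becomes

\Delta{\sigma_x}\,\Delta{\sigma_z} \ge |\left<\psi|\sigma_y|\psi\right>|

For \psi an eigenstate of \sigma_y, both sides equal one and the bound is saturated; for \psi an eigenstate of \sigma_z, the right-hand side vanishes and the inequality becomes trivial (a point we will come back to below).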

At this point, however, any orthodox textbook (like, for example, the still excellent Nielsen & Chuang) will rightly point out that Robertson’s relation has nothing to do with a noise-disturbance relation: \Delta{A} and \Delta{B} cannot be interpreted as measures of ‘the accuracy with which A is measured’ and ‘the disturbance caused on the value of observable B’ without soon running into some sort of nonsense. The correct interpretation is the following: suppose that we have a very large number of particles, all in the same state \psi, and that we measure A on half of them and B on the remaining half; we would then observe that the statistics of the measurement outcomes obey Robertson’s inequality. Since no mention is made of the state of the particles after the measurements, it is clear that Robertson’s relation is not about the disturbance caused by the act of measurement, but rather about the limitations that quantum theory poses on the preparation of quantum states, which cannot be simultaneously sharp with respect to two incompatible observables.

It hence seems clear to me that we are dealing with two uncertainty principles:

  1. a static uncertainty principle, namely, Robertson’s inequality, and
  2. a dynamical uncertainty principle, namely, a statement that should establish a tradeoff between the accuracy with which one observable (A) is measured and the disturbance consequently caused on another, non-commuting observable (B).

Should one then give up on the search for a noise-disturbance relation à la Heisenberg, i.e., involving mean-square deviations and commutators? The answer is no: as Masanao Ozawa showed some time ago, with careful definitions of a ‘noise operator’ and a ‘disturbance operator,’ it is indeed possible to generalize Robertson’s relation, turning it into a tradeoff relation between accuracy (with which one observable, A, is measured) and disturbance (that said measurement introduces in the other observable, B). There has been some (heated) debate on this particular approach, but that would take us too far.
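
For the record (and with the caveat that the precise definitions of the noise \epsilon(A) and the disturbance \eta(B) are exactly what the debate is about), Ozawa’s relation can be written as

\epsilon(A)\,\eta(B) + \epsilon(A)\,\Delta{B} + \Delta{A}\,\eta(B) \ge \frac12|\left<\psi|(AB-BA)|\psi\right>|

which reduces to a pure noise-disturbance product bound, \epsilon(A)\eta(B)\ge\frac12|\left<\psi|(AB-BA)|\psi\right>|, only under additional assumptions on the measuring apparatus.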

The Information-Theoretic Formulation

Another static uncertainty principle is that discovered by Hans Maassen and Jos Uffink (in 1988), generalizing a proposal first made by David Deutsch (in 1983). Their relation looks like this:

H(A) + H(B) \ge k

where H(A) and H(B) denote the entropies of the statistical distributions of the outcomes of the measurements of A and B, and k=k(A,B) is a number that is strictly positive whenever A and B are incompatible. Whenever this is the case, the Deutsch-Maassen-Uffink relation prevents H(A) and H(B) from both being zero at the same time.
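
To be concrete: for observables with non-degenerate eigenbases \{\left|a_i\right>\} and \{\left|b_j\right>\}, the Maassen-Uffink bound takes the explicit form

k(A,B) = -2\log_2\max_{i,j}|\left<a_i|b_j\right>|

so that, for example, for a spin-1/2 particle with A=\sigma_x and B=\sigma_z (two mutually unbiased bases, with all overlaps equal to 1/\sqrt{2}), the bound equals one bit.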

The entropic formulation of the uncertainty principle has some features making it preferable, in some situations, to the usual formulation in terms of mean-square deviations. The two main reasons are the following:

  1. the lower bound k(A,B) does not depend on the state of the system being measured, while the lower bound in Robertson’s inequality becomes trivial whenever \psi is, for example, an eigenstate of either A or B;
  2. the entropies H(A) and H(B) do not depend on the numerical values of the possible outcomes (i.e., the eigenvalues of A and B) but only on their statistical distribution; on the contrary, the mean-square deviations \Delta{A} and \Delta{B} do depend on the numerical values of the eigenvalues of the two observables (for example, a simple relabeling of the outcomes can lead to very different values for the mean-square deviations, as the small example right after this list shows).
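
To make the relabeling point concrete: for two outcomes, 0 and 1, occurring with equal probability, \Delta=1/2 and H=1 bit; relabel the outcomes as 0 and 100, and the mean-square deviation jumps to \Delta=50, while the entropy remains exactly 1 bit.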

Even though the entropic formulation of the uncertainty principle is quite different from the traditional one given in terms of mean-square deviations, it falls in the same category, in the sense that it only captures the ‘static’ part of Heisenberg’s principle. Indeed, the Deutsch-Maassen-Uffink relation refers to the outcome statistics collected from many independent measurements of observables A and B on a very large number of particles, all prepared in the same state, but the states of the particles after the measurements never enter the analysis.

Again, one may wonder whether it is possible to prove an entropic tradeoff relation that captures the dynamical uncertainty principle. Indeed it is possible, and we did exactly that in a recent collaboration. Our formula looks as follows (a minimal qubit illustration is given right after the list below):

N(\mathfrak{m},A)+ D(\mathfrak{m},B)\ge k

where

  1. N(\mathfrak{m},A) measures the noise with which the measuring apparatus \mathfrak{m} measures the observable A,
  2. D(\mathfrak{m},B) measures the disturbance that the measuring apparatus \mathfrak{m} causes on the value of the observable B, and
  3. k=k(A,B) is the same number appearing also in the Maassen-Uffink relation (it is hence strictly positive whenever A and B are incompatible).
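
A minimal illustration, assuming only the qualitative properties listed above (in particular, that a perfectly accurate measurement has zero noise): take a qubit with A=\sigma_z and B=\sigma_x, and let the apparatus \mathfrak{m} be an ideal projective measurement of \sigma_z. Then N(\mathfrak{m},\sigma_z)=0, and, since k(\sigma_z,\sigma_x) equals one bit, the relation forces D(\mathfrak{m},\sigma_x)\ge 1 bit, which, for a two-valued observable, means that nothing at all can be predicted about the value of \sigma_x after the measurement.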

In information-theoretic terms (“as Shannon would say”) the above relation essentially describes the tradeoff between knowledge about A and predictability of B. It thus proves the statement that I wrote at the beginning, namely:

we can learn about the present (the value of A), but at the cost of being unable to fully predict the future (the value of B).

Read the full paper on the arXiv: http://arxiv.org/abs/1504.04200

Note added: I was tempted to title this post “Heisenberg meets Shannon in a reactor core,” just to add an item to the long list of people who have met Shannon in paper titles.
