…alias observational entropy!
In his 1932 book, von Neumann famously discusses a thermodynamic device, similar to Szilard’s engine, through which he is able to compute the entropy of a quantum state, obtaining the formula $S(\rho) = -\operatorname{Tr}[\rho \ln \rho]$. I discussed this in another post.
Probably fewer people know, however, that just a few pages after deriving his eponymous formula, von Neumann writes:
Although our entropy expression, as we saw, is completely analogous to the classical entropy, it is still surprising that it is invariant in the normal (note added, in the sense of “unitary/Hamiltonian”) evolution in time of the system, and only increases with measurements — in the classical theory (where the measurements in general played no role) it increased as a rule even with the ordinary mechanical evolution in time of the system. It is therefore necessary to clear up this apparently paradoxical situation.
What von Neumann is referring to in the above passage is the phenomenon of free or “Joule” expansion, in which a gas, initially contained in a small volume, is allowed to expand into a vacuum, thereby doing no work, but causing a net increase in the entropy of the universe, even though its evolution is Hamiltonian, i.e., reversible, all along.
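To put a number on it: for $N$ molecules of an ideal gas expanding freely from volume $V_i$ to $V_f$, the textbook result is $\Delta S = N k_B \ln(V_f/V_i)$, so merely doubling the volume already yields $\Delta S = N k_B \ln 2 > 0$, with no heat exchanged and no work done.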
In order to resolve this issue, von Neumann suggests that the correct quantity to consider in thermodynamic situations is not (what we today call) the von Neumann entropy, but another quantity, which he calls macroscopic entropy (and which today is called observational entropy):
$$S_{\mathrm{obs}}(\rho) = \sum_i p_i \ln\frac{V_i}{p_i}\,,$$
where $\{M_i\}_i$ denotes a fixed measurement, i.e., a POVM satisfying
$$M_i \ge 0\,,\qquad \sum_i M_i = \mathbb{1}\,,$$
$p_i = \operatorname{Tr}[\rho\, M_i]$ is the expected probability of occurrence of each outcome, and $V_i = \operatorname{Tr}[M_i]$ are “volume” terms. The measurement with respect to which the above quantity is computed is used by von Neumann to represent, in a mathematically manageable way, a macroscopic observer “looking at” the system: the system’s state is $\rho$, but the observer only possesses some coarse-grained information about it, and the amount of such coarse-grained information is measured by the macroscopic entropy.
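To make the definition concrete, here is a minimal numpy sketch (mine, not taken from the paper) that computes both entropies for a qubit state and a made-up unsharp POVM. It also illustrates the tension von Neumann points out: the von Neumann entropy is invariant under unitary evolution, while the observational entropy is free to change; and it checks, on this example, the general bound $S_{\mathrm{obs}}(\rho) \ge S(\rho)$.

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr[rho ln rho], in nats."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]          # drop zeros (0 ln 0 = 0)
    return float(-np.sum(evals * np.log(evals)))

def observational_entropy(rho, povm):
    """S_obs(rho) = sum_i p_i ln(V_i / p_i), with p_i = Tr[rho M_i], V_i = Tr[M_i]."""
    s = 0.0
    for M in povm:
        p = np.trace(rho @ M).real        # expected probability of outcome i
        V = np.trace(M).real              # "volume" of outcome i
        if p > 1e-12:
            s += p * np.log(V / p)
    return s

# A slightly mixed qubit state.
rho = np.array([[0.8, 0.1],
                [0.1, 0.2]])

# Hypothetical coarse measurement: an unsharp two-outcome POVM along z.
M0 = np.array([[0.7, 0.0], [0.0, 0.3]])
M1 = np.eye(2) - M0                       # completeness: M0 + M1 = identity

S_vn  = von_neumann_entropy(rho)
S_obs = observational_entropy(rho, [M0, M1])
print(f"S_vN  = {S_vn:.4f} nats")
print(f"S_obs = {S_obs:.4f} nats")
assert S_obs >= S_vn - 1e-9               # known bound: S_obs >= S_vN

# Unitary (Hamiltonian) evolution leaves S_vN unchanged; S_obs may vary.
theta = 0.4
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
rho_t = U @ rho @ U.conj().T
print(f"S_vN after unitary  = {von_neumann_entropy(rho_t):.4f} (unchanged)")
print(f"S_obs after unitary = {observational_entropy(rho_t, [M0, M1]):.4f}")
```

Nothing here is specific to qubits: the same two functions work for any finite-dimensional state and POVM, which is all the generality the definition above requires.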
In a paper written in collaboration with Dom Safranek and Joe Schindler, and recently published in the New Journal of Physics, we delve into the mathematical properties and operational meaning of observational/macroscopic entropy, and discover some deep connections with the theory of approximate recovery and statistical retrodiction, a topic that keeps showing up in my recent work even when I am not looking for it from the start.
Is it perhaps because retrodiction really does play a central role in science? Or is it just me seeing retrodictive reasoning everywhere?