Arthur Eddington once famously wrote: “If your theory is found to be against the second law of thermodynamics, […] there is nothing for it but to collapse in deepest humiliation.” So why not take the second law as an axiom on which physical theories are built, rather than as a consequence to be tested? The problem is that, in its conventional thermodynamic formulation, the second law relies on a good deal of physics having already been introduced, and it is not clear how one could assume its validity before concepts like “work” or “heat” are even defined.
In a paper published today in Physical Review Research, we identify in von Neumann’s information engine the conceptual device that allows the second law to be discussed already at the early stages of the construction of a physical theory, when its most fundamental logical structures are being laid down. We find that, by concatenating two information engines in a closed cycle, the second law can be understood as the requirement that no information can be created from nothing, thus guaranteeing the internal logical consistency of the theory.
It is now widely accepted that Maxwell’s demon does not, in fact, break the second law of thermodynamics, if the energetic cost of resetting its memory is taken into account. This resolution of the paradox is known as Landauer’s principle.
In this work, we delve into the inner workings of Maxwell’s demon by considering how it behaves in the quantum error correction setting. We show that Landauer’s principle is only the limiting case of a more general triple trade-off relation between the thermodynamic, information-theoretic, and logical performances of the demon. For example, we show that for most measurements the demon can perform, extracting work above the Carnot limit is penalized by a drop in the error correction fidelity. Moreover, when the demon successfully performs perfect error correction, work extraction above the Carnot limit becomes impossible for most quantum measurements. Finally, we find that the amount of information the demon can extract about the error type is bounded from above by the heat dissipated during that process. Interestingly, this also gives physical meaning to negative values of this information gain.
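As a point of reference for the limiting case mentioned above, Landauer’s bound (the minimum work needed to erase one bit of memory at temperature T) is easy to evaluate numerically. A minimal sketch, with an illustrative room-temperature value not taken from the paper:

```python
import math

# Landauer's bound: erasing one bit at temperature T dissipates
# at least k_B * T * ln(2) of work into the environment.
k_B = 1.380649e-23  # Boltzmann constant in J/K (exact, 2019 SI)
T = 300.0           # illustrative room temperature in K

w_min = k_B * T * math.log(2)
print(f"{w_min:.3e} J")  # roughly 2.87e-21 J per bit
```

Tiny as this number is, it is the cost that rescues the second law from the demon, and it is exactly this limiting case that the triple trade-off relation generalizes.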
Position title and duration: postdoc or research assistant professor, depending on the candidate’s qualifications. The appointment is initially for one year, with the possibility of extensions up to a total duration of three years.
Description: this position is offered by the Quantum Information Theory Group within the collaboration “Extreme Universe”, headed by Prof. Tadashi Takayanagi (YITP, Kyoto University). This collaboration brings together Japan-based world-renowned researchers in quantum information theory, quantum gravity, cosmology, and condensed matter physics, with the aim of creating new and exciting bridges between these very active areas of research. The successful candidate will work in close contact with Prof. Francesco Buscemi (Nagoya University), but is also expected to interact with the rest of the collaboration and to participate in various interdisciplinary meetings within the project. The successful candidate is expected to commence their appointment in April 2022 or as soon as possible thereafter.
Requirements: applicants are expected to hold a PhD degree in a theoretical field related to quantum information sciences by the time they begin their appointment. Ideally, they will be familiar with recent ideas and techniques in quantum information theory and have at the same time strong interests in fundamental questions in theoretical physics.
Submission procedure: interested candidates should provide
a cover letter;
an up-to-date curriculum vitae;
a research statement;
an up-to-date list of research achievements (including published papers, preprints, talks, posters, etc.);
contact information for three references who can provide recommendation letters upon request.
Is there an observer-independent time reversal in physics? Only for Hamiltonian dynamics, we argue. Otherwise there is only retrodiction, which of course depends on the retrodictor’s (prior) beliefs. The paper (co-authored with Clive C. Aw and Valerio Scarani) is freely available from the AVS Quantum Science website, where it is highlighted as a Featured Article. Below is a talk I recently gave on these ideas.
I was recently invited to give an overview talk on Petz’s “transpose” or “recovery” map at the Workshop on Quantum Information and Quantum Black Holes, organized by Norihiro Iizuka (Osaka) and Tomonori Ugajin (Kyoto). I put together what I have come to learn about the properties of Petz’s transpose map, its uses in various areas of information theory and physics, and (most importantly!) its conceptual meaning. Slides are available here. Mark Wilde’s textbook contains a chapter entirely devoted to the technical aspects of Petz’s map and the theory of approximate recoverability.
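For the reader who has not met it before, and in my own notation (not necessarily that of the slides): given a quantum channel $\mathcal{N}$ and a reference state $\sigma$, Petz’s transpose map is usually written as

```latex
\mathcal{R}_{\sigma,\mathcal{N}}(\rho)
  = \sigma^{1/2}\,
    \mathcal{N}^{\dagger}\!\left(
      \mathcal{N}(\sigma)^{-1/2}\,\rho\,\mathcal{N}(\sigma)^{-1/2}
    \right)\,
    \sigma^{1/2},
```

where $\mathcal{N}^{\dagger}$ is the adjoint of the channel. By construction $\mathcal{R}_{\sigma,\mathcal{N}}$ exactly reverses the action of $\mathcal{N}$ on the reference state $\sigma$; the theory of approximate recoverability quantifies how well it performs on other states.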
“The law that entropy always increases holds, I think, the supreme position among the laws of Nature. If someone points out to you that your pet theory of the universe is in disagreement with Maxwell’s equations – then so much the worse for Maxwell’s equations. If it is found to be contradicted by observation—well, these experimentalists do bungle things sometimes. But if your theory is found to be against the Second Law of Thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation.”
― Arthur Eddington, The Nature of the Physical World, Chap. 4
But why is that so? Why is the Second Law so “special” among the other laws of physics?
Simply because, as we argue in a paper recently published in Physical Review E and freely available on the arXiv, the Second Law is not so much about physics as it is about logic and consistent reasoning. More precisely, we argue that the Second Law can be seen as the shadow of a deeper asymmetry that exists in statistical inference between prediction and retrodiction, one that is ultimately imposed by the consistency of the Bayes–Laplace rule.
A little bit of background. In the past two decades, thermodynamics has undergone unprecedented progress. This can be traced back to the development of stochastic thermodynamics, on the one hand, and the theory of nonequilibrium fluctuations, on the other. The latter, in particular, has shown that the Second Law emerges from a more fundamental “balance relation” between a physical process and its reverse. According to such a balance relation, for example, scrambled eggs are not forbidden to unscramble spontaneously; rather, the probability of such a process is extremely tiny compared with that of its more familiar reverse. In turn, entropy (the thing that “no one knows what it really is”, according to the apocryphal exchange between Shannon and von Neumann) is precisely a measure of this disparity.
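For readers unfamiliar with fluctuation relations, the “balance relation” alluded to above can be written, in its simplest form (notation mine), as a detailed fluctuation theorem for the entropy production $\sigma$:

```latex
\frac{P_{\mathrm{F}}(\sigma)}{P_{\mathrm{R}}(-\sigma)} = e^{\sigma}
\qquad\Longrightarrow\qquad
\langle e^{-\sigma} \rangle = 1
\qquad\Longrightarrow\qquad
\langle \sigma \rangle \ge 0,
```

where $P_{\mathrm{F}}$ and $P_{\mathrm{R}}$ are the probabilities of observing entropy production $\sigma$ in the forward process and $-\sigma$ in its reverse. The last step follows from Jensen’s inequality, and the resulting average statement $\langle \sigma \rangle \ge 0$ is the familiar form of the Second Law.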
In this paper we go one step further and show that this disparity is not due to some kind of “physical propensity” of irreversible processes to unfold in one direction more likely than in the opposite one (an explanation that would lead to a circular argument), but to the intrinsic asymmetry that exists between prediction and retrodiction in inferential logic. We thus conclude that the foundations of the Second Law are to be found not within physics, but one step below, at the level of logic.
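The asymmetry between prediction and retrodiction can be made concrete with a toy example: a two-state system evolving under a fixed stochastic matrix. The forward (predictive) channel is observer-independent, while the retrodictive channel obtained from the Bayes–Laplace rule explicitly depends on the chosen prior. A minimal numerical sketch (matrix and priors are my own illustration, not taken from the paper):

```python
import numpy as np

# Forward channel: P[y, x] = probability of output y given input x
P = np.array([[0.9, 0.2],
              [0.1, 0.8]])

def retrodict(P, prior):
    """Bayes-Laplace inverse of P with respect to a prior over inputs:
    R[x, y] = P[y, x] * prior[x] / sum_x' P[y, x'] * prior[x']."""
    p_out = P @ prior          # predictive distribution over outputs
    return (P * prior).T / p_out

R_uniform = retrodict(P, np.array([0.5, 0.5]))
R_biased = retrodict(P, np.array([0.9, 0.1]))

# Each column of a retrodictive channel is a normalized distribution over inputs...
print(np.allclose(R_uniform.sum(axis=0), 1.0))  # True
# ...but the channel itself changes with the prior: retrodiction is observer-dependent
print(np.allclose(R_uniform, R_biased))          # False
```

The forward matrix `P` is fixed once and for all, yet the two retrodictive channels differ: there is no prior-free way to run the inference backwards, and it is exactly this inferential asymmetry that we argue underlies the Second Law.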
A nice little piece written by CQT/NUS outreach is also available here.
One month ago I gave a colloquium at the 13th Annual Symposium of the Centre for Quantum Technologies (CQT) in Singapore. I decided to speak about my recent work on the role of Bayesian retrodiction in statistical mechanics — more precisely, in the conceptual foundations underlying fluctuation relations and the second law of thermodynamics.