Wolpert, David H.
One of the central concerns of computer science is how the resources needed to perform a given computation depend on that computation. Moreover, one of the major resource requirements of computers—ranging from biological cells to human brains to high-performance (engineered) computers—is the energy used to run them, i.e. the thermodynamic costs of running them. Those thermodynamic costs of performing a computation have been a long-standing focus of research in physics, going back (at least) to the early work of Landauer, in which he argued that the thermodynamic cost of erasing a bit in any physical system is at least kT ln 2, where k is Boltzmann's constant and T is the temperature of the system. One of the most prominent aspects of computers is that they are inherently non-equilibrium systems. However, the research by Landauer and co-workers was done when non-equilibrium statistical physics was still in its infancy, which required them to rely on equilibrium statistical physics. This limited the breadth of issues this early research could address, leading them to focus on the number of times a bit is erased during a computation—an issue with little bearing on the central concerns of computer science theorists. Since then there have been major breakthroughs in non-equilibrium statistical physics, leading in particular to the new subfield of 'stochastic thermodynamics'. These breakthroughs have allowed us to put the thermodynamic analysis of bit erasure on a fully formal (non-equilibrium) footing. They are also allowing us to investigate the myriad aspects of the relationship between statistical physics and computation, extending well beyond the issue of how much work is required to erase a bit. In this paper I review some of this recent work on the 'stochastic thermodynamics of computation'.
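As a quick numerical illustration of the Landauer bound kT ln 2 cited above (my own worked example, not part of the paper; the choice of room temperature T = 300 K is an assumption for illustration):

```python
import math

# Landauer bound: minimum heat that must be dissipated to erase one bit
# of information in a physical system held at temperature T.
k_B = 1.380649e-23  # Boltzmann's constant in J/K (exact SI value)
T = 300.0           # temperature in kelvin (illustrative room-temperature choice)

landauer_bound = k_B * T * math.log(2)  # joules per bit erased
print(f"Minimum cost to erase one bit at {T:.0f} K: {landauer_bound:.3e} J")
```

At room temperature this comes to roughly 3 × 10⁻²¹ J per bit, many orders of magnitude below the per-operation dissipation of present-day engineered computers.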
After reviewing the salient parts of information theory, computer science theory, and stochastic thermodynamics, I summarize what has been learned about the entropic costs of performing a broad range of computations, extending from bit erasure to loop-free circuits to logically reversible circuits to information ratchets to Turing machines. These results reveal new, challenging engineering problems concerning how to design computers so as to minimize their thermodynamic costs. They also allow us to start to combine computer science theory and stochastic thermodynamics at a foundational level, thereby expanding both.