Overview
We want to understand the fundamental limits on the energetic cost of computation.
The thermodynamic restrictions on all systems that perform computation pose major challenges to the modern design of computers. For example, at present ~5% of US energy consumption is used to run computers. Similarly, ~50% of the lifetime budget of a modern high-performance computing center goes to paying the energy bill. As another example, Google has placed some of its data centers next to rivers so that the river water can carry away the heat they generate. As a final example, one of the major challenges facing current efforts to build exascale computers is keeping them from melting.
The thermodynamic costs of performing computation also play a major role in biological computation. For example, ~20% of the calories burned by a human are used by the brain, an organ that does nothing but compute. This is a massive reproductive fitness penalty. Indeed, one of the leading candidates for the development in hominin evolution that allowed us to become Homo sapiens is the mastery of fire, since cooking meat and tubers released, for the first time, enough calories from our food sources to power our brains. Despite this huge fitness cost, however, no current theory of evolution accounts for the thermodynamic consequences of computation. It also seems that minimizing the thermodynamic costs of computation has been a major factor in evolution’s selection of the chemical networks in all biological organisms, not just hominin brains. In particular, simply dividing the rate of computation in the entire biosphere by the rate of free energy incident on the Earth in sunlight shows that the biosphere as a whole performs computation with a thermodynamic efficiency orders of magnitude greater than that of our current supercomputers.
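To make the structure of that last comparison explicit, here is a minimal back-of-envelope sketch in Python. All of the input figures are rough order-of-magnitude placeholders chosen for illustration (none are taken from the text above), so only the form of the calculation, a ratio of a computation rate to a free-energy rate, should be taken seriously.

```python
# Back-of-envelope version of the ratio described above: operations obtained
# per joule of free energy, for the biosphere versus a supercomputer.
# Every number here is an order-of-magnitude placeholder for illustration,
# not a measured value from the text.

solar_power_on_earth_w = 1.7e17      # W: total solar power intercepted by Earth (rough)
biosphere_ops_per_sec = 1e33         # assumed aggregate rate of molecular "operations" (placeholder)

supercomputer_power_w = 2e7          # W: rough power draw of an exascale machine (assumed)
supercomputer_ops_per_sec = 1e18     # operations per second (exascale, assumed)

biosphere_eff = biosphere_ops_per_sec / solar_power_on_earth_w
supercomputer_eff = supercomputer_ops_per_sec / supercomputer_power_w

print(f"biosphere:        ~{biosphere_eff:.1e} ops/J")
print(f"supercomputer:    ~{supercomputer_eff:.1e} ops/J")
print(f"efficiency ratio: ~{biosphere_eff / supercomputer_eff:.1e}")
```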
In the 1960s and 1970s, Rolf Landauer, Charlie Bennett, and collaborators performed the first, pioneering analyses of the fundamental thermodynamic cost of bit erasure, perhaps the simplest example of a computation. Unfortunately, their physics was semi-formal, initially involving no equations at all. When they did this work, nonequilibrium statistical physics was in its infancy, and they simply did not have the tools for a formal analysis of the thermodynamics of computation.
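For a concrete reference point, Landauer’s bound states that erasing one bit in a thermal environment at temperature T must dissipate at least k_B T ln 2 of heat. The short Python sketch below simply evaluates this bound; the room-temperature value and the assumed number of erasures are illustrative inputs, not figures tied to any particular machine.

```python
import math

# Boltzmann constant in joules per kelvin (CODATA value).
K_B = 1.380649e-23

def landauer_limit(temperature_kelvin: float) -> float:
    """Minimum heat (in joules) dissipated to erase one bit at the given temperature."""
    return K_B * temperature_kelvin * math.log(2)

# Illustrative inputs: room temperature and 10^15 bit erasures.
T = 300.0        # kelvin
n_bits = 1e15    # bits erased (assumed workload, for illustration only)

per_bit = landauer_limit(T)
print(f"Landauer limit at {T} K: {per_bit:.3e} J per bit")
print(f"Minimum heat for {n_bits:.0e} erasures: {per_bit * n_bits:.3e} J")
```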
Moreover, only a trivially small portion of computer science (CS) theory is concerned with the number of erasure operations needed to perform a given computation. At its core, much of CS theory is instead concerned with unavoidable resource/time tradeoffs in performing a computation. That is the basis of all computational complexity theory, of many approaches to characterizing the algorithmic power of different kinds of computers, and so on.
Fortunately, the last two decades have seen a revolution in nonequilibrium statistical physics. This has produced some extremely powerful tools for analyzing the fundamental thermodynamic properties of far-from-equilibrium dynamical systems, such as computers. These tools have already made clear that there are unavoidable thermodynamic tradeoffs in computation, in addition to the resource/time tradeoffs of conventional CS theory. These thermodynamic tradeoffs relate quantities such as the (physical) speed of a computation, its noise level, and whether the computational system is thermodynamically “tailored” to one specific environment or is general purpose. Interestingly, some of these tradeoffs are also addressed in modern computer engineering, for example in techniques such as approximate computing and adaptive slowing. However, that work is being done in an ad hoc manner, driven entirely by phenomenology.
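As one illustration of such a speed/dissipation tradeoff, a common result in stochastic thermodynamics is that for an optimal finite-time protocol the excess work above the Landauer bound decays roughly as 1/τ with the protocol duration τ. The sketch below assumes that scaling with a made-up coefficient, purely to show how the minimum cost of erasing a bit grows as the operation is run faster; it is not a model of any specific device.

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # temperature, K (illustrative)

LANDAUER = K_B * T * math.log(2)   # quasistatic lower bound per erased bit

def min_work_finite_time(tau_seconds: float, dissipation_coeff: float = 1e-22) -> float:
    """Illustrative minimum work to erase one bit in time tau.

    Assumes the common stochastic-thermodynamics scaling in which the excess
    dissipation of an optimal protocol decays as 1/tau; the coefficient is a
    made-up placeholder, not a measured device constant.
    """
    return LANDAUER + dissipation_coeff / tau_seconds

# Faster erasure (smaller tau) forces more dissipation above the Landauer bound.
for tau in (1e-3, 1e-1, 1e1):
    w = min_work_finite_time(tau)
    print(f"tau = {tau:8.1e} s  ->  W_min = {w:.3e} J  (excess over Landauer: {w - LANDAUER:.3e} J)")
```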
As a result, the time is ripe to pursue a new field of science and engineering: a modern thermodynamics of computation. This would combine the resource/time tradeoffs of concern in conventional CS with the thermodynamic tradeoffs in computation that are now being revealed. In this way we should be able to develop the tools necessary both for analyzing thermodynamic costs in biological systems and for engineering next-generation computers.