D. Smith

Paper #: 06-03-011

An idea from computer science known as Landauer's principle provides a useful way to think about the relation between metabolic energy and information in biological systems. Landauer's principle relates the Shannon entropy of data streams to the thermal entropy of heat flows, and asserts that a reduction of the entropy of a data stream (as by a computation) can be accomplished only by rejecting at least as much entropy to the environment, in the form of heat produced from work performed by the computer. Entropy reduction from the creation of ordered chemical ensembles by metabolism, structured by natural selection, corresponds to data entropy reduction in computer science. A formally identical Landauer principle relates the metabolically instilled chemical information to the heat rejected to the environment and to the chemical work that cells must perform. The Landauer bound is attained when metabolism operates reversibly, an idealization approached in real cells only to the degree that enzymes can separate the timescales of the reactions they catalyze from those of the corresponding uncatalyzed reactions. Metabolism in the reversible limit can be organized around cycles isomorphic to the Carnot cycle for heat engines and refrigerators: particles transported across differences in chemical potential by metabolism take the place of entropy transported across differences in temperature by a Carnot engine or refrigerator. Chemical "Carnot cycles" are the elementary computational steps of metabolism, and when their rates are structured by natural selection, Landauer's principle relates the total work consumed to the Kullback-Leibler divergence of the chemical system from its Gibbsian thermodynamic equilibrium. Extending this argument to include simple models of dissipation yields a relation between the structure of the selective fitness function, the information maintained in biomass, and the metabolic work required to maintain that information against dissipative degradation.
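For concreteness, a minimal statement of the two bounds invoked above, written in standard notation; the symbols are illustrative and are not taken from the paper itself. Reducing the Shannon entropy of a data stream by \(\Delta S_{\mathrm{data}}\) (measured in nats) requires rejecting heat
\[
Q \;\ge\; k_B T \,\Delta S_{\mathrm{data}},
\]
and the minimum work needed to drive a chemical ensemble from its Gibbs equilibrium distribution \(p^{\mathrm{eq}}\) to a distribution \(p\) is
\[
W \;\ge\; k_B T \, D_{\mathrm{KL}}\!\left(p \,\middle\|\, p^{\mathrm{eq}}\right)
  \;=\; k_B T \sum_i p_i \ln\frac{p_i}{p_i^{\mathrm{eq}}},
\]
with equality attained only in the reversible limit.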
