Scientists have long sought new insights about plants, animals, and entire ecosystems by viewing them as living computers: they take in energy from the sun to process information and solve complex problems – where to find food, how to avoid predators, or even (in the case of human animals) how to build computers.
During a workshop this week at SFI on statistical physics, information processing, and biology, researchers plan to take the first steps toward turning this intriguing but semiformal notion into a quantitative science.
“People have been saying forever that living systems are computational systems,” says workshop co-organizer and SFI Professor David Wolpert. But the idea hasn’t gone beyond a few high-level papers, in large part because until recently physics didn’t have the tools to analyze living things as computers.
The problem is a subtle one. To study computation and information processing, researchers often turn to statistical mechanics, a theory originally built to understand the microscopic processes underlying, well, steam engines.
But traditional statistical mechanics doesn’t really apply to biology – it’s an equilibrium theory, and biology is anything but an equilibrium phenomenon, notes Wolpert. Indeed, the main thing plants and animals do is disturb Nature’s equilibrium by taking in food and sunlight and turning them into ever more complex biological structures.
Fortunately, Wolpert says, there’s been an explosion in the last two decades in non-equilibrium statistical physics – meaning researchers interested in biological computation might finally have the tools they need to develop their ideas quantitatively.
With that in mind, Wolpert and members of SFI’s resident and external faculty will bring together a multidisciplinary group of experts to begin building what Wolpert describes as a “completely new science” that draws heavily from physics, biology, and computer science.
To narrow down their challenge some, the group will focus on three questions: how much energy does biological computation require, how much energy does the evolution of biological computers require, and how has the fraction of sunlight’s energy that the biosphere actually uses changed over time.
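The article doesn’t quantify the first of these questions, but a standard physical reference point is Landauer’s principle, which sets a lower bound of kT ln 2 on the energy dissipated each time one bit of information is erased. A minimal sketch of that arithmetic (the function name and the choice of body temperature are illustrative, not drawn from the workshop itself):

```python
import math

# Boltzmann constant in joules per kelvin (exact SI value since the 2019 redefinition)
K_B = 1.380649e-23

def landauer_bound(temperature_kelvin: float) -> float:
    """Minimum energy (in joules) dissipated to erase one bit of
    information at the given temperature, per Landauer's principle:
    E >= k_B * T * ln(2)."""
    return K_B * temperature_kelvin * math.log(2)

# At mammalian body temperature (~310 K), erasing a single bit
# costs at least about 3e-21 joules.
e_bit = landauer_bound(310.0)
print(f"{e_bit:.3e} J per bit erased")
```

Real biological processes dissipate vastly more than this floor, which is part of what makes the workshop’s question interesting: how close does biology come to the thermodynamic limit?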
Although still a nascent exploration, Wolpert says, a “thermodynamics of biological computation” could have wide-ranging implications across other sciences. It could, for example, aid computer scientists hoping to develop more efficient supercomputers, or help biologists understand how the human brain – a computer with enormous energy requirements – could have evolved.