For many of the complex systems we care the most about, “simply relying on the fact that computers are bigger and faster isn’t enough,” says SFI Professor Cris Moore.

Intel co-founder Gordon Moore’s famous law, which predicts a doubling of computing power roughly every two years, pales next to the astronomically greater speedups achieved through smarter algorithms, he says.

An SFI workshop in August for physicists and computer scientists sought to prompt collaboration on new algorithms for solving problems and modeling nature.

A primary source of smarter algorithms is statistical physics. Physicists use their insights into a system’s structure to design algorithms and collect empirical evidence of their validity. Of particular interest are algorithms that “skip over the steps that nature takes,” as Cris Moore puts it, bypassing what’s called “critical slowing down.” This phenomenon occurs in many natural systems near phase transitions: water, for example, takes a long time to come fully to a boil even though heating brings it to the boiling temperature relatively quickly.
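The article doesn’t name a specific method, but a textbook example of an algorithm that “skips over the steps nature takes” is the Wolff cluster algorithm for the Ising model of magnetism: instead of flipping one spin at a time, as a physical system roughly does, it flips whole correlated clusters in a single move, largely eliminating critical slowing down near the phase transition. A minimal sketch in Python (the lattice representation and the name `wolff_step` are my own choices, not from the article):

```python
import math
import random

def wolff_step(spins, L, beta):
    """One Wolff cluster update on an L x L Ising lattice with periodic
    boundaries. Grows a cluster of aligned spins, adding each aligned
    neighbor with probability p_add = 1 - exp(-2*beta), then flips the
    entire cluster at once. Returns the cluster size."""
    p_add = 1.0 - math.exp(-2.0 * beta)
    seed = (random.randrange(L), random.randrange(L))
    s0 = spins[seed]
    cluster = {seed}
    frontier = [seed]
    while frontier:
        x, y = frontier.pop()
        # Visit the four nearest neighbors (periodic wrap-around).
        for n in (((x + 1) % L, y), ((x - 1) % L, y),
                  (x, (y + 1) % L), (x, (y - 1) % L)):
            if n not in cluster and spins[n] == s0 and random.random() < p_add:
                cluster.add(n)
                frontier.append(n)
    for site in cluster:
        spins[site] = -s0  # flip the whole cluster in one move
    return len(cluster)

if __name__ == "__main__":
    # At the 2D critical temperature, beta_c = ln(1 + sqrt(2)) / 2,
    # clusters span large regions, so a single update decorrelates
    # the system far faster than single-spin-flip dynamics.
    L = 32
    beta_c = math.log(1 + math.sqrt(2)) / 2
    spins = {(x, y): random.choice((-1, 1)) for x in range(L) for y in range(L)}
    sizes = [wolff_step(spins, L, beta_c) for _ in range(200)]
    print("mean cluster size:", sum(sizes) / len(sizes))
```

At high temperature (small `beta`) the flip probability `p_add` shrinks and clusters stay tiny, recovering single-spin behavior; near the critical point the clusters become large, which is exactly where naive local updates slow to a crawl.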

“So the challenge for theoretical computer scientists,” says Moore, “is to analyze the physicists’ algorithms rigorously, understand why they are so much faster than traditional algorithms – and if possible prove that they are – or delineate the circumstances under which these algorithms work.”

During the workshop, that challenge made for an intense two-way flow of ideas between the physicists with their intuition and algorithmic experience and the computer scientists who would like to prove theorems about these smarter algorithms.

“We’re trying to strengthen the bridges between physicists and computer scientists and have them learn as much from one another as they can,” Moore says. “Participants walked out of here both with new algorithms to try and new conjectures to prove.”
