What is an individual? Information theory may provide the answer

Despite the near-universal assumption of individuality in biology, there is little agreement about what individuals are and few rigorous quantitative methods for their identification. A new approach may solve the problem by defining individuals in terms of informational processes.


STAT: Misinformation is important public health data

In their op-ed for STAT, former SFI postdoctoral fellow Laurent Hébert-Dufresne (University of Vermont) and Vicky Chuqiao Yang, SFI Complexity Postdoctoral Fellow and Peters Hurst Scholar, argue that if scientists hope to build better epidemiological models, they must grasp the complex interplay between social behavior and disease.


Decarbonizing the energy supply

Shifting from carbon-emitting energy sources to renewable ones will be an essential part of addressing climate change, but the path to a renewable power grid is uncharted. A February 26-28 working group explores how New Mexico might best approach the transition to renewable energy sources, and what lessons could be useful for other regions.


Wealth inequality and social network structure

An NSF-funded research project is exploring the effects of network structure on wealth inequality. In February, more than 40 anthropologists, economists, and other researchers will review their findings so far and chart new directions.


If cancer were easy, every cell would do it

A new Scientific Reports paper puts an evolutionary twist on a classic question. Instead of asking why we get cancer, Leonardo Oña of Osnabrück University and Michael Lachmann of the Santa Fe Institute use signaling theory to explore how our bodies have evolved to keep us from getting more cancer.


Video: Copying vs. Transforming Information

New research by SFI Postdoctoral Fellow Artemy Kolchinsky and Bernat Corominas-Murtra presents an important distinction for information theory: copying versus transforming. Watch the video explainer.


Learning by omission

Recent research by SFI's Artemy Kolchinsky, Brendan Tracey, and David Wolpert asks what would happen if neural networks were explicitly trained to discard useless information, and how they could be told to do so.

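The intuition behind discarding useless information can be made concrete with a toy, information-bottleneck-style calculation. The sketch below is a hypothetical illustration, not the authors' actual method: the input X contains a bit relevant to the target Y and a pure noise bit, and a representation Z that omits the noise bit predicts Y just as well while costing half as much to store.

```python
# Toy sketch of "learning by omission": a representation Z of input X
# should keep what predicts the target Y and discard the rest.
# Hypothetical example; not the method from the paper.
from itertools import product
from math import log2

def mutual_information(joint):
    """I(A;B) in bits for a joint distribution {(a, b): p}."""
    pa, pb = {}, {}
    for (a, b), p in joint.items():
        pa[a] = pa.get(a, 0.0) + p
        pb[b] = pb.get(b, 0.0) + p
    return sum(p * log2(p / (pa[a] * pb[b]))
               for (a, b), p in joint.items() if p > 0)

def entropy(dist):
    """H in bits for a distribution {z: p}."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

def marginal_z(joint):
    """Marginal over the first coordinate of a joint {(z, y): p}."""
    pz = {}
    for (z, _), p in joint.items():
        pz[z] = pz.get(z, 0.0) + p
    return pz

# Input X = (relevant bit r, noise bit n), uniform; target Y = r.
p_xy = {((r, n), r): 0.25 for r, n in product((0, 1), repeat=2)}

# Representation A keeps the whole input: Z = X.
keep_all = p_xy
# Representation B omits the useless noise bit: Z = r.
omit_noise = {}
for ((r, n), y), p in p_xy.items():
    omit_noise[(r, y)] = omit_noise.get((r, y), 0.0) + p

print(mutual_information(keep_all))     # I(Z;Y) = 1.0 bit
print(mutual_information(omit_noise))   # still 1.0 bit: nothing relevant lost
print(entropy(marginal_z(keep_all)))    # Z costs 2.0 bits to store
print(entropy(marginal_z(omit_noise)))  # only 1.0 bit after the omission
```

Omitting the noise bit leaves the predictive information about Y untouched while halving the entropy of the representation, which is the trade-off such training objectives aim to exploit.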