Lecture 1: More perfect than we imagined: A physicist’s view of life (4 Jan 2016, 4:00 PM)
(for a general audience)
Sitting in a quiet room, we can hear sounds that cause our eardrums to vibrate by less than the diameter of an atom. When bacteria have to decide if they are swimming in the right direction to find more food, they count every single molecule that arrives at their surface. In these examples, and many more, evolution has selected for mechanisms that operate near the limits of what is allowed by the laws of physics. This lecture will give a tour of these beautiful phenomena, from microscopic events inside a developing embryo to our own perception and decision making. While there are many ways to build a biological system that might work, there are many fewer ways to build one that can approach the physical limits. Perhaps, out of its complexity, life emerges as simpler, and more perfect, than we imagined.
Lecture 2: Statistical mechanics for real biological networks (5 Jan 2016, 4:00 PM)
Many of life’s most fascinating phenomena emerge from interactions among many elements – many amino acids determine the structure of a single protein, many genes determine the fate of a cell, many neurons are involved in shaping our thoughts and memories. Physicists have long hoped that these collective behaviors could be described using the ideas and methods of statistical mechanics. In the past few years, new, larger-scale experiments have made it possible to construct statistical mechanics models of biological systems directly from real data. I’ll describe the surprising successes of this “inverse” approach, using examples from families of proteins, networks of neurons, and flocks of birds. Remarkably, in all these cases the models that emerge from the data are poised at a very special point in their parameter space – a critical point. This suggests there may be some deeper theoretical principle behind the behavior of these diverse systems.
Lecture 3: Optimization principles and information flow in biological networks (6 Jan 2016, 4:00 PM)
Life depends as much on the flow of information as on the flow of energy. Given finite physical resources, there are limits on information transmission, and hence getting more information can be costly. Many biological systems seem to operate in a regime where these costs are significant, so there is pressure to encode, transmit, and process information efficiently. I’ll present examples of these ideas, drawn both from neural systems and from developmental biology. We’ll see that there is direct evidence for the optimization of information transmission, and that when we promote this optimization to a theoretical principle we can derive behaviors of these networks that agree with experiment. I’ll also explore some of the many open questions in this line of research, most importantly: how do we define the information that is “relevant” for the organism?
These lectures are part of the series “Information processing in biological systems.”