Christopher Wren’s engraving of the lower side of the brain from Thomas Willis’s Cerebri Anatome, 1664

Brains are a lot like computers. Both transmit information, transform information (i.e., “solve problems”), store memories, and use circuits that require energy to achieve their functions. Unfortunately, we don’t have detailed schematics for the brain the way we do for the computer you’re reading this on, and its “algorithms” — the step-by-step logical processes underlying its large-scale functions — are largely a black box. Simply put, we don’t know how the brain computes. In July, “Dynamics of the Off-Equilibrium Brain,” a virtual workshop led by SFI Professor David Wolpert and University of Pennsylvania Professor Vijay Balasubramanian, began to investigate new ways of understanding how the brain computes using newly developed ideas in thermodynamics and information theory.

Why thermodynamics and information theory? Brains must process tremendous amounts of information while using resources such as energy efficiently. “Your brain is amazingly more thermodynamically efficient than computers,” Wolpert says. Unlocking the relationship between energy and information in the brain could help researchers figure out how that computation actually works. In particular, the workshop focused on the fact that the brain is not a static system, as classical theoretical frameworks often assume. Instead, it is an inherently dynamic system.

The workshop was held over two weeks with a daily talk and freeform discussions by Zoom. Attendees and speakers logged in from around the world to spend about three and a half hours each day talking about brains. Participants had backgrounds in neuroscience and physics, and spanned many career stages — their ages ranged from 18 to 80. Participants presented some new research, but dedicated much of their time to figuring out how to even formulate questions for future inquiry.

“Normal workshops and conferences just don't fit the bill — you come and report stuff you've already done,” says Balasubramanian. “I felt that this workshop was fundamentally different. The entire two weeks was about trying to figure out how to express new concepts that we didn't previously know how to formalize.”

One fundamental question discussed at the workshop: What exactly is computation in the brain? According to Balasubramanian, much of what classical neuroscience calls computation is actually communication — sending information from here to there. But as Wolpert says, “If all the brain did was transmit information accurately, you’d do well to scoop out all the goo in your skull and replace it with fiber optic.” Computation, on the other hand, requires transforming information.

That’s where non-equilibrium thermodynamics comes in. While every part of the body spends energy on housekeeping tasks like synthesizing new proteins, the brain’s distinctive job is to compute. Determining constraints on the energy used to perform that computation could provide answers to questions such as why humans have as many neurons as we do — 86 billion, on average. Applying know-how from thermodynamics about noise and entropy, the measure of disorder in a system, could help formulate better theories about how the brain does its job.
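For a rough sense of the kind of energy constraint at stake (an illustrative aside, not a result from the workshop), Landauer’s principle from thermodynamics sets a floor on the heat that any physical device must dissipate to erase a single bit of information at temperature $T$:

$$E_{\min} = k_B T \ln 2 \approx 3 \times 10^{-21}\ \mathrm{J} \quad \text{at body temperature, } T \approx 310\ \mathrm{K},$$

where $k_B$ is Boltzmann’s constant. Real neural signaling dissipates many orders of magnitude more energy per operation than this theoretical floor, which is part of why the brain’s efficiency, impressive as it is next to silicon, remains an open quantitative question.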

Another goal of the workshop was to delve more explicitly into the brain-computer analogy. Neuroscientists talk a lot about circuits at various scales, but the discussions often fall short of explaining holistically how the brain functions as a computer. “We have circuit diagrams of which brain areas talk to which brain areas, and what's going on within each brain area—but I challenge you to find one person on the planet who can explain to you in detail how your behaviors arise from those divisions,” Balasubramanian says. By framing questions with additional computer science concepts like “architectures” — the rules that bridge the gap between hardware and software — and “algorithms,” he and Wolpert hope to push inquiry further.

Part of the reason classical neuroscience simply hasn’t asked some of these questions is that the tools to measure the activity of neurons have been limited, so the inquiry wasn’t feasible. Only recently have experimentalists been able to record data from large numbers of neurons at the same time. It’s at these intermediate scales, above individual neurons and below entire regions of the brain, that the processes are murkiest, and where the workshop’s participants centered their discussions.

Following the workshop, attendees have split off into smaller working groups to tackle problems raised during the sessions. They hope to eventually produce theories of the brain that experimentalists can check against reality.

Read more about the workshop, “Dynamics of the Off-Equilibrium Brain: Information Processing and Energy” (July 7-16, 2021).

 

[By Daniel Garisto]