In the space inside a computer chip, where electricity becomes information, there’s a scientific frontier. The same frontier can be found inside a cell, where information instead takes the form of chemical concentrations. Recent breakthroughs in the field of nonequilibrium statistical physics have revealed vast areas of research lying hidden within the “thermodynamics of computation.” Advances in this field, which involves elements of statistical physics, computer science, cellular biology, and possibly even neurobiology, could have far-reaching consequences for how we understand, and engineer, our computers. To kick-start this line of research, Santa Fe Institute scientists and their collaborators have launched an online wiki for collaboration. This week they also published a paper that neatly summarizes recent advances and open questions that pertain to thermodynamics and computation.
“The thermodynamic restrictions on all systems that perform computation provide major challenges to the modern design of computers,” the researchers write in the opening paragraph of the wiki, designed to “serve as a hub and a gathering place for all who are interested.” They go on to outline the magnitude of energy consumed by computers and the engineering challenges that result when a portion of that energy is lost as waste heat. The wiki also compares natural computations, performed by cells or human brains, to artificial computations, which are markedly less efficient.
The research picks up on work by Rolf Landauer, who in 1961 postulated that erasing a single bit of information, a 1 or a 0, necessarily dissipates a minimum amount of energy as heat. Landauer’s insight is well known to computer scientists and has led to an informal maxim to avoid bit erasure whenever possible.
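Landauer’s bound is easy to state numerically. As a rough illustration (ours, not the paper’s), the minimal Python sketch below evaluates the bound k_B T ln 2 at an assumed room temperature of 300 kelvin.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, joules per kelvin
T = 300.0           # assumed room temperature, kelvin

# Landauer's bound: erasing one bit dissipates at least k_B * T * ln(2) of heat.
landauer_bound = k_B * T * math.log(2)
print(f"Minimum heat per erased bit at {T:.0f} K: {landauer_bound:.2e} joules")
# prints roughly 2.87e-21 joules, far below what today's logic gates actually dissipate
```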
Going beyond Landauer’s cost, the new paper tries to convey that “there’s more to the thermodynamics of computation than just bit erasure,” says co-author Joshua Grochow of the University of Colorado Boulder. The paper, published in the computer science newsletter SIGACT News, presents additional factors that could affect how energy flows in and out of atoms during a computation.
To reach other scientists who might be interested in pursuing a thermodynamics of computation, Grochow and co-author David Wolpert of the Santa Fe Institute catalog some of the new tools from statistical physics that apply to nonequilibrium systems — like computers.
“Part of what we are trying to do with this paper is package up the lessons from nonequilibrium statistical [physics] over the past 20 years in a way that makes clear what the new computational questions are,” Grochow explains. He hopes that by presenting what is now known about the relationship between thermodynamics and the microscopic processes that occur during computation, the paper will “entice computer scientists to work on a new generation of questions.”
One of these questions involves how to thermodynamically “tune” computers to the inputs they are most likely to encounter. Grochow gives the example of a calculator that is thermodynamically optimized for random 32-bit string inputs (equivalent to a decimal value of 10 digits). The majority of human users don’t enter inputs that require any of the higher bits. If the calculator were re-engineered to “expect” fewer than 32 bits, would it waste less energy in the form of heat?
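As a toy illustration of that mismatch (our hypothetical numbers, not Grochow’s), the short sketch below counts how many bits a handful of everyday calculator entries actually occupy out of the 32 the hardware is prepared to handle.

```python
# Toy illustration: how many of a calculator's 32 input bits do everyday numbers use?
# The sample entries below are made up for illustration, not data from the paper.
everyday_entries = [7, 42, 365, 1999, 80000]

for n in everyday_entries:
    print(f"{n:>6} uses {n.bit_length():2d} of 32 bits")

# Typical entries occupy well under half of the available bits, the kind of skewed
# input distribution a thermodynamically "tuned" design might exploit.
```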
Beyond the accuracy of a computation, Grochow says the amount of memory a computation requires and how long the computation takes are other aspects that could affect its thermodynamic efficiency.
Wolpert hopes their research will expand to incorporate other recent breakthroughs from statistical physics, like the Jarzynski equality. This equation provides a probabilistic bridge between the macro-scale world, where entropy can only increase, and the micro-scale world, where entropy can briefly fluctuate downward. Some computer transistors are already small enough to sit between these macro and micro scales.
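For readers who want the statement itself, the equality relates the fluctuating work W done while driving a small system between two states to the equilibrium free-energy difference ΔF between those states; averaging and applying Jensen’s inequality recovers the familiar second-law statement on average:

```latex
\left\langle e^{-\beta W} \right\rangle = e^{-\beta \Delta F},
\qquad \beta = \frac{1}{k_B T},
\qquad \text{hence by Jensen's inequality} \qquad
\langle W \rangle \ge \Delta F .
```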
“We’re extending computer science theory, which was originally motivated by real-world systems, to other aspects of those systems that it never even knew to think about before,” says Wolpert.
The theory could lead to engineering advances that would enable cooler, more powerful machines, like exascale computers and even tiny swarm robots. It could also impact the sustainability of computing technology.
“Computers now use a non-trivial fraction of energy in first world countries,” says Grochow. “Given that computing is going to continue to grow, reducing the energy they consume is hugely important to reducing our overall energy footprint.”
Read “Beyond number of bit erasures: New complexity questions raised by recently discovered thermodynamic costs of computation” in SIGACT News. (June 21, 2018)
Visit the Thermodynamics of Computation Wiki for reference materials, upcoming events, job openings, and a discussion forum.