An SFI-authored paper, “Nonlinear Information Bottleneck,” is one of four winners of the 2021 Entropy Best Paper Awards, selected from among all articles the journal published in 2019.

In the paper, SFI Program Postdoctoral Fellow Artemy Kolchinsky, then-Postdoctoral Fellow Brendan Tracey (now at DeepMind), and Professor David Wolpert explored what happens when artificial neural networks are explicitly trained to discard useless information, drawing on a novel estimator developed by Kolchinsky and Tracey.

Said Kolchinsky in 2019, “It may be that deep learning networks succeed because of what they learn to ignore, not just what they learn to predict. So we ask: what happens if we explicitly encourage a network to forget irrelevant information?”
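The trade-off Kolchinsky describes is captured by the standard information-bottleneck objective (the general formulation, not quoted from the article): given input $X$, target $Y$, and a learned representation $Z$, the network is trained to

$$
\min_{p(z \mid x)} \; I(X;Z) \;-\; \beta \, I(Y;Z),
$$

where $I(\cdot;\cdot)$ denotes mutual information. The first term penalizes keeping information about the input, encouraging the network to “forget,” while the second rewards retaining information that predicts the target; the parameter $\beta$ sets the balance between compression and prediction.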

Tracey explained that the motivation for the paper was to “make predictions using data from a bandwidth-limited environment,” such as “a satellite in space, or a remote weather station in Antarctica. You can’t send back all of the data you collect, so which pieces of data are the right data to transmit?”

The selection committee, chaired by Entropy Editor-in-Chief Kevin H. Knuth, judged papers on scientific rigor, significance, and originality. The award comes with a cash prize of 500 Swiss francs, as well as the opportunity to publish an additional open-access paper in Entropy free of charge.

Read the paper, “Nonlinear Information Bottleneck,” in Entropy (November 30, 2019).

Read the official announcement of the 2021 Best Paper Awards.