What is complexity? Way back in 1989, SFI hosted a workshop that framed how scientists have thought about this question ever since – but it didn’t come to a definitive answer, according to SFI External Professor Jim Crutchfield.

“One keeps hearing that there are so many measures of complexity,” Jim says. “People seem to have the idea that it’s just all such a mess, but it’s not!”

In fact, he says, there are now only a couple of key definitions of complexity, and they're closely related: one from information theory, which says that a complex system is one whose key features can't be easily described, and one from computational complexity theory, which says that a complex system is one that can't be easily simulated.
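To give the "can't be easily described" definition a rough, concrete shape, here is a minimal Python sketch (an illustration, not anything from the workshop): it uses compressed length as a computable proxy for description length. The quantity it gestures at, Kolmogorov complexity, is uncomputable, and this toy ignores the subtlety that pure randomness is maximally incompressible yet structurally simple, a distinction measures like Crutchfield's statistical complexity are built to capture.

```python
import os
import zlib

def description_cost(data: bytes) -> int:
    """Bytes needed for a compressed description of `data`.

    Compressed length is a crude, computable stand-in for description
    length; true Kolmogorov complexity is uncomputable.
    """
    return len(zlib.compress(data, level=9))

periodic = b"ab" * 5000    # highly structured: admits a short description
noisy = os.urandom(10000)  # typically incompressible: no short description

print(description_cost(periodic))  # small: the repeating pattern compresses well
print(description_cost(noisy))     # near 10000: little structure to exploit
```

On this crude yardstick the random string looks maximally "complex," which is exactly the kind of wrinkle that keeps the definitions interesting: hard to describe is not the same as richly structured.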

Although theories of complexity have advanced, their applications have been less convincing. Jim and SFI External Professor Jon Machta believe a deeper understanding of complexity is the key to new applications: circumventing the anticipated end of Moore's law in technology development, understanding the biological basis of computation in the brain, and explaining how patterns spontaneously emerge.

So the two are sponsoring a workshop, "Randomness, Structure, and Causality," in January, both to take stock of the current understanding of complexity and to extend it into new domains.

“A new level of rigor has emerged over the last two decades in the science of complex systems,” Jim says. “The time is right for the practically motivated measures of the experimentalists and the fundamental principles of the theoreticians to come together and enrich one another.”