To understand complex phenomena — to untangle the ways multiple variables act on a system and identify the mechanisms underlying a behavior — scientists build statistical models. But first, they must decide which details to include in their models, and which to leave out. This level of fine- or coarse-graining differs across disciplines and depends on the specific research question being addressed. For instance, a doctor can infer that smoking led to a patient’s lung cancer, and can choose a course of action accordingly, without documenting every time the patient picked up a cigarette. Research into, say, quantum physics, however, requires much finer granularity. “When we face chemical, biological, or economic choices, information about the quantum-level details of the relevant systems may not be worth anything to us as agents. Therefore, this information can be left out of our explanatory models,” says David Kinney, who is completing his Ph.D. in the philosophy of science and formal epistemology at the London School of Economics.

Kinney’s research also looks for better ways to graphically express probabilities and uncertainty when modeling high-dimensional systems, including interval-valued probabilities and algorithmic measures of complexity.
Kinney plans to join SFI as an Omidyar Fellow in September 2019.