(Image credit: Alain Houle, via authors)

When we choose fruit from a grocery store shelf, we use sight, smell, and touch to pick the most appealing selections. We know that a too-firm, unripe pear won’t be as tasty, and we know to leave behind the over-ripe avocado that feels mushy under light pressure. The need to identify ripe fruit may have helped our primate ancestors develop fine motor skills in the first place, suggests a new paper published in Interface Focus, co-authored by SFI Omidyar Fellow alum Justin Yeakel.

Figs are a critical, keystone food source for some 1,200 vertebrate species in tropical forests around the world. In Kibale National Park, Uganda, figs are a fallback food for chimpanzees when more favored fruits are scarce. Fig trees bear fruit asynchronously, so a single tree may carry fruits along a continuum of ripeness at any given time. For about 27 percent of fig species worldwide, the fruits remain green even as they ripen. When feeding on these green fig species, foraging chimpanzees must rely on senses other than sight to select their food.

To select the tastiest green figs (those with high fructose content, the authors’ analysis revealed), chimpanzees use a series of sensory assessments. Before smelling or biting into a green fruit, they use manual palpation, squeezing the fruit between thumb and forefinger, to gauge its desirability.

“The deliberate and methodical nature of the behavior is conspicuous to human observers in part because it is so familiar,” write the authors. Palpation is a quick way to judge a fruit's ripeness, and thus its caloric value, and may have exerted evolutionary selective pressure on our early ancestors to develop fine motor skills.

Were hands the first “tools” that primates used? It’s tempting to take that view, write the authors: “The advantage of this outlook is that it offers a fresh perspective on the evolution of skilled forelimb movements. Tool use is perhaps best viewed as the exaptation of a hand that was itself a tool for evaluating cryptic foods.”

Read the paper in the Royal Society's Interface Focus (April 22, 2016)