Illustration by Paul Hostetler
A biological anthropologist has upended everything I thought I knew about calories.
If there’s one thing we should all understand, it’s calories. They’re listed on every package of Fritos, they’re the reason why grocery-store endcaps are stacked high with Coke Zero and Diet Pepsi, and they explain why so many of us are overweight.
Calories are always on our minds. We are constantly watching them, counting them, having a cheat-day with them, or otherwise interacting with them in ways that imply some level of understanding about them.
I’ve heard some conventional wisdom about counting calories challenged (especially in this magazine). But I’ve also noticed that for every person who buys whole foods because he or she has learned that the quality of calories matters more than the quantity, there are three more in the snack aisle buying 100-calorie packs of Cheez-It Snack Mix because they believe a calorie is just a unit of measurement, like inches or grams. Calories are not timeless, valueless, agnostic things like miles or degrees Fahrenheit. And not all calories are created equal.
I was reminded — in a rather roundabout way — of this unconventional wisdom recently while reading Catching Fire: How Cooking Made Us Human by biological anthropologist Richard Wrangham, PhD. He argues that the great leap forward that made us human, rather than great apes sleeping in trees, came when we began to cook our food.
Wrangham, now a professor at Harvard, was once an assistant to pioneering primate researcher Jane Goodall, and he spent years in the jungle studying chimps. At one point he tried to eat what they ate, living on jungle figs and leaves. He couldn’t do it. The figs were bitter and rock hard, and tree leaves were… well, not people food. He asked pygmies, who were native to the region, whether they could eat what the chimps ate. Never, they said. Not even in times of famine.
This evolutionary disconnect vexed Wrangham for years. We share many biological traits with our primate ancestors, and yet we could not survive on their raw-food diet. Conventional wisdom told him that 100 calories of tree leaves should deliver the same energy punch as 100 calories of roasted pig. But if that were the case, there would have been no evolutionary reason for humans to begin cooking food.
So Wrangham set out to explore the history of the calorie. What he discovered was a lot of outmoded thinking that needed an upgrade.
The concept of calories first surfaced in the late 18th century. The French chemist Antoine-Laurent Lavoisier theorized that there was an invisible source of energy responsible for creating heat, and he decided a derivative of the Latin term for heat, “caloric,” would describe that substance. Just as electricity was an invisible substance creating lightning, caloric would be an invisible substance creating heat.
To measure that substance, French scientist Pierre Eugène Marcellin Berthelot invented the bomb calorimeter in 1881: He put food in a chamber immersed in water, ignited the compressed oxygen in the chamber to set the food on fire, and measured how much it heated the water.
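The measurement Berthelot's device makes rests on a simple definition: one kilocalorie is the heat needed to raise one kilogram of water by one degree Celsius. As a rough sketch (not from the book, and ignoring real-world corrections like heat loss and the chamber's own heat capacity), the arithmetic looks like this:

```python
# A bomb calorimeter infers a food's energy from how much the burning
# sample heats the surrounding water. By definition, 1 kcal raises
# 1 kg of water by 1 degree Celsius.
def kcal_from_calorimeter(water_kg: float, temp_rise_c: float) -> float:
    """Estimated food energy in kcal; a simplification that ignores heat
    losses and the calorimeter's own heat capacity (real instruments are
    calibrated for both)."""
    return water_kg * temp_rise_c

# A sample that warms 2 kg of water by 50 degrees C released about:
print(kcal_from_calorimeter(2.0, 50.0))  # 100.0 kcal
```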
You might reasonably assume that calorie science has advanced since then, but you would be wrong. The USDA measures calories (actually kilocalories, or kcal) using a variation of the bomb-calorimeter system developed by Wilbur Atwater in 1896. The Atwater system is still based on average values derived from burning foods: 4 kcal for each gram of protein, 4 kcal for carbohydrates, 9 kcal for fat, and 7 kcal for alcohol.
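Those four factors are the whole trick behind a nutrition label. A minimal sketch of the Atwater arithmetic (my own illustration, with made-up sample grams, not figures from the article):

```python
# Atwater general factors: average kcal released per gram of each
# macronutrient when burned in a calorimeter.
ATWATER_KCAL_PER_GRAM = {"protein": 4, "carbohydrate": 4, "fat": 9, "alcohol": 7}

def atwater_kcal(grams_by_nutrient: dict) -> float:
    """Label calories for a food, given grams of each macronutrient."""
    return sum(ATWATER_KCAL_PER_GRAM[nutrient] * grams
               for nutrient, grams in grams_by_nutrient.items())

# A hypothetical snack with 3 g protein, 20 g carbohydrate, 5 g fat:
print(atwater_kcal({"protein": 3, "carbohydrate": 20, "fat": 5}))  # 137
```

Note what the formula cannot see: it treats every gram of carbohydrate identically, whether it arrives as raw leaves or roasted pig, which is exactly the blind spot Wrangham identifies.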
I can see why this made sense in 1896. Just as coal was burned to produce the energy that fueled locomotives and steamships, so it logically followed that food must be burned to produce the energy that fueled humans.
Wrangham suggests it’s not as logical as we may think. Instead of obsessing over calories, he argues, we should focus on the way the human body actually uses food.
The human body extracts only part of the energy from many foods when they are eaten raw. Chimpanzees, by contrast, can easily digest raw green bananas, raw potatoes, even raw eggs and meat, because their intestines are far larger than ours.
Hunter-gatherers living hundreds of thousands of years ago gradually figured out that cooking their food gave them a better chance of survival because it allowed them to get the most out of what they consumed. And eating cooked foods gradually reduced the size of the human stomach and large intestine, an evolutionary shift that eventually separated us from our chimp ancestors.
Wrangham’s thesis tells me that our prehistoric ancestors may have understood the nature of calories better than many of us do today. They figured out that their bodies were able to use some calories better than others. They were the first to discover that quality is more important than quantity.
This shouldn’t have been so surprising to me, actually. When babies are ready to eat their first solid foods, we give them finely pulverized cooked grains, because we know that is the easiest thing for them to digest. When people are recovering from illness, we give them long-cooked food suspended as small particles in liquid, also known as soup. We crave rice pudding and not raw rice, and toast rather than a handful of flour.
We’ve also been taught to crave what’s inside that package of Cheez-It Snack Mix because, after all, it’s only 100 calories. And until we understand the difference between 100 calories of processed foods and 100 calories of whole foods, we’ll continue to suffer the consequences of our obesity epidemic.
Regardless, when I sit down at the Thanksgiving table this year, I’m not going to be thinking about calories at all while I enjoy a plateful of whole foods my body can really use. And I’ll tell anyone who will listen about the biological anthropologist who taught me why we’d all be better off if we started doing the same.