evolutionary history. Those same genes would lead to obesity and diabetes in an environment in which food was plentiful.
“If the considerable frequency of the disease is of relatively long duration in the history of our species,” Neel had asked at the beginning of his discussion, “how can this be accounted for in the face of the obvious and strong genetic selection against the condition? If, on the other hand, this frequency is a relatively recent phenomenon, what changes in the environment are responsible for the increase?”
The thrifty gene could be the answer only if diabetes was of long duration in the species—and there is no evidence of that. The disease seems to appear only after populations have access to sugar and other refined carbohydrates. In the Pima, diabetes appeared to be “a relatively recent phenomenon,” as Neel himself later noted. When Russell and Hrdlička discussed the health of the Pima in the early 1900s, they made no mention of diabetes, even while noting the presence of such “rare” diseases as lupus, epilepsy, and elephantiasis.*71 As late as 1940, when Elliott Joslin reviewed the medical records of the hospitals and physicians in Arizona, he concluded that the prevalence of diabetes was no higher among the Pima and other local tribes than anywhere else in the United States. Only in the 1950s, in studies from the Bureau of Indian Affairs, was there compelling reason to believe that diabetes had become common. When Neel tested adolescent Yanomamo for the condition known as glucose intolerance, which might indicate a predisposition to diabetes, he found none, and so had no reason to believe that diabetes existed before such isolated populations began eating Western foods. The same was true of an isolated tribe of Pima discovered living in the Sierra Madre Mountains of northern Mexico. “The high frequency of [Type 2 diabetes] in reservation Amerindians,” Neel later explained, “must predominantly reflect lifestyle changes.”
By 1982, Neel had come to side with Peter Cleave in believing that the most likely explanation for the high rates of obesity and diabetes in populations like the Pima that had only recently become Westernized was their opportunity to “overindulge in high sugar content foods.”
This left open the question of what biological factors or genes might determine who got obese and diabetic and who didn’t in the presence of such foods, but it eliminated any reason to suggest that thrifty genes had ever bestowed some evolutionary advantage. “The data on which that (rather soft) hypothesis was based has now largely collapsed,” Neel observed. He now suggested that either a tendency for the pancreas to oversecrete insulin and so cause hyperinsulinemia, or a tendency toward insulin resistance, which in turn would result in hyperinsulinemia, was the problem, which is consistent with the carbohydrate hypothesis of chronic disease. Both of these, Neel suggested, would be triggered by the “composition of the diet, and more specifically the use of highly refined carbohydrates.”
It wasn’t until the late 1970s, just a few years before Neel himself publicly rejected his hypothesis, that obesity researchers began invoking thrifty genes as the reason why putting on weight seems so much easier than losing it. Jules Hirsch of Rockefeller University was among the first to do so, and his logic is noteworthy, because his primary goal was to establish that humans, like every other species of animal, had apparently evolved a homeostatic system to regulate weight, and one that would do so successfully against fluctuations in food availability. We eat during the day, and yet have to supply nutrients to our cells all night long, while we sleep, for example, so we must have evolved a fuel storage system that takes this into account. “To me, it would be most unthinkable if we did not have a complex, integrated system to assure that a fraction of what we eat is put aside and stored,” Hirsch wrote in 1977. To explain why this storage system might cause obesity so often in modern societies, he assumed as fact something that Neel had never considered more than speculation. “The biggest segment of man’s history is covered by times when food was scarce and was acquired in unpredictable amounts and by dint of tremendous caloric expenditure,” Hirsch suggested. “The long history of food scarcity and its persistence in much of the world could not have gone unnoticed by such an adaptive organism as man. Hoarding and caloric miserliness are built into our fabric.”
This was one of the first public statements of the notion that would evolve into the kind of unconditional proclamation made by Kelly Brownell a quarter century later, that the human body is an “exquisitely efficient calorie conservation machine.” But it depended on an assumption about human evolution that was contradicted by the anthropological evidence itself—that human history was dominated by what Jared Diamond had called the “conditions of unpredictably alternating feast and famine that characterized the traditional human lifestyle.” Reasonable as this may seem, we have no evidence that food was ever any harder to come by for humans than for any other organism on the planet, at least not until our ancestors began radically reshaping their environment ten thousand years ago, with the invention of agriculture.
Both the anthropological remains and the eyewitness testimony of early European explorers suggest that much of the planet, prior to the last century or two, was a “paradise for hunting,” in the words of the Emory University anthropologist Melvin Konner and his collaborators, with a diversity of game, both large and small, “present in almost unimaginable numbers.”*72 Though famines have certainly been documented among hunter-gatherer populations more recently, there’s little reason to believe that this happened prior to the industrial revolution. Those isolated populations that managed to survive as hunter-gatherers well into the twentieth century, as the anthropologist Mark Nathan Cohen has written, were “conspicuously well-nourished in qualitative terms and at least adequately nourished in quantitative terms.”
Hunter-gatherers lived in equilibrium with their environment just as every other species does. The oft-cited example is the !Kung Bushmen of the semi-arid Kalahari desert, who were studied by Richard Lee of the University of Toronto and a team of anthropologists in the mid-1960s. Their observations, Lee noted, were made during “the third year of one of the most severe droughts in South Africa’s history.” The United Nations had instituted a famine-relief program for the local agriculturalists and pastoralists, and yet the Bushmen still survived easily on “some relatively abundant high-quality foods,” and they did not “have to walk very far or work very hard to get them.” The !Kung women would gather enough food in one day to feed their families for the next three, Lee and his colleagues reported; they would spend the remaining time resting, visiting, or entertaining visitors from other camps.
The prevailing opinion among anthropologists, not to be confused with that of nutritionists and public-health authorities, is that hunting and gathering allow for such a varied and extensive diet, including not just roots and berries but large and small game, insects, scavenged meat (often eaten at “levels of decay that would horrify a European”), and even occasionally other humans, that the likelihood of the simultaneous failure of all nutritional resources is vanishingly small. When hunting failed, these populations could still rely on foraging for plant foods and insects, and when gathering failed “during long-continued drought,” as the missionary explorer David Livingstone noted of a South African tribe in the mid-nineteenth century, they could relocate to the local water holes, where “very great numbers of the large game” also congregated by necessity. This resiliency of hunting and gathering is now thought to explain why it survived for two million years before giving way to agriculture. In those areas where human remains span the transition from hunter-gatherer societies to farmers, anthropologists have reported that both nutrition and health declined, rather than improved, with the adoption of agriculture. (It was this observation that led Jared Diamond to describe agriculture as “the worst mistake in the history of the human race.”) Because famines were both common and severe in Europe until the nineteenth century, the thrifty-gene hypothesis would suggest that those with European ancestry should be the most likely to have thrifty genes, and the most susceptible to obesity and diabetes in our modern toxic environments. Yet among Europeans there is “a uniquely low occurrence of Type 2 diabetes,” as Diamond puts it, more evidence that the thrifty-gene hypothesis is incorrect.
Species adapt to their environment over successive generations. Those that don’t, die off. When food is abundant, species multiply; they don’t get obese and diabetic.
When earlier generations of obesity researchers discussed the storage of fat in humans and animals, they assumed that avoiding excessive fat is as important to the survival of any species as avoiding starvation. Since the average 150-pound man with a body fat percentage of only 10 percent is still carrying enough fat calories to survive one month or more of total starvation, it seems superfluous to carry around more if it might have negative consequences. “Survival of the species must have depended many times both on the ability to store adequate yet not excessive amounts of energy in the form of fat [my italics], and on the ability of being able to mobilize these stores always at a sufficient rate to meet the body’s needs,” observed George Cahill and Albert Renold, considered two of the leading authorities on the regulation of fat metabolism, in 1965. The total amount of fat stored, they suggested, “should be kept sufficiently large to allow for periods of fasting to which a given species in a given environment is customarily exposed, yet sufficiently small to preserve maximum mobility.”
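The month-long figure follows from simple arithmetic. Here is a minimal sketch of the calculation, assuming the standard physiological values of roughly 9 calories per gram of body fat and a daily energy expenditure of about 2,000 calories (both are my assumptions, not figures given in the text):

```python
# Back-of-the-envelope check of the fat-reserve arithmetic.
# Assumed constants (not from the text): body fat yields ~9 kcal per gram,
# and an adult burns roughly 2,000 kcal per day.

LB_TO_KG = 0.4536
KCAL_PER_GRAM_FAT = 9          # standard physiological value (assumption)
DAILY_EXPENDITURE_KCAL = 2000  # rough daily energy need (assumption)

body_weight_kg = 150 * LB_TO_KG             # 150-pound man from the text (~68 kg)
fat_mass_g = body_weight_kg * 0.10 * 1000   # 10 percent body fat (~6.8 kg of fat)

fat_calories = fat_mass_g * KCAL_PER_GRAM_FAT           # ~61,000 kcal stored
days_of_starvation = fat_calories / DAILY_EXPENDITURE_KCAL

print(f"Stored fat energy: {fat_calories:,.0f} kcal")
print(f"Days of total starvation covered: {days_of_starvation:.0f}")  # ~31 days
```

Under these assumptions, a lean 150-pound man carries on the order of 61,000 calories of stored fat, enough to cover roughly a month of total starvation, consistent with the figure above.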
The thrifty-gene hypothesis, on the other hand, implies that we (at least some of us) are evolutionarily adapted to survive extreme periods of famine, but assigns to humans the unique concession of having evolved in an environment in which excess fat accumulation would not be a burden or lead to untimely death—by inhibiting our ability to escape from predators or enemies, for instance, or our ability to hunt or perhaps even gather. It presupposes that we remain lean, or at least some of us do, only as long as we remain hungry or simply lack sufficient food to indulge our evolutionary drive to get fat—an explanation for leanness that the British metabolism researchers Nancy Rothwell and Michael Stock described in 1981 as “facile and unlikely,” a kind way of putting it. The “major objection” to the thrifty-genotype hypothesis, noted Rothwell and Stock, “must be based on the observation that most wild animals are in fact very lean” and that this leanness persists “even when adequate food is supplied,” just as we’ve seen in hunter-gatherers. If the thrifty-gene hypothesis were true of any species, it would suggest that all we had to do was put its members in a cage with plentiful food available and they would fatten up and become diabetic, and this is simply not the case.
Proponents of the thrifty-gene hypothesis, however, will invoke a single laboratory model—the Israeli sand rat—to support the notion that at least some wild animals will get fat and diabetic if caged with sufficient food. “When this animal is removed from the sparse diet of its natural environment and given an abundant, high-calorie diet,” wrote Australian diabetologist Paul Zimmet in a 2001 article in Nature, “it develops all of the components of the metabolic syndrome, including diabetes and obesity.”
But the sand-rat experiments themselves, carried out in the early 1960s at Duke University by the comparative physiologist Knut Schmidt-Nielsen, suggested that the abundance of food was not the relevant factor. Schmidt-Nielsen was trying to establish what aspect of the laboratory diet might be responsible for the obesity and diabetes that appeared in his sand rats. He had taken two groups of rats freshly trapped in Egypt and raised one on Purina Laboratory Chow—“49.4% digestible carbohydrates, 23.4% protein and 3.8% fat”—supplemented with “fresh mixed vegetables,” and the other on the fresh vegetables alone. Both had access to as much food as they desired, but only the chow-eating rats got diabetic and obese. This suggested that something about Purina Chow was the determining factor. Perhaps the rats liked it better than vegetables, and so they ate more, although that, too, could be a physiological effect related to the nutrient composition of the chow. It might have been the density of calories in the rat chow, which has a lower water content than vegetables and so more calories per gram. It was also possible that the cause of the diabetes and obesity in these rats, as Schmidt-Nielsen suggested, was “a carbohydrate intake that is greater than that occurring in the natural diet.”*73
Depending on the researchers’ preconceptions, the Israeli sand rats could have been considered an animal model of the carbohydrate hypothesis, rather than the thrifty-gene hypothesis. Monkeys in captivity, by the way, will also get obese and diabetic on high-carbohydrate chow diets. One of the first reports of this phenomenon was in 1965, by John Brobeck of Yale, whose rhesus monkeys got fat and mildly diabetic on Purina Monkey Chow—15 percent protein, 6 percent fat, and 59 percent digestible carbohydrates. According to Barbara Hansen, who studies diabetes and obesity and runs a primate-research laboratory at the University of Maryland, perhaps 60 percent of middle-aged monkeys in captivity are obese by monkey standards. “This is on the kind of diet recommended by the American Heart Association,” she says, “high-fiber, low-fat, no-cholesterol chow.”