The implications of Oka’s experiment are clear. If cooking softens food and softer food leads to greater energy gain, then humans should get more energy from cooked food than raw food not only because of processes such as gelatinization and denaturation, but also because cooking reduces the costs of digestion. This prediction has been studied in the Burmese python. Physiological ecologist Stephen Secor finds pythons to be superb experimental subjects because after swallowing a meal, the snakes lie in a cage doing little but digesting and breathing. By measuring how much oxygen the pythons consume before and after a meal, Secor can calculate precisely how much energy the snakes expend and attribute it to the cost of digestion. He typically monitors the snakes for at least two weeks at a time.
Secor and his team have shown repeatedly that the physical structure of a python’s diet influences its cost of digestion. If the snake eats an intact rat, its metabolic rate increases more than if a similar rat is ground up before the snake eats it. Amphibians yield the same results. Toads given hard-bodied worms have higher costs of digestion than those eating soft-bodied worms. Just as Oka’s team found with rats eating softer pellets, Secor’s studies show that softer meat is also digested with less energy expenditure.
A particular advantage of the Burmese pythons is that experimenters can insert food directly into their esophagus. The snakes show no signs of objecting. No matter whether the pythons find a food appealing and regardless of how easy the food is to swallow, the pythons just digest what they are given. They are an ideal species in which to test the effects of cooking on the cost of digestion. I approached Secor in 2005 to ask if he would be interested in the following study. Secor assigned eight snakes to the research, and his team prepared five kinds of experimental diet. Lean beef steak (eye of round, with less than 5 percent fat) was the basic food and was given to the snakes in each of four preparations: raw and intact; raw and ground; cooked and intact; and cooked and ground. The snakes were also given whole intact rats.
The experiment took several months. As expected from earlier results, the snakes’ cost of digestion when they ate the raw, intact meat was the same as for the whole rats. But grinding and cooking changed the costs of digestion. Grinding breaks up both muscle fibers and connective tissue, so it increases the surface area of the digestible parts of the meat. Ground meat is exposed more rapidly to acid, causing denaturation, as well as to proteolytic enzymes, causing degradation of the muscle proteins. Grinding reduced the snakes’ cost of digestion by 12.3 percent. Cooking produced almost identical results. Compared to the raw diet, cooked meat led to a reduction in the cost of digestion by 12.7 percent. The effects of the two experimental treatments, grinding and cooking, were almost entirely independent. Alone, each reduced the cost of digestion by just over 12 percent. Together, they reduced it by 23.4 percent.
Mrs. Beeton was right to cherish softness as an aid to digestion. It makes sense that we like foods that have been softened by cooking, just as we like them chopped up in a blender, ground in a mill, or pounded in a mortar. The unnaturally, atypically soft foods that compose the human diet have given our species an energetic edge, sparing us much of the hard work of digestion. Fire does a job our bodies would otherwise have to do. Eat a properly cooked steak, and your stomach will more quickly return to quiescence. From starch gelatinization to protein denaturation and the costs of digesting, absorbing, and assimilating meat, the same lesson emerges. Cooking gives calories.
When we consider the difficulties humans experience on raw diets, the evidence that all animals thrive on cooked food, and the nutritional evidence concerning gelatinization, denaturation, and tenderness, what is extraordinary about this simple claim is that it is new. Admittedly, cooking can have some negative effects. It leads to energy losses through dripping during the cooking process and by producing indigestible protein compounds, and it often leads to a reduction of vitamins. But compared to the energetic gains, those processes do not matter. Overall it appears that cooking consistently provides more energy, whether from plant or animal food.
Why then do we like cooked food today? The energy it provides is more than many of us need, but it was a critical contribution for our remote ancestors just as it is vital for many people living nowadays in poverty. Tens of thousands of generations of eating cooked food have strengthened our love for it. Consider foie gras, the liver of French geese that have been cruelly force-fed to make them especially fat. The fresh liver is soaked in milk, water, or port, marinated in Armagnac, port, or Madeira, seasoned, and finally baked. The result is so meltingly soft and tender that a single bite has been said to make a grown man cry. Our raw-food-eating ancestors never knew such joy.
Cooked food is better than raw food because life is mostly concerned with energy. So from an evolutionary perspective, if cooking causes a loss of vitamins or creates a few long-term toxic compounds, the effect is relatively unimportant compared to the impact of more calories. A female chimpanzee with a better diet gives birth more often and her offspring have better survival rates. In subsistence cultures, better-fed mothers have more and healthier children. In addition to more offspring, they have greater competitive ability, better survival, and longer lives. When our ancestors first obtained extra calories by cooking their food, they and their descendants passed on more genes than others of their species who ate raw. The result was a new evolutionary opportunity.
CHAPTER 4
When Cooking Began
“The introduction of cooking may well have been the decisive factor in leading man from a primarily animal existence into one that was more fully human.”
—CARLETON S. COON,
The History of Man
Archaeologists are divided about the origins of cooking. Some suggest that fire was not regularly used for cooking until the Upper Paleolithic, about forty thousand years ago, a time when people were so modern that they were creating cave art. Others favor much earlier times, half a million years ago or before. A common proposal lies between those extremes, advocated especially by physical anthropologist Loring Brace, who has long noted that people definitely controlled fire by two hundred thousand years ago and argues that cooking started around the same time. As the wide range of views shows, the archaeological evidence is not definitive. Archaeology offers only one safe conclusion: it does not tell us what we want to know. But though we cannot solve the problem of when cooking began by relying on the faint traces of ancient fires, we can use biology instead. In the teeth and bones of our ancestors we find indirect evidence of changes in diet and the way it was processed.
Yet the archaeological data leave no doubt that controlling fire is an ancient tradition. In the most recent quarter of a million years, there is ample evidence of fire control, and even occasionally of cooking, by both our ancestors and our close relatives the Neanderthals. The most informative sites tend to be airy caves or rock shelters, many of them in Europe. In Abri Pataud in France’s Dordogne region, heat-cracked river cobblestones from the late Aurignacian period, around forty thousand years ago, show that people boiled water by dropping hot rocks in it. At Abri Romani near Barcelona, a series of occupations dating back seventy-six thousand years includes more than sixty hearths together with abundant charcoal, burnt bones, and casts of wooden objects possibly used for cooking. More than ninety-three thousand years ago in Vanguard Cave, Gibraltar, three separate episodes of burning can be distinguished in a single hearth. Neanderthals heated pinecones on these fires and broke them open with stones, much as contemporary hunter-gatherers have been recorded doing, to eat the seeds.
Our ancestors were using fire in the Middle East and Africa as well. In a cave at Klasies River Mouth, a coastal site in South Africa from sixty thousand to ninety thousand years ago, burnt shells and fish bones lie near family-size hearths that appear to have been used for weeks or months at a time. Between 109,000 and 127,000 years ago in the Sodmein Cave of Egypt’s Red Sea Mountains, people appear responsible for huge fires with three distinct superimposed ash layers and the burnt bones of an elephant. Charred logs, together with charcoal, reddened areas, and carbonized grass stems and plants, date to 180,000 years ago at Kalambo Falls in Zambia. Back to 250,000 years ago in Israel’s Hayonim Cave, there are abundant hearths with ash deposits up to 4 centimeters (1.6 inches) thick. Such sites show that people have been controlling fire throughout the evolutionary life span of our species, Homo sapiens, which is considered to have originated about two hundred thousand years ago.
Because evidence about controlling fire is inconsistent before the last quarter of a million years, it is often argued that the control of fire was unimportant or absent until that time. But that idea is now particularly shaky because the older part of the record, going back in time from a quarter of a million years ago, has been improving in quality. Two sites in particular give tantalizing hints of what earlier people were doing with fire.
An ancient fireplace at the Beeches Pit archaeological site in England, securely dated to four hundred thousand years ago, lies on the gently sloping bank of an ancient pond. Eight hand axes attest to the presence of humans. Dark patches about one meter (three feet) in diameter with reddened sediments at the margins show where burning occurred. Tails of ashlike material lead down from the fires toward the pond, while the upper side contains numerous pieces of flint. The flints have been knapped, or broken by a sharp blow, and many are burnt. A team led by archaeologist John Gowlett fitted the flint pieces together, and one of the various refits showed that someone had been knapping a heavy core (1.3 kilograms, or 2.9 pounds) until a flaw became obvious. The knapper abandoned it, and two flakes from the series fell forward and were burnt, indicating that the toolmaker had been squatting next to a warming blaze.
Another four-hundred-thousand-year-old site, at Schöningen in Germany, has yielded more than a half dozen superb throwing spears carved from spruce and pine, together with the remains of at least twenty-two horses that appear to have died at the same time, apparently killed by humans. Cut marks show that people removed meat from the horses. At the same site were numerous pieces of burnt flint, four large reddened patches about one meter in diameter that appear to have been fireplaces, and some pieces of burnt wood including a shaped stick, also made from spruce, that had been charred at one end as if it had been used as a poker, or perhaps held over coals to cook strips of meat. This exceptional lakeshore find by archaeologist Hartmut Thieme represents the earliest evidence of group hunting. Thieme suggests that after people killed the herd, they found themselves with far more food than they could consume at once. They settled for several days and built the fires along the lakeshore to dry as much meat as possible.
Prior to half a million years ago, there is no evidence for the control of fire in Europe, but ice covered Britain for much of the time between five hundred thousand and four hundred thousand years ago, and glaciers would have swept away most evidence of any earlier occupations. Farther south, however, fire use is strongly attested by 790,000 years ago. In a well-dated site called Gesher Benot Ya’aqov, next to Israel’s Jordan River, hand axes and bones were first discovered in the 1930s, and in the 1990s, Naama Goren-Inbar found burnt seeds, wood, and flint. Olives, barley, and grapes were among the burnt seeds. The flint fragments were grouped in clusters, suggesting they had fallen into campfires. Nira Alperson-Afil analyzed these dense concentrations. She concluded that the early humans who made these fires “had a profound knowledge of fire-making, enabling them to make fire at will.”
Gesher Benot Ya’aqov is the oldest site offering confident evidence of fire control. Before then we find only provocative hints. Archaeological sites between a million and a million and a half years old include burnt bones (at Swartkrans in South Africa), lumps of clay heated to the high temperatures associated with campfires (Chesowanja, near Lake Baringo in Kenya), heated rocks in a hearthlike pattern (Gadeb in Ethiopia), or colored patches with appropriate plant phytoliths inside (Koobi Fora, Kenya). But the meaning of such evidence as indicating human control of fire is disputed. Some archaeologists find it totally unconvincing, regarding natural processes such as lightning strikes as likely explanations for the apparent use of fire. Others accept the idea that humans controlled fire in the early days of Homo erectus as well established. Overall, these hints from the Lower Paleolithic tell us only that in each case the control of fire was a possibility, not a certainty.
Evidence of humans controlling fire is hard to recover from early times. Meat can be cooked easily without burning bones. Fires might have been small, temporary affairs, leaving no trace within a few days of exposure to wind and rain. Even now hunter-gatherers such as the Hadza, who live near the Serengeti National Park in northern Tanzania, may use a fire only once, and they often leave no bones or tools at the fire site, so archaeologists would not be able to infer human activity even if they could detect where burning had occurred. The caves and shelters that preserve relatively recent evidence of fire use tend to be made of soft rock, such as limestone, which erodes quickly, so the half-lives of caves average about a quarter of a million years, leaving increasingly few opportunities to find traces of fire use from earlier periods. From the past quarter of a million years there are sites of human occupation where people must have used fire, yet there is no sign of it. There are also mysterious reductions in the frequency of finding evidence of fire, such as one that followed an interglacial period in Europe from 427,000 years ago to 364,000 years ago, when fire evidence was relatively abundant. In short, while humans have certainly been using fire for hundreds of thousands of years, archaeology does not tell us exactly when our ancestors began to do so.