
The archeological record shows that the imposition of order is slow to emerge in human prehistory—glacially so. We saw in chapter 2 that Oldowan tools, which date from 2.5 million years ago to about 1.4 million years ago, are opportunistic in nature. Toolmakers apparently were concerned mostly with producing sharp flakes without regard to shape. The so-called core tools, such as scrapers, choppers, and discoids, were by-products of this process. Even the implements in Acheulean tool assemblages, which followed the Oldowan and lasted until about 250,000 years ago, display imposition of form only minimally. The teardrop-shaped handaxe was probably produced according to some form of mental template, but most of the other items in the assemblage were Oldowan-like in many ways; moreover, only about a dozen tool forms were in the Acheulean kit. From about 250,000 years ago, archaic sapiens individuals, including Neanderthals, made tools from prepared flakes, and these assemblages, including the Mousterian, comprised perhaps sixty identifiable tool types. But the types remained unchanged for more than 200,000 years—a technological stasis that seems to deny the workings of the fully human mind.

Only when the Upper Paleolithic cultures burst onto the scene 35,000 years ago did innovation and arbitrary order become pervasive. Not only were new and finer tool types produced, but the tool types that characterized Upper Paleolithic assemblages changed on a time scale of millennia rather than hundreds of millennia. Isaac interpreted this pattern of technological diversity and change as implying the gradual emergence of some form of spoken language. The Upper Paleolithic Revolution, he suggested, signaled a major punctuation in that evolutionary trajectory. Most archeologists agree generally with this interpretation, although there are differences of opinion over what degree of spoken language earlier toolmakers had—if any.

Unlike Nicholas Toth, Thomas Wynn, of the University of Colorado, believes that Oldowan culture in its general features was apelike, not human. “Nowhere in this picture need we posit elements such as language,” he notes, in a jointly authored article in the journal Man, in 1989. The manufacture of these simple tools required little cognitive capacity, he argues, and therefore was not human in any way. Wynn does concede, however, that there is “something humanlike” in the making of Acheulean handaxes: “Artifacts such as these indicate that the shape of the final product was a concern of the knapper and that we can use this intention as a tiny window into the mind of Homo erectus.” Wynn describes the cognitive capacity of Homo erectus, based on the intellectual demands of Acheulean tool production, as equivalent to that of a seven-year-old modern human. Seven-year-old children have considerable linguistic skills, including reference and grammar, and are close to the point where they can converse without recourse to pointing and gesturing. In this connection it is interesting to recall that Jeffrey Laitman judged the language capacity of Homo erectus to be equivalent to that of a six-year-old modern human, based on the shape of the basicranium.

Where does this body of evidence, represented in figure 7.2, lead us? If we were to be guided only by the technological component of the archeological record, we would view language as having had an early start, slow progress through most of human prehistory, and an explosive enhancement in relatively recent times. This is a compromise on the hypothesis derived from the anatomical evidence. The archeological record of artistic expression, however, allows for no such compromise. Painting and engraving in rock shelters and caves enters the record abruptly, about 35,000 years ago. The evidence in support of earlier artistic work, such as ocher sticks and incised curves on bone objects, is rare at best and dubious at worst.

If artistic expression is taken as the only reliable indicator of spoken language—as the Australian archeologist Iain Davidson, for one, insists—then language not only became fully modern recently but also was initiated recently. “The making of images to resemble things can only have emerged prehistorically in communities with shared systems of meanings,” Davidson states in a recent paper coauthored with William Noble, his colleague at the University of New England. “Shared systems of meanings” are mediated, of course, through language. Davidson and Noble argue that artistic expression was a medium through which referential language developed, not that art was made possible by language. Art had to predate language, or at least emerge in parallel with it. The appearance of the first art in the archeological record therefore signals the first appearance of spoken, referential language.

FIGURE 7.2

Three lines of evidence. If the archeological record (a) can be taken as a guide, language arose late and rapidly in human prehistory. By contrast, information from brain organization and brain size (b) suggests a gradual emergence of language, beginning with the origin of the genus Homo. Similarly, the evolution of the vocal tract (c) implies an early origin.

Clearly, the hypotheses about the nature and timing of the evolution of human language are about as divergent as they could be—which means that the evidence, or some of it, is being incorrectly read. Whatever the source of this misreading, a new appreciation of the complexity of language origins is emerging. A major conference in March 1990, organized by the Wenner-Gren Foundation for Anthropological Research, is likely to be seen as setting the tone for discussion for years to come. Titled “Tools, Language and Cognition in Human Evolution,” the conference drew links between these important issues in human prehistory. Kathleen Gibson, one of the conference organizers, described the position as follows: “Since human social intelligence, tool use and language all depend on quantitative increases in brain size and in its associated information processing capacities, none could have suddenly emerged full-blown Minerva-like from the head of Zeus. Rather, like brain size, each of these intellectual capacities must have evolved gradually. Further, since these capacities are interdependent, none could have reached its modern level of complexity in isolation.” It will be a considerable challenge to untangle these interdependencies.

As I’ve said, there is more at stake here than the reconstruction of prehistory. The view of ourselves and our place in nature is also on the line. Those who wish to maintain humans as special will welcome evidence that points to a recent and abrupt origin of language. Those who are comfortable with human connection to the rest of nature will not be distressed by an early, slow development of this quintessentially human capacity. I conjecture that if, by some freak of nature, populations of Homo habilis and Homo erectus still existed, we would see in them gradations of referential language. The gap between us and the rest of nature would therefore be closed, by our own ancestors.

CHAPTER 8
THE ORIGIN OF MIND

Three major revolutions mark the history of life on earth. The first was the origin of life itself, sometime prior to 3.5 billion years ago. Life, in the form of microorganisms, became a powerful force in a world where previously only chemistry and physics had operated. The second revolution was the origin of multicellular organisms, about half a billion years ago. Life became complex, as plants and animals of myriad forms and sizes evolved and interacted in fertile ecosystems. The origin of human consciousness, some time within the last 2.5 million years, was the third event. Life became aware of itself, and began to transform the world of nature to its own ends.

What is consciousness? More specifically, what is it for? What is its function? Such questions may seem odd, given that each of us experiences life through the medium of consciousness, or self-awareness. So powerful a force is it in our lives that it is impossible to imagine existence in the absence of the subjective sensation we call reflective consciousness. So powerful subjectively, yet objectively so elusive. Consciousness presents scientists with a dilemma, which some believe to be unresolvable. The sense of self-awareness we each experience is so brilliant it illuminates everything we think and do; and yet, there is no way in which, objectively, I can know that you experience the same sensation as I do, and vice versa.

Scientists and philosophers have struggled for centuries to pin down this mercurial phenomenon. Operational definitions that focus on the ability to monitor one’s own mental states may be objectively accurate in a sense, but they don’t connect with the way we know we are aware of ourselves and our being. Mind is the source of the sense of self—a sense that is sometimes private, and sometimes shared with others. The mind is also a channel for reaching worlds beyond the material objects of everyday life, through imagination; and it offers us a means of bringing abstract worlds into Technicolor reality.

Three centuries ago, Descartes tried to grapple with the disquieting mystery of the source of the sense of self which arises within oneself. Philosophers have referred to this dichotomy as the mind-body problem. “It feels as if I have fallen unexpectedly into a deep whirlpool which tumbles me around so that I can neither stand on the bottom nor swim up to the top,” Descartes wrote. His solution to the mind-body problem was to describe the mind and the body as entirely separate entities, a dualism that made a whole. “It was a vision of the self as a sort of immaterial ghost that owns and controls a body the way you own and control your car,” observes the Tufts University philosopher Daniel Dennett in his recent book Consciousness Explained.

Descartes also considered the mind to be the sole preserve of humans, while all other animals were mere automatons. A similar view has dominated biology and psychology for the past half century. Known as behaviorism, this worldview held that nonhuman animals merely respond reflexively to events in their worlds and are incapable of analytical thought processes. There is no such thing as animal mind, said the behaviorists; or, if there is, we have no way of gaining access to it in a scientific way, and so it should be ignored. This view has been changing of late, thanks largely to Donald Griffin, a behavioral biologist at Harvard University, who has been waging a campaign for two decades to overthrow this negative view of the animal world. He has published three books on the subject, the latest, Animal Minds, in 1992. Psychologists and ethologists have seemed to be “almost petrified by the notion of animal consciousness,” he suggests. This is a consequence, he says, of the continued influence of behaviorism, hanging like a ghost over the science. “In other realms of scientific endeavor we have to accept proof that is less than a hundred percent rigorous,” says Griffin. “The historical sciences are like that—think of cosmology, think of geology. And Darwin couldn’t prove the fact of biological evolution in a rigorous way.”

Anthropologists, in trying to explain the evolution of the human form, must ultimately also address the evolution of human mind—and, specifically, human consciousness, a subject biologists are more prepared to contemplate. We also have to ask how such a phenomenon arose in the human brain: that is, did it spring fully formed into the brain of Homo sapiens, having had no precursor of any kind in the rest of the world of nature, as the behaviorists’ view would imply? We can ask, When in human prehistory did consciousness reach the stage we now experience: did it arise early, and grow ever brighter throughout prehistory? And we can ask, What evolutionary advantages would such a property of mind have conferred on our ancestors? Notice that these questions are parallel to those concerning the evolution of language. This is not mere coincidence, for language and reflective self-awareness are undoubtedly closely linked phenomena.

In seeking answers to these questions, we cannot eschew the issue of what consciousness is “for.” As Dennett asks, “Is there anything a conscious entity can do for itself that an unconscious (but cleverly wired up) simulation of that entity can’t do for itself?” The Oxford University zoologist Richard Dawkins admits to being puzzled, too. He speaks of the need for organisms to be able to predict the future, an ability that is achieved through the equivalent in brains of simulation in computers. This process, he asserts, need not be conscious. And yet, he notes, “the evolution of the capacity to simulate seems to have culminated in subjective consciousness.” Why this should have happened is, he contends, the most profound mystery facing modern biology. “Perhaps consciousness arises when the brain’s simulation of the world becomes so complete that it must include a model of itself.”

There is always the possibility, of course, that it is not “for” anything and is merely a by-product of big brains in action. I prefer to take the evolutionary point of view, which holds that so powerful a mental phenomenon is likely to have conferred survival benefits and was therefore the product of natural selection. If no such benefits can be discerned, then perhaps the alternative—that is, no adaptive function—may be entertained.

The neurobiologist Harry Jerison has made a long study of the trajectory of brain evolution since the advent of life on dry land. The pattern of change through time is quite striking: the origin of major new faunal groups (or groups within groups) is usually accompanied by a jump in the relative size of the brain, known as encephalization. For instance, when the first archaic mammals evolved, some 230 million years ago, they were equipped with brains four to five times bigger than the average reptilian brain. A similar boost in mental machinery happened with the origin of modern mammals, 50 million years ago. Compared with mammals as a whole, primates are the brainiest group, being twice as encephalized as the average mammal. Within the primates, the apes have the biggest brains, about twice the average size. And humans are three times as encephalized as the average ape.

Leaving humans aside for the moment, the stepwise increase in brain size through evolutionary history might be taken to imply a progression of ever-greater biological superiority: bigger brains mean smarter creatures. In some absolute sense this must be true, but it is useful to take an evolutionary view of what is happening. We might think of mammals as being somehow smarter and superior to reptiles, somehow better able to exploit the resources they need. But biologists have come to realize that this is not true. If mammals were indeed superior in their exploitation of niches in the world, then a greater diversity of ways of doing it, as reflected in the diversity of genera, might be expected. However, the number of mammalian genera that have existed at any point in their recent history is about the same as the number of dinosaur genera, those mightily successful reptiles of an earlier era. Moreover, the number of ecological niches that mammals are able to exploit is comparable to the number of dinosaur niches. Where, then, is the benefit of a bigger brain?

One of the forces that drive evolution is a constant competition among species, in the course of which one species gains temporary advantage through an evolutionary innovation, only to be overtaken by a counterinnovation, and so on. The outcome is the development of apparently better ways of doing things, such as running faster, seeing more acutely, withstanding attack more effectively, being smarter—while no permanent advantage is secured. In military parlance, this process is known as an arms race: weapons may become more numerous or effective on both sides, but neither side ultimately benefits. Scholars have imported the term “arms race” into biology to describe the same phenomenon in evolution. The building of bigger brains may be seen as the consequence of arms races.

Something different must go on in bigger brains as compared with smaller ones, however. How are we to view that something? Jerison argues that we should think of brains as creating a species’ version of reality. The world we perceive as individuals is essentially of our own making, governed by our own experience. Similarly, the world we perceive as a species is governed by the nature of the sensory channels we possess. Any dog owner knows that there is a world of olfactory experience to which the canine but not the human is privy. Butterflies are able to see ultraviolet light; we are not. The world inside our heads—whether we are a Homo sapiens, a dog, or a butterfly—is formed, therefore, by the qualitative nature of the information flow from the outside world to the inside world, and the inside world’s ability to process the information. There is a difference between the real world, “out there,” and the one perceived in the mind, “in here.”

As brains enlarged through evolutionary time, more channels of sensory information could be handled more completely, and their input integrated more thoroughly. Mental models therefore came to equate the “out there” and “in here” realities more closely, albeit with some inevitable information gaps, as I just mentioned. We may be proud of our introspective consciousness, but we can be aware only of what the brain is equipped to monitor in the world. Although language is seen by many as a tool of communication, it is also, argues Jerison, a further means by which our mental reality is honed. Just as the sensory channels of vision, smell, and hearing are of especial importance to certain animal groups in the construction of their particular mental worlds, language is the key component for humans.

There is a large literature, in philosophy and psychology, relating to the issue of whether thought depends on language or language on thought. There’s no question that a lot, perhaps most, of human cognitive processes go on in the absence of language or even consciousness. Any physical activity, such as playing tennis, goes on largely automatically—that is, without a literal running commentary on what to do next. The solution to a problem that pops into the mind while one is thinking of something else is another clear example. To some psychologists, spoken language is merely an afterthought, so to speak, of more fundamental cognition. But language surely shapes elements of thought in a way that a mute mind cannot, so that Jerison is justified in his contention.

The most obvious change in the hominid brain in its evolutionary trajectory was, as noted, a tripling of size. Size was not the only change, however; the overall organization changed, too. The brains of apes and humans are constructed on the same basic pattern: both are divided into left and right hemispheres, each of which has four distinct lobes: frontal, parietal, temporal, and occipital. In apes, the occipital lobes (at the back of the brain) are larger than the frontal lobes; in humans, the pattern is reversed, with large frontals and small occipital lobes. This difference in organization presumably underlies in some way the generation of the human mind as opposed to the ape mind. If we knew when the change in configuration occurred in human prehistory, we would have a clue about the emergence of human mind.

Fortunately, the outer surface of the brain leaves a map of its contours on the inner surface of the skull. By taking a latex mold of the inner surface of a fossilized cranium, it is possible to get an image of an ancient brain. The story that emerges from an investigation of this sort is dramatic, as Dean Falk discovered in her study of a series of fossil crania from South and East Africa. “The australopithecine brain is essentially apelike in its organization,” she states, referring to the relative sizes of the frontal and occipital lobes. “The humanlike organization is present in the earliest species of Homo.”

We have seen that many aspects of hominid biology changed when the first Homo species evolved, such as body stature and patterns of developmental growth—changes I view as signaling a shift to the new adaptive niche of hunting and gathering. A change in the organization as well as size of the brain at this point is therefore consistent and makes biological sense. How much of human mind is in place at this point, however, is less easy to determine. We need to know about the minds of our closest relatives, the apes, before we can address this question.

Primates are quintessentially social creatures. Just a few hours in the presence of a troop of monkeys is sufficient to get a sense of the importance that social interaction has for its members. Established alliances are constantly tested and maintained; new ones are explored; friends are to be helped, rivals challenged; and constant vigilance is kept for opportunities to mate.

The primatologists Dorothy Cheney and Robert Seyfarth, of the University of Pennsylvania, have devoted years to watching and recording the life of several troops of vervet monkeys in Amboseli National Park, in Kenya. To the casual observer of the monkeys, outbursts of activity, which are often aggressive, can look like social chaos. However, knowing the individuals, knowing who is related to whom, and knowing the structure of alliances and rivalries, Cheney and Seyfarth are able to make sense of the apparent chaos. They describe a typical encounter: “One female, Newton, may lunge at another, Tycho, while competing for a fruit. As Tycho moves off, Newton’s sister Charing Cross runs up to aid in the chase. In the meantime, Wormwood Scrubs, another of Newton’s sisters, runs over to Tycho’s sister Holborn, who is feeding 60 feet away, and hits her on the head.”
