
FIGURE 44
How can you tell what the partly obscured word is?

You probably recognized instantly that the word is pen. But how did you know that? Each partly obscured letter could have been other than the one you took it to be.

The explanation (based on a similar example by David Rumelhart and Jay McClelland): The vertical line in the first letter is an input into your recognition system that strongly connects to units in which P, R, and B are stored; the curved line connects to all three. On the other hand, the sight of the straight line does not connect—or, one can say, is strongly inhibited from connecting—to any unit representing rounded letters like C or O. Simultaneously, what you can see of the second letter is strongly connected to units registering F and E, but crossfeed from the first letter connects strongly to E, not F, because experience has established PE, RE, and BE, but not PF, RF, or BF, as beginnings of English words. And so on. Many connections, all operating at the same time and in parallel, enable you to see the word instantly as pen and not as anything else.[109]
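To make the idea concrete, here is a toy sketch in Python of that kind of parallel constraint satisfaction. The feature evidence, the candidate strings, and the "lexical bonus" below are invented for illustration; this is not the actual Rumelhart–McClelland interactive-activation model, only a minimal caricature of how mutually reinforcing and inhibiting connections can settle on pen.

    # A toy "settle on the word" computation; all numbers are made up.
    # Evidence from the partly visible letters: how strongly what can be
    # seen at each position supports each candidate letter (0 to 1).
    letter_evidence = [
        {"P": 0.9, "R": 0.9, "B": 0.9, "C": 0.1, "O": 0.1},  # vertical line plus curve
        {"E": 0.8, "F": 0.8},                                # visible top and middle bars
        {"N": 0.9, "M": 0.4},                                # last letter, mostly visible
    ]

    # Word-level units compete in parallel; knowledge of English supplies
    # extra top-down support only to familiar spellings (here, just PEN).
    candidates = ["PEN", "REN", "BFN", "PEM"]
    lexical_bonus = {"PEN": 0.5}

    def activation(word):
        """Bottom-up letter support plus top-down lexical support."""
        bottom_up = sum(letter_evidence[i].get(ch, 0.0) for i, ch in enumerate(word))
        return bottom_up + lexical_bonus.get(word, 0.0)

    scores = {w: activation(w) for w in candidates}
    print(max(scores, key=scores.get), scores)   # PEN comes out on top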

On a larger scale, the connectionist model of information processing is in striking accord with other seminal findings of cognitive psychological research. Consider, for instance, what is now known about the semantic memory network in Figure 41. Each node in that network—“bird,” “canary,” and “sing,” for instance—corresponds to a connectionist module something like the entire array in the last diagram but perhaps consisting of thousands of units rather than eight.[110]
Imagine, if you can, enough such multithousand-unit modules to register all the knowledge stored in your mind, each with millions of connections to related modules, and… But the task is too great for imagination. The connectionist architecture of the mind is no more possible to visualize in its entirety than the structure of the universe; only theory and mathematical symbols can encompass it.

The connectionist model is strongly analogous to actual brain structure and function. The late Francis Crick, who shared a Nobel Prize for discovering the structure of DNA and then did neuroscience research at the Salk Institute, said that the concept of the brain as a complex hierarchy of largely parallel processors “is almost certainly along the right lines.”[111]
Paul Churchland and Patricia Churchland—each a philosopher of cognitive science—have said that the brain is indeed a parallel machine “in the sense that signals are processed in millions of different pathways simultaneously.” Each aggregation of neurons sends millions of signals to other aggregations and receives return signals from them that modify its output in one way or another. It is these recurrent patterns of connection that “make the brain a genuine dynamical system whose continuing behavior is both highly complex and to some degree independent of its peripheral stimuli.”[112]
Thus could Descartes, lying abed all morning, think about his own thoughts, as has many a psychologist since.

Possibly the most remarkable development of all is, as noted above, the change in the relationship between computer and mind. A generation ago, it seemed that the computer was the model by which the reasoning mind could be understood. Now the order has been reversed: The reasoning mind is the model by which a more intelligent computer can be built. Artificial intelligence researchers have been writing programs that simulate the parallel processing of small neural networks, their aim being to create AI programs that are more nearly intelligent than those based on serial processing, and to create programs that simulate hypothesized mental processes so that they can be tested on a computer.

A wonderful irony: The brain that makes mind possible turns out to be the best model for the machine that had been thought vastly superior to it, a model so complex and intricate that it is all the computer can do, for now, to replicate a few of its multitude of functions and make only symbolic simulations of a handful of others.

As David, the greatest of psalmists, sang twenty-five centuries before the cognitive revolution and the computer age, “I will praise thee; for I am fearfully and wonderfully made.”

And the Winner Is—

We have followed the revolutionary development of cognitive psychology and the later but equally revolutionary development of cognitive neuroscience, which currently coexist, overlapping and infiltrating each other. But will they continue to do so or is one likely to dominate and absorb the other, becoming the psychology of the future? The answer would seem to depend on which discipline offers the better scientific explanation of mental processes and behavior.

Cognitive psychology, as we have seen, has compiled a remarkable record over the past six decades. Escaping from the severe limitations of behaviorist theory, it rediscovered the mind and found innumerable ways to investigate the unseen processes that take place in it, among them perception, learning, memory, emotion, personality development, and social behavior. Cognitive psychologists were free to ask, once again, the great questions the Greek philosophers asked so long ago, summed up in the megaquestions “How do we know what we know?” and “Why do we behave as we do?”

As is the case with other sciences, the proliferation of hypotheses and the collection of empirical evidence by cognitive psychologists have often produced corrections and drastic revisions of theories, minitheories, and data; but seen in perspective, cognitive psychology has been a cumulative, self-correcting, self-transforming science.

Its one great shortcoming has always been its lack of an adequate explanation of how the activity of billions of neurons in the brain can result in thoughts, emotions, and voluntary actions. As the neuropsychologist V. S. Ramachandran and science writer Sandra Blakeslee wrote a few years ago, “Many people find it disturbing that all the richness of our mental life—all our thoughts, feelings, emotions, even what we regard as our intimate selves—arise entirely from the activity of little wisps of protoplasm in the brain. How is this possible? How could something as deeply mysterious as consciousness emerge from a chunk of meat inside the skull?”[113]

In an effort to answer this question, ever since the early days of the cognitive revolution many psychologists have reached beyond the classic boundaries of their field to explain what they were studying in terms of hormonal, genetic, and other physiological factors. And for the past two and a half decades, as we have seen throughout this chapter, many psychologists have turned to the methods of cognitive neuroscience, especially brain scanning, to help validate their psychological hypotheses. But valuable as all this is, it still does not tell us how a blizzard of neural impulses becomes thought or other mental processes.

Cognitive neuroscience, especially since the advent of brain scans, has been compiling a record of advances in knowledge as impressive as that of cognitive psychology. The neuroscientists have traced neuronal pathways from sense receptors to various loci in the brain, located the areas where emotions are generated, shown that memories are stored in distributed network fashion, and in general extended their research deep into the territory of cognitive psychology, amassing a great deal of information about what areas of the brain are active in mental imagery, attention, speech, learning, voluntary and involuntary action, and other areas of classic psychological interest.

All of which is impressive and almost certainly will be the foundation on which, some day, a fuller explanation of how the brain becomes mind may be built. But not yet. The authors of one impressive tome of neuroscience write, “In this book, we explore how the brain actually does enable mind”[114]—but by “enable” they seem to mean something less than explaining how synaptic events become mental events. I asked Martha Farah, director of the Center for Cognitive Neuroscience at the University of Pennsylvania, whether the problem were not akin to that of trying to account for the movement of a wave in terms of the movements of individual molecules of water; she laughed and said, “Fluid mechanics is independent of molecular physics. But cognition may not be describable without some details of neuronal function being in the picture.”

Thus, a mental process as simple as a word retrieved from memory cannot be equated with the firing of millions of neurons and the resultant billions of synaptic transmissions but is the product of the pattern or structure of those firings and transmissions. Mental phenomena such as speech, memory retrieval, and reasoning are governed not by the laws of neural activity but by those of cognitive psychology. The former logo of the journal Cognition is a striking example of this distinction:

FIGURE 45
Levels of reality: molecules, letters, words, impossible objects

The design is made up of molecules of ink on paper, a reality that has nothing to do with its meaning. At a higher level of organization, the molecules make up letters, which individually are symbols without meaning but as here organized make up the word “cognition.” But we are not done. The design, though it looks real and three-dimensional, is an object that cannot exist in the real world; the paradoxical illusion is a mental epiphenomenon. Explain that, if you can, in terms of molecules of ink, letters, or bursts of energy in the neurons of the visual cortex.

Whether or not there is ever a full and satisfying explanation of mental events in neural terms, the revolutions of cognitive psychology and cognitive neuroscience have been successful side by side, overlapping, and in concert with each other.

As for the title of this section, “And the Winner Is—,” we now seem to be at the top of the tenth (or eleventh? or twelfth?)—and still tied.

*
A bit, in information theory, is the smallest unit of information: it is equal to a simple yes or no. A digit or a letter of the alphabet is equal to several bits.
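A quick back-of-the-envelope check of that claim (my arithmetic, not the author's, using the standard rule that an equally likely choice among N alternatives carries log2 N bits):

    import math

    # One yes/no answer resolves a choice between 2 alternatives: 1 bit.
    print(math.log2(2))    # 1.0
    # A decimal digit resolves a choice among 10 alternatives: about 3.3 bits.
    print(math.log2(10))   # 3.3219...
    # A letter of the alphabet resolves a choice among 26: about 4.7 bits.
    print(math.log2(26))   # 4.7004...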

*
The comment, made in 2002, still holds true.

*
Lacking the toy, you can play the game with three or more coins of different sizes. Draw three squares on a sheet of paper, pile the coins in one of them, decide which square the pile is to end up in, and start. The three-coin game is easy, the four-coin game not so easy, and the five-coin game quite hard.
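For readers who would rather simulate the game than shuffle coins, here is a minimal Python sketch. It assumes the toy referred to is the classic Tower of Hanoi—move one piece at a time, never place a larger piece on a smaller one—which is how the coin version above is played.

    def hanoi(n, source, target, spare):
        """Move a stack of n pieces from source to target, one piece at a
        time, never placing a larger piece on a smaller one."""
        if n == 0:
            return
        hanoi(n - 1, source, spare, target)        # clear the way
        print(f"move piece {n}: {source} -> {target}")
        hanoi(n - 1, spare, target, source)        # restack on top

    hanoi(3, "square A", "square C", "square B")   # the three-coin game: 7 moves

An n-piece game always takes 2 to the nth power minus 1 moves (7 for three coins, 31 for five), which is why the difficulty climbs so quickly.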

*
The answer:
PROBLEMS

*
The only correct deduction is that some of the chess players are not archaeologists.

*
Storage of information in the brain has long been thought to be the result of some kind of unexplained strengthening of the synapses involved in any learning experience. Recent neurophysiological research, too arcane to be fully spelled out here, has established that in any form of learning, a series of at least 15 steps involving 100 or more different molecules takes place, switching on certain genes. These make the post-synaptic neuron more easily activated by the presynaptic neuron’s release of various neurotransmitters (Marcus, 2004:100). In addition, the process induces the growth of additional synaptic connections on the presynaptic side (Kandel, 2006, chaps. 14, 17, 19). These changes, in effect, record information, although any elementary item in memory—a shape, say, or a sound—may require that a vast number of strengthened synapses, linked in a network, fire together.


The solution continues as follows:

S must be either 8 or 9, depending on whether or not there is a carry into its column. Substituting 1 for M, so that S + 1 (plus any carry) equals O plus 10, we see that O can be only 0 or 1. But M is 1, so O must be 0. And a carry into that column could arise only if E plus O (plus its own carry) reached 10, which, with O being 0, would force E to be 9 and make N equal 0—the same digit as O. So there is no carry, and S must be 9.

In the second column from the left, E + 0 would be E unless there is a carry; hence there must be a carry. So E + 1 = N.

If E is odd, N is even, and vice versa. If E is odd, it can only be 3, 5, or 7 (1 and 9 are already assigned). Try 3. And so on.
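The letters involved (S, E, N, D, M, O, R, Y) indicate that the puzzle is the classic SEND + MORE = MONEY. Assuming so, the little brute-force check below—my illustration, not part of the original solution—confirms where the deduction is heading: a single valid assignment.

    from itertools import permutations

    # Assumes the puzzle is SEND + MORE = MONEY, each letter a distinct digit
    # and no leading zeros; this simply confirms the unique answer the
    # deduction above is converging on.
    LETTERS = "SENDMORY"

    def value(word, assign):
        """Read a word as a number under a letter-to-digit assignment."""
        return int("".join(str(assign[ch]) for ch in word))

    for digits in permutations(range(10), len(LETTERS)):
        assign = dict(zip(LETTERS, digits))
        if assign["S"] == 0 or assign["M"] == 0:
            continue
        if value("SEND", assign) + value("MORE", assign) == value("MONEY", assign):
            print(value("SEND", assign), "+", value("MORE", assign),
                  "=", value("MONEY", assign))
            # 9567 + 1085 = 10652, i.e. S=9 E=5 N=6 D=7 M=1 O=0 R=8 Y=2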

SEVENTEEN
The Psychotherapists

Growth Industry

Let us indulge in a bit of fantasy. Wilhelm Wundt, invisible except to us, returns from Somewhere to see what has become of the science he launched more than a century ago.

Stern and formal in his black lecture gown, the shade of the Herr Professor stares uncomprehendingly as some of his intellectual descendants, at a cognitive science conference, discuss the molecular basis of memory in the giant sea snail, while others speak of a computer program that simulates parallel distributed processing. Elsewhere, however, he permits himself an uncharacteristic beaming smile when he hears that six decades ago there were only about 4,000 psychologists in America but today there are some 180,000 (about half at the doctorate level, half at the master’s level), a nearly forty-five-fold growth.[1]

When, however, the vaporous Dr. Wundt drifts into the offices of the American Psychological Association, his smile turns to a dark scowl. For here he learns that during the past several decades most new Ph.D.’s in psychology have become not researchers but industrial, educational, and—by far the largest number—clinical and counseling psychologists.[2] Wundt had adamantly opposed educational psychology and similar practical applications of the science, but this—listening and talking to people about their personal problems—is the worst, a detestable degradation of psychology. And he is horrified when he hears that most Americans, these days, think of a psychologist as someone who treats patients with mental health problems.[3]
Schrecklich!
