The Idea Factory: Bell Labs and the Great Age of American Innovation
by Jon Gertner
His group was not lacking in brainpower. Baker pulled in John Pierce and John Tukey from Bell Labs—“the country’s keenest thinkers,” both of whom now had top secret clearances—along with several other scientists, including the future Nobel physicist Luis Alvarez.[34]
The goal, as stated in Baker’s 1957 description of his committee, was “to search for new concepts of interconversion of information and intelligence.” In other words, his group would consider all the ways that technology now allowed information to be hidden and transmitted—through encoded signals, and even through chemical and biological patterns—and then recommend how America’s intelligence agencies might respond. More specifically, the Baker Committee sought insights into how the United States might develop the ability to crack any imaginable Soviet code. “Our history sustains the belief that for both the national security and the universal freedom of man,” Baker wrote, in a passage that held its own with the most artful cold war rhetoric, “the applications of all science to foreign deciphering political and military communications (basically revealing attitudes of nations toward each other) is a suitable and worthy intent.”[35]
Much of the Baker Committee’s work was done at Arlington Hall Station, an Arlington, Virginia, facility used by the NSA, in the spring of 1957. In mid-July, the committee delivered what came to be known as the Baker Report, still mostly classified five decades later, which urged the government to move strenuously toward using solid-state computers and digital technologies for cryptanalysis. In his history of the NSA, James Bamford described the Baker Report as recommending “a Manhattan Project–like effort to push the USA well ahead of the Soviet Union and all other nations” through the application of information-age tools. Bamford also noted that one of the committee’s enduring legacies was its recommendation that the U.S. intelligence networks establish “a close yet secret alliance with America’s academic and industrial communities.”[36]
Few outsiders were more closely, or more secretly, allied than Baker. In a couple of years’ time, John Pierce would come to despise Washington and avoid government work entirely. Claude Shannon, who was also involved with the NSA and CIA on several projects, would drift off to more eccentric pursuits at home in Massachusetts. Baker remained a consultant to the NSA for more than a decade, and began to branch out in Washington. Beginning with Eisenhower, and continuing through the Kennedy, Johnson, and Nixon administrations, and later again under Ford and Reagan—always leaning toward the G.O.P., a reflection of the fact that he was, deep down, a conservative, “the éminence grise of Republican science,” as the British magazine New Scientist described him[37]—Baker served as a member of the President’s Foreign Intelligence Advisory Board (PFIAB), a group that examined the operations of the CIA and other intelligence agencies. As part of his PFIAB work, he helped found in 1960 the National Reconnaissance Office, a government organization charged with planning, building, launching, and maintaining America’s spy satellites. The NRO remained secret for its entire first decade.
Baker shuttled between New Jersey and Washington, among his five offices—at Bell Labs’ Murray Hill and Holmdel locations, the White House, the Pentagon, and the State Department. His duties in Washington were often broader than his titles suggested. Soon after his inauguration, President Kennedy asked Baker to explain some of the protocols for ballistic missile attacks, a technology that relied on the early warning systems that Bell Labs had built for the military. The men chatted in the Oval Office. A crucial telephone with a direct line to the Defense Department happened to be missing from Kennedy’s desk. “Kennedy and [his science advisor Jerome] Wiesner and I got down on our hands and knees and we got under the desk and found somebody had put it in a drawer,” Baker later remarked. “And then we explained the whole technology.” Kennedy often invited Baker into his private quarters for discussions; he likewise called him at his Bell Labs office, at least once trying to find out if Jim Fisk might take a job in the administration.[38]
During the Cuban Missile Crisis—a crisis brought on by the interpretation of information, in this case aerial photographs—Baker also became a fixture in the cabinet room.[39]
In the early 1960s, Baker’s closest associate in Washington was Clark Clifford, an advisor to Kennedy and the chairman of the Foreign Intelligence Advisory Board. At the PFIAB, Baker and Edwin Land, the inventor of the Polaroid camera, drove the agenda. “These two men were our teachers,” Clifford would later recall, “turning all of us on the committee into missionaries for the view that the United States should vastly increase its commitment to the finest state-of-the-art technologies in the field of electronic, photographic, and satellite espionage.”[40]
Baker was the stealthiest of cold warriors, who preferred shadows to the open air. In 1961, just after Kennedy was inaugurated, Baker trudged through a snowstorm for an appointment with Vice President Lyndon Johnson, whom Kennedy had asked to offer Baker the job as the head of NASA. Baker diplomatically refused, and instead suggested that Johnson offer it to James Webb, which he eventually did. As Baker related the story, he told Johnson, “You’ve got to have somebody who can reflect the public, political and popular attitudes in this business much better than I.” Whether he believed this, or whether it was simply a politic excuse for maintaining his responsibilities at Bell Labs and his secret responsibilities in government, was a matter of debate. What was undeniable was that Baker wanted little to do with a highly visible public office and a large degree of public accountability. Over time, a half dozen U.S. presidents offered him the post of science advisor. He turned it down every time.
New titles might not have increased his influence. By the start of the 1960s Baker was engaged in a willfully obscure second career, much like the one Mervin Kelly had formerly conducted, a career that ran not sequentially like some men’s—a stint in government following a stint in business, or vice versa—but in parallel, so that Baker’s various jobs in Washington and his job at Bell Labs intersected in quiet and complex and multifarious ways. Baker could bring innovations in communications to the government’s attention almost instantly. “Many thanks for the very interesting and profitable tour you arranged for me at the laboratory,” Louis Tordella, the director of the National Security Agency, wrote Baker in 1959. “I particularly enjoyed the brief glimpses into the future shape of things to come. Even of more immediate value, however, were the talks I had, particularly with you and Dr. Pierce.”[41]
Because he understood the leading research on coding and transmitting information, Baker was also uniquely able to tell others how it might be intercepted. He therefore did more than connect government officials with technological hardware. He explained to Washington how information works, and how it flows.
Bill Baker’s view of the future was grand, and with good reason: In the late 1940s, as one of the Young Turks, he had observed firsthand the transistor’s invention and early development, and he had since come to understand its possibilities both for society at large and the military in particular. “So often,” says Ian Ross, who worked in Jack Morton’s department at Bell Labs doing transistor development in the 1950s, “the original concept of what an innovation will do”—the replacement of the vacuum tube, in this case—“frequently turns out not to be the major impact.”[1]
The transistor’s greatest value was not as a replacement for the old but as an exponent for the new—for computers, switches, and a host of novel electronic technologies.
Years before, when Claude Shannon had written of the transistor to his old schoolteacher, “I consider it very likely the most important invention of the last fifty years,” he’d seen this. It helped that Shannon had spent most of his career dwelling on the possibilities of digital information. The transistor was the ideal digital tool. With tiny bursts of electricity, it could be switched on or off—in essence, turned into a yes or no, or a 1 or 0—at speeds measured in billionths of a second. Thus in addition to being an amplifier, a clump of transistors could be linked together to enable a logical decision (and thereby process information). Or a clump could be linked together to help represent bits of information (and thereby remember information). To put hundreds, or thousands, or tens of thousands of the devices alongside one another (the notion that billions would one day fit together was still unimaginable) might allow for extraordinary possibilities. It was a “wondrous coincidence,” as Bill Baker described it, “that all of human knowledge and experience can be completely and accurately expressed in binary digital terms.”[2]
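The twin roles described above—transistors combined to process information and transistors combined to remember it—can be sketched in a few lines of code. This is an illustration of the general principle, not anything from the book: each function below stands in for a small cluster of transistor switches, with NAND gates (from which any logic function can be built) doing the processing, and a cross-coupled pair of NANDs forming a one-bit memory.

```python
def nand(a: int, b: int) -> int:
    """A transistor pair acting as a switch: output is 0 only if both inputs are 1."""
    return 0 if (a and b) else 1

def xor(a: int, b: int) -> int:
    """Processing: any logic function can be composed from NAND alone."""
    n = nand(a, b)
    return nand(nand(a, n), nand(b, n))

def sr_latch(set_: int, reset: int, q: int = 1) -> int:
    """Remembering: two cross-coupled NANDs hold one bit (active-low inputs).

    set_=0 stores a 1; reset=0 stores a 0; both 1 holds the previous bit q.
    """
    for _ in range(4):  # let the feedback loop settle
        q_bar = nand(reset, q)
        q = nand(set_, q_bar)
    return q
```

A clump of such gates wired side by side is, in miniature, the digital computer the passage anticipates: logic to decide, latches to remember, and everything—number, letter, or image—expressed in the 1s and 0s Baker called a “wondrous coincidence.”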
As usual, Shannon was ahead of his colleagues. But in only a few years, by the late 1950s, Baker, too, viewed the future of digital computing and that of human society as wholly interrelated.[3]
A STEADY STREAM of semiconductor inventions emerged from Bell Labs between 1950 and 1960. Some had arisen from Baker’s research department and others from the much larger development department. As a result, there were now a multitude of new transistor types and important new methods of manufacturing, such as the technique—resembling art etchings done on a minuscule scale—known as photolithography. In keeping with AT&T’s agreement with the federal government, the patents for these inventions and processes were licensed to a number of other companies, not only large industrial shops like General Electric and RCA, but also two fledgling semiconductor companies known as Texas Instruments and Fairchild Semiconductor. Fairchild had been established after its principal engineers, Robert Noyce and Gordon Moore, had fled Bill Shockley’s unhappy company along with several other colleagues to start a company of their own. In that “precompetitive era,” as Ian Ross terms it, information was freely exchanged. Ross and Bell Labs researchers such as Morry Tanenbaum, who had invented the silicon transistor, would often speak with the men at Fairchild in Palo Alto and Texas Instruments in Dallas; employees of those start-up companies would also swing by Bell Labs on their trips east to see what was new. Jack Kilby, a young engineer who eventually joined Texas Instruments, first made a pilgrimage to Bell Labs in 1952 to attend one of the earliest seminars on how to build transistors.[4]
Bell Labs was where most of the world’s knowledge about semiconductors resided. “Remember, we were the only game in town when it started,” Ian Ross says. “If you wanted to know about semiconductor devices, you went to Murray Hill and Building 2.”
The good thing about the transistor was that by the late 1950s it was becoming smaller and smaller as well as more and more reliable. The bad thing was that an electrical circuit containing thousands of tiny transistors, along with other elements such as resistors and capacitors, had to be interconnected with thousands of tiny wires. As Ian Ross describes it, “as you built more and more complicated devices, like switching systems, like computers, you got into millions of devices and millions of interconnections. So what should you do?” At Bell Labs, Jack Morton, the vice president of device development, had coined a name for the dilemma: “the tyranny of numbers.” Morton believed that one way to tackle the tyranny of numbers was simply to reduce the number of components (transistors, resistors, capacitors, and so forth) in a circuit. Fewer components meant fewer interconnections. One way to do this, Morton thought, was to harness the physical properties of special semiconductors so that they might be made to perform multiple electronic tasks—turning them into a kind of electronic Swiss Army knife. Therefore “a simple thing” within a circuit could replace multiple components. Morton called these “functional devices,” but they were proving exceedingly difficult to engineer. What’s more, Morton, fiercely intelligent and widely feared, was largely unwilling to hear criticism or change course. As Bill Baker wrote in a wry note to Jim Fisk, Jack was someone with “an intense distaste for anyone opposing him.”[5]
Jack Kilby at Texas Instruments and Robert Noyce at Fairchild had different, better ideas. Both men, nearly simultaneously, came up with the idea of constructing all of the components in a circuit out of silicon, so that a complete circuit could exist within one piece—one chip—of semiconductor material. By eliminating the tyranny of interconnections, the method seemed to suggest substantial advantages in manufacturing and operational speed. Their innovation could, in short, be better and cheaper. Kilby had the idea in the summer of 1958, probably a few months earlier than Noyce. But Noyce’s design was arguably more elegant and more useful.[6]
In the early days, the product that Kilby and Noyce designed was known as a “solid circuit.” In 1960—the year that Business Week crowned solid-state electronics as “the world’s fastest growing industry”—Kilby and Noyce’s invention seemed promising but still unproven. Not long after, the new idea became known as the integrated circuit.
To the engineers and scientists at the Labs, the integrated circuit was not a complete surprise. “We knew we could make multiple transistors within a piece of silicon, we knew we could make resistors, we knew we could make capacitors,” Ian Ross recalls. But it was the received wisdom under Jack Morton, Ross adds, that such devices could never be reliable. Even though the quality of manufactured transistors was improving, there was still a significant failure rate. And on a chip with hundreds or thousands of components? Some of those components would inevitably fail, thus rendering the entire device useless. Kilby and Noyce opted to believe, correctly, that the manufacturing challenges could be worked out later. Morry Tanenbaum happened to visit Noyce’s shop during a visit to California just as Noyce was developing the chip idea. “One of the things he showed me,” Tanenbaum recalls, “was one of the first integrated circuits. And I said, ‘Boy, that’s really neat.’ And when I went back I told some people about it. It certainly looked important to me, but I can’t say I had the imagination to see how far it would go.”