The Watchers

Author: Shane Harris

The analysts were the library patrons. And when they needed something from the shelves—an intercept, a recording—they put a request into their computer terminals. Then, an unseen team of electronic librarians took over. They disappeared into the shelves to find the requested file. Then they presented it to the analysts.
These librarians didn't work quickly. They followed a rigid, inflexible approach each and every time they fetched a book. They walked back to the stacks, found the book in question, pulled it off the shelf, put it on a cart, logged it out, handed it to the patron, and then waited for the return, at which point they put the book back on the cart, walked it back to the shelf, and put it in the proper place.
This wasn't an abstract process. It was physical. This was how computers retrieved information from databases. Information was stored on a physical medium—a disk. That disk was no different from the bookshelves. Each piece of information resided in a specific location on the disk. And when the computer wanted to find it, the disk had to spin to just the right point, and another mechanism had to retrieve it so the user could see it. The process was essentially no different for an NSA analyst than for an ordinary home computer user calling up a document from his word processor. That document was stored on a hard drive, which was just another disk. Behind his simple point and click, a set of electrical and mechanical steps played out. People saw something similar when they played a jukebox. They fed the machine a dime and selected a song, then watched the gears spin as a metal arm dropped down, pulled the right record out of a rotating stack, grabbed onto it, and then set the record on the turntable.
The computers at the NSA—the giant library, the big jukebox—spent a lot of time and energy fetching and reshelving information. It could take several minutes, if not hours, for analysts to call up all the data they needed. This was where the real-time aspect of terrorist hunting broke down. This was the bottleneck.
But there was another way to store and retrieve information, and it was also well known to anyone who'd ever used a computer. A word processor, an e-mail in-box, a Web browser, any application used to perform work, was made possible by something called random access memory. RAM was a storage system too, but it was nothing like a library. It didn't rely on the physical sequence of moving parts to deal with data.
RAM, as its name implied, used a “random” structure to organize information. Once something was put into RAM, it was simply there, available for the taking. It was as if all the books in the library had been put in one big room instead of being dispersed to far-flung stacks. And unlike the disk-based librarians, with their inviolate sequences, the librarians in RAM were all on speed, zipping around and grabbing books in an instant compared with their slower counterparts. If disk world was like a jukebox, RAM was an iPod.
But RAM had its problems. For starters, it was highly “unstable,” from an engineering perspective. RAM could not save information without a constant source of electricity. Turn off the computer, or lose power, and anything stored in RAM was lost. This was why home computers and databases alike used disk-based storage; it retained information regardless of whether the machine was on or not. RAM was also more expensive than disk memory. Its market price could fluctuate wildly, depending on global demand. That was one reason why computers mostly used RAM to run applications—word processors, Web browsers, or the tools that intelligence analysts used to work on stored information. Its random structure gave these memory-intensive programs the running room they needed to operate smoothly, without having to rely on those pesky, slow librarians. RAM was not a storage mechanism.
But the NSA thought it could be. From the agency's perspective, RAM's instability and cost were surmountable obstacles. Money was no barrier for an agency with a multibillion-dollar annual budget. And as for the power supply, the 350-acre Fort Meade campus was the single largest customer of Baltimore Gas & Electric, consuming the same amount of electricity as the city of Annapolis. Back in the late 1990s officials had started to worry about whether the power would run out. But in the heat of the global terrorist hunt, the issue slipped down the priority list. As far as the agency was concerned, electricity constraints were not going to stop technological progress.
And so, beginning in 2004, the NSA began a shift toward “in-memory” databases that were built entirely with RAM. The agency would place oceans of telecom data into these new systems and hope at last to have their real-time terrorist tracker. It was an unprecedented move for such a large organization. It was extravagantly expensive. And the agency protected its new approach like a national secret.
The NSA had found its breakthrough.
 
Though they were expensive and unstable, in-memory databases had one undeniable advantage over their disk-based cousins—speed. And that was just what the NSA wanted.
In-memory databases were experimental, contemplated mostly within the cloisters of computer geekdom. But early on, engineers could see their promise. In 2001 a group of database builders in Washington State decided to test the speed of a disk-based database and one built entirely in memory. Each system was told to retrieve and store thirty thousand individual records, a straightforward and simple computing task. It was a mere sliver of the NSA's workload, but the test yielded staggering results.
While the traditional machine took sixteen seconds to retrieve, or “read,” the records, the in-memory version did it in one second. But more stunning, it took the traditional computer almost one hour to store, or “write,” the records onto its disks. The in-memory machine stored the information in 2.5 seconds.
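The text doesn't name the systems used in that 2001 test, but the shape of the comparison is easy to reproduce today. As an illustration only, the sketch below runs the same simple workload—write a batch of records, then read them all back—against a disk-backed SQLite database and one held entirely in RAM. The record count and absolute timings are placeholders; they will vary by machine, and without forcing each write to be flushed to the platter individually, a modern disk run will look far better than the hour-long figure from the original test.

```python
import os
import sqlite3
import tempfile
import time

def run_workload(conn, n=5000):
    """Write n records, then read them all back.

    Returns (rows_read, write_seconds, read_seconds).
    """
    conn.execute("CREATE TABLE records (id INTEGER PRIMARY KEY, payload TEXT)")
    t0 = time.perf_counter()
    conn.executemany(
        "INSERT INTO records VALUES (?, ?)",
        ((i, f"record-{i}") for i in range(n)),
    )
    conn.commit()
    write_s = time.perf_counter() - t0

    t0 = time.perf_counter()
    rows = conn.execute("SELECT id, payload FROM records").fetchall()
    read_s = time.perf_counter() - t0
    return len(rows), write_s, read_s

# Disk-backed database: survives a power loss, but every write must
# eventually reach a physical medium.
tmp = tempfile.NamedTemporaryFile(suffix=".db", delete=False)
tmp.close()
disk = sqlite3.connect(tmp.name)

# In-memory database: everything lives in RAM and vanishes when the
# connection closes -- exactly the instability the chapter describes.
mem = sqlite3.connect(":memory:")

n_disk, disk_w, disk_r = run_workload(disk)
n_mem, mem_w, mem_r = run_workload(mem)
print(f"disk: wrote {n_disk} rows in {disk_w:.4f}s, read in {disk_r:.4f}s")
print(f"ram:  wrote {n_mem} rows in {mem_w:.4f}s, read in {mem_r:.4f}s")

disk.close()
mem.close()
os.unlink(tmp.name)
```

The workload is identical in both cases; only the storage medium behind the connection changes, which is the whole point of the comparison.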
In-memory databases were the NSA's best shot at real-time analysis. So how to build the system? Simple enough. Just construct a computer with lots and lots and lots of RAM. Or harness together many computers with the same attributes.
Simple in theory. Even the most opaque computer engineers didn't mince words about what this amalgamation of hardware would look like. Huge. A supercomputer in Maryland, which comprised more than a thousand linked machines and was used by the National Weather Service to predict hurricane paths, took up seven thousand square feet of floor space. And that machine wasn't working in memory. Who could say how much real estate this NSA supereye would need?
Few agencies had the tens or hundreds of millions of dollars required to install giant computer farms in their basement, much less pay the power bill for cooling them. (From its previous experiences with supercomputers and large databases, the NSA engineers knew that a horde of machines in one room generated extraordinary amounts of heat. The agency had to design specially cooled rooms to keep the machines from melting down.)
But the in-memory system had another flaw. One that the BAG and all other terrorist-hunting devices shared.
It lacked what data engineers called a logic layer, a kind of vocabulary that told a computer what the cacophony of phone records and e-mails, words and numbers running through its brain actually meant, and more important, what they meant in relation to one another.
In the human world objects had names, and names had meaning. There was something called a plate. It sat on a table, and a person ate food off it. One could teach a computer to recognize “plate.” It was flat, often white, usually round. Its edges were slightly curved. But how did a computer know that “plate” had a relationship with something called “silverware” that was actually a set of dissimilar-looking objects that for some reason seemed to pop up next to plate all the time, and always in a group? And what was this thing that looked like “plate” but was called “platter”?
Humans understood plates and silverware perfectly well, what they were used for and how they worked together. They knew why a knife wasn't a spoon, and when a plate was actually a platter. And they understood why those distinctions mattered. But a computer had to be taught all of this. It could not learn on its own. A machine had no experience, no residual memories. So the NSA would have to create them.
Computers needed this human logic layer. Without it the NSA could never achieve the kinds of early-warning insights Hayden had dreamed of, or Poindexter for that matter. The switch to in-memory computing was a legitimate breakthrough. But on its own it could not produce better analysis. The NSA might be able to swallow the ocean. But what good was that if it could never digest it?
Consumer marketers had been grappling with this problem for a generation. In their trade there was an old story, probably apocryphal, that seemed to illustrate the holy grail of insight that the NSA was reaching for. Clerks in a convenience store, the story went, noticed that men often bought beer at the same time they bought diapers. And the clerks noticed that they usually came in at night, alone. They started to wonder whether the men had been sent out by their wives for an emergency diaper run and decided they might as well pick up a six-pack for their trouble.
The store manager went through his receipts and confirmed that the clerks' observations were correct. Sales of beer and diapers were higher later at night, when men did the shopping. After dinner or later in the evening, sales of both products rose.
The store manager had just discovered a logic layer, the connection between two distinct objects. He started stocking diapers next to the beer. Sales of both skyrocketed.
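What the clerks and the manager did by hand—notice that two items keep showing up together, then check the receipts—is, in data-mining terms, finding an association rule from co-occurrence counts. The sketch below is a toy version under invented data: the transactions are made up, and the "support" and "confidence" measures are standard market-basket vocabulary, not anything attributed to the store in the story.

```python
from collections import Counter
from itertools import combinations

# Invented receipts, each a set of items bought together.
transactions = [
    {"beer", "diapers", "chips"},
    {"beer", "diapers"},
    {"milk", "bread"},
    {"beer", "diapers", "milk"},
    {"bread", "eggs"},
    {"beer", "chips"},
]

n = len(transactions)
item_count = Counter()   # how many baskets contain each item
pair_count = Counter()   # how many baskets contain each pair of items
for basket in transactions:
    item_count.update(basket)
    pair_count.update(combinations(sorted(basket), 2))

# Support: fraction of all baskets containing both beer and diapers.
support = pair_count[("beer", "diapers")] / n

# Confidence of the rule "diapers -> beer": of the baskets containing
# diapers, what fraction also contain beer?
confidence = pair_count[("beer", "diapers")] / item_count["diapers"]

print(f"support(beer, diapers) = {support:.2f}")
print(f"confidence(diapers -> beer) = {confidence:.2f}")
```

In this toy data every diaper purchase comes with beer, so the rule's confidence is 1.0—the numeric form of the manager's discovery. A logic layer of the kind the chapter describes is, at bottom, a large collection of such learned relationships between objects.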
At the most basic level, this was the NSA's quest. This was the end state of total information awareness. A set of rules, a pattern, that defined human behavior.
The richer the logic layer, the more patterns it could detect. And the more patterns, the more relationships. Did a man arrested for cocaine possession in Los Angeles have any connection to a suspected terrorist recently stopped at the Mexico border crossing? Did a man buying five hundred pounds of fertilizer need it for his gardening business or to build a bomb?
As the NSA forged ahead with in-memory databases its counterterrorism experts, as well as those in other agencies across the government, set out to try to answer these questions. They chased the same elusive dream as Poindexter's red team.
They had a slim chance of success. How could one account for the variances of human behavior? People were logical creatures—most of the time. But they often behaved illogically, and in ways that confounded explanation. Was there really a model for terrorism like there was for a hurricane, or a cold front, or the sales of beer and diapers? Detecting terrorism wasn't purely science. It was also an art.
Poindexter knew that. So did his critics. They vilified him for asking whether such a system could work. But at the NSA, Hayden and others were listening. And they quietly followed suit. They picked up where Poindexter had left off.
The result was chilling. Even without a logic layer, NSA's technological breakthrough meant the agency could see an entire network, and everything moving on it, in real time. They were one step closer to total information awareness.
CHAPTER 24
EXPOSED
 
In May 2004, Fran Townsend celebrated her first anniversary at the White House. She'd been the president's point person on terrorism, but always as a deputy to Condoleezza Rice, whom the president cherished not only as his national security adviser but as a personal friend. That immovable layer had separated Townsend from the commander in chief. But she was about to move up. That spring Bush tapped her as his assistant for counterterrorism and homeland security. She reported directly to him now.
Not long after Townsend moved into her new West Wing office, an NSA employee came to see her, someone she knew from her days working surveillance warrants at Justice. But this wasn't a social call. It was time to clear Townsend into the program.
Up to now she had only a notion that the NSA was working outside its customary boundaries. The first clue came from her old friend Jim Comey, the deputy attorney general. He approached Townsend at the White House during the crisis over the spying program's legality. Before a meeting in the Oval Office, which was also attended by Mike Hayden, Comey asked Townsend, who was still a deputy, if she had ever heard of the code name Stellar Wind.
Townsend said no, she hadn't, which Comey found deeply unsettling. The White House had, indeed, kept the circle tight, so tight that the president's terrorism adviser sat outside of it. Townsend had never seen her friend so ashen, so worried. She knew that Comey met with the president, but she didn't know what they discussed.
It became much clearer after the NSA employee brought Townsend into that tiny circle. After she'd been read in, Townsend had only one question about the program: “Has the Department of Justice said it's legal?”
Yes, the NSA employee replied.
That was good enough for her.
One of Townsend's new duties was playing intelligence traffic cop. She had to make sure that the information the NSA collected from the surveillance program made its way to the FBI. If the program was legal, then Townsend was less concerned about the details of what the NSA gathered, or how, than about what the agency did with that information. Were the leads getting passed on to the appropriate domestic law enforcement agency for follow-up?
Townsend also understood how important the program was to the intelligence community. That fact would be driven home repeatedly over the course of the next year, and particularly when the new head of the NSA, Lieutenant General Keith Alexander, started showing up at the White House for daily briefings on what the agency was learning about terror networks.
 
Alexander took over at the agency in August 2005. His experience overseeing the Information Dominance Center and the intelligence command at Fort Belvoir made him a natural choice for the signals job. Alexander had deep technical expertise and long-standing contacts within the community. (He'd also attended those Genoa demos put on by Poindexter years earlier.)
