signal would be intercepted by ground [antennas], by the aircraft network and by the space network. If you’re smart enough to combine all that data in real time, you can determine where Dick is out there. He’s in block 23 down there, and he just said he’s going to place a bomb. . . . The information from those three devices come[s] into a location where somebody can actually say action is needed, and the tank or the truck or the warfighters [are] right here in this location. He’s a colonel, and he can say, “We have verification that this bad guy is in this location: Go and get him.”
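The fusion step the speaker describes, merging independent fixes from ground, airborne, and space-based collection into a single location, can be illustrated with a short sketch. The example below is not the RTRG’s actual method; the coordinates and uncertainties are invented, and it shows only one standard way to combine three noisy position estimates by weighting each according to its precision.

```python
# Illustrative only: fuse three noisy position fixes (say, from ground,
# airborne, and satellite collection) by inverse-variance weighting.
# All coordinates and uncertainties below are made up.

def fuse(fixes):
    """Each fix is (lat, lon, sigma): a position estimate and its uncertainty in km."""
    weights = [1.0 / (sigma ** 2) for _, _, sigma in fixes]
    total = sum(weights)
    lat = sum(w * f[0] for w, f in zip(weights, fixes)) / total
    lon = sum(w * f[1] for w, f in zip(weights, fixes)) / total
    return lat, lon

ground = (33.3150, 44.3660, 0.5)     # ground antenna fix, fairly precise
aircraft = (33.3180, 44.3710, 1.0)   # airborne fix, a bit noisier
satellite = (33.3100, 44.3600, 2.0)  # space-based fix, noisiest

print(fuse([ground, aircraft, satellite]))
```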
The RTRG was unique for the way it brought together not only intelligence but people: the top levels of the military brass and the intelligence community, the brightest minds from across government, and the expertise of private industry. It was a rare example of successful collaboration within the byzantine federal bureaucracy.
The NSA got so good at managing big data (huge data, really) by abandoning its traditional approaches. Rather than trying to store all the information in the RTRG in central databases and analyze it with supercomputers, the agency tapped into the emerging power of distributed computing. Silicon Valley entrepreneurs had developed software that broke big data sets into smaller, manageable pieces and farmed each one out to a separate computer. Now the burden of analyzing huge data sets didn’t rest on one machine. Working together, the computers could accomplish tasks faster and cheaper than if one central machine took on the workload. This revolution in data management is what allowed Facebook, Twitter, and Google to manage their own data stores, which were growing exponentially by the late 2000s. The NSA used the same distributed computing technology for the RTRG. The system was like Google not only on the front end but on the back end as well. In fact, the NSA later developed its own distributed computing software, called Accumulo, based on technology from Google.
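To make the split-and-combine idea concrete, here is a minimal sketch in Python. This is not the agency’s code or Accumulo; the records and the term-counting task are hypothetical, and the standard multiprocessing module stands in for a real cluster. The point is only that each worker tallies its own slice of the data, and the partial results are cheap to merge.

```python
from collections import Counter
from multiprocessing import Pool

# Hypothetical records standing in for a large message archive.
RECORDS = [
    "call from 555-0101 to 555-0102",
    "email from alice to bob",
    "call from 555-0101 to 555-0103",
    "email from bob to carol",
] * 1000  # pretend this is far too big for one machine to handle comfortably

def count_terms(chunk):
    """Map step: each worker tallies terms in its own slice of the data."""
    counts = Counter()
    for record in chunk:
        counts.update(record.split())
    return counts

def split(data, n):
    """Break the data set into n roughly equal pieces."""
    size = (len(data) + n - 1) // n
    return [data[i:i + size] for i in range(0, len(data), size)]

if __name__ == "__main__":
    workers = 4
    with Pool(workers) as pool:
        partials = pool.map(count_terms, split(RECORDS, workers))
    # Reduce step: merge the partial tallies into one result.
    total = sum(partials, Counter())
    print(total.most_common(5))
```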
But the collection of huge amounts of electronic data by the NSA had proven controversial before. In the spring of 2004 the Justice Department’s Office of Legal Counsel reviewed the program and found that one method of collection in particular was illegal under current law. It had to do with the bulk collection of so-called Internet metadata, including information about the senders and recipients of e-mails. The NSA thought that since President Bush’s order allowed it to search for keywords and other selectors in Internet metadata, it also implicitly authorized the bulk collection of that data. In the view of the agency’s lawyers and its director, Michael Hayden, no one had “acquired” the information until it was actually looked at. A computer gathering up the data and storing it didn’t count as acquisition under the law, and it certainly didn’t meet the agency’s definition of “spying.”
When the president went ahead and reauthorized the program over the Justice Department’s objections, senior officials in the department threatened to resign, including the head of the Office of Legal Counsel, Jack Goldsmith; the director of the FBI, Robert Mueller; and the attorney general, John Ashcroft, along with his deputy, Jim Comey, whom President Obama would later choose to replace Mueller as head of the FBI.
The threat of mass resignation was a unique moment in the history of the Bush presidency. Had they stepped down, their reasons would eventually have become known through press leaks and congressional inquiries. The American people would have discovered not only the existence of a domestic spying program but also that top law enforcement officials had resigned because they thought part of it was illegal.
But for all the high drama surrounding the Internet metadata collection program, it turned out to be only a momentary hiccup in the NSA’s insatiable