According to a New York Times article, the NSA will not be able to determine what Mr. Snowden took from its various databases.
“Investigators remain in the dark about the extent of the data breach partly because the N.S.A. facility in Hawaii where Mr. Snowden worked — unlike other N.S.A. facilities — was not equipped with up-to-date software that allows the spy agency to monitor which corners of its vast computer landscape its employees are navigating at any given time.”
The ability to determine what data was accessed and removed during a breach is critical to understanding your liability, communicating to your stakeholders with confidence, dealing with media and customer inquiries, and so on. There is a lot of technology available to assist with monitoring. Most operating systems have a fair number of logging options, but they are turned off by default and often left off because of short-sighted concerns about the cost of storage.
Snowden is probably a relatively smart fellow, but I don’t think he’s as much of an expert as the government makes him out to be.
“That Mr. Snowden was so expertly able to exploit blind spots in the systems of America’s most secretive spy agency illustrates how far computer security still lagged years after President Obama ordered standards tightened after the WikiLeaks revelations of 2010.”
The key phrasing in the above quote is: “. . . how far computer security still lagged . . .”
The government is large and quite complex. Choosing, testing, and rolling out software is no small task. It’s just as unacceptable to “break” the NSA by rolling out new software as it is for the NSA to allow unfettered access to everything. But PFC Manning was discovered in November 2010, and Snowden started downloading secret information no later than November 2012 (very possibly earlier). That is a two-year window in which improvements could have been made, particularly at one of the highest-risk agencies in the world: the NSA.