He further added that the collection process was ‘quite automated’ and the software did the ‘search, index, and backup’ of files while Snowden went about his daily routine.
- “We do not believe this was an individual sitting at a machine and downloading this much material in sequence,” the official said.
He only needed the right logins to bypass the internal defenses in place and collect the data automatically.
Snowden was able to access 1.7 million files, including internal NSA networking documents and internal ‘wiki’ materials, by configuring the web crawler with the appropriate search subjects and the depth to which links should be followed, the report said.
The web crawler Snowden used was similar to the one used by Google to access and download contents of billions of websites for faster search results.
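The report does not describe the crawler's internals, but depth-limited link following of the kind described above is typically a breadth-first traversal. Below is a minimal, hypothetical sketch in Python: instead of live HTTP requests, it walks an in-memory link graph (the `SITE` dictionary and page names are invented for illustration), visiting every page reachable within a configurable number of hops.

```python
from collections import deque

# Hypothetical in-memory "site": page name -> list of linked pages.
# A real crawler would fetch each page over HTTP and parse links out of the HTML.
SITE = {
    "home": ["docs", "wiki"],
    "docs": ["api", "wiki"],
    "wiki": ["archive"],
    "api": [],
    "archive": ["home"],
}

def crawl(start, max_depth):
    """Breadth-first crawl: visit every page reachable from `start`
    by following links at most `max_depth` hops deep."""
    seen = {start}
    queue = deque([(start, 0)])
    visited = []
    while queue:
        page, depth = queue.popleft()
        visited.append(page)          # the search/index/backup step would go here
        if depth == max_depth:
            continue                  # reached the depth limit; follow no further
        for link in SITE.get(page, []):
            if link not in seen:      # avoid revisiting pages (and link cycles)
                seen.add(link)
                queue.append((link, depth + 1))
    return visited
```

Raising `max_depth` widens the sweep: `crawl("home", 1)` reaches only the pages linked directly from the start page, while a larger limit pulls in everything reachable through intermediate links, which is why tuning the depth parameter controls how much material such a crawler collects.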
The attack was not especially sophisticated and could have been easily detected by the special monitors in place at NSA headquarters in Fort Meade, but the Hawaii outpost, where he was based as a technology contractor, had not been upgraded with the latest security measures.
That does not mean his activities went entirely unnoticed, but he was able to deflect suspicion tactfully. The report noted,
- In at least one instance when he was questioned, Mr. Snowden provided what were later described to investigators as legitimate-sounding explanations for his activities: As a systems administrator he was responsible for conducting routine network maintenance. That could include backing up the computer systems and moving information to local servers, investigators were told.
Currently, Snowden is a fugitive living in Moscow, Russia, where he has been granted political asylum. His leaks have brought to light several previously unreported operations by the NSA and its British counterpart, GCHQ, sparking public outrage and straining America's relations with its allies.