The Latest in IT Security

Edward Snowden used automated web search tools to collect NSA data

09
Feb
2014

It's tempting to imagine that Edward Snowden obtained NSA data through a daring Mission: Impossible-style raid, but it now appears that he didn't have to put in much effort. Intelligence officials speaking to the New York Times say that Snowden used a standard web crawler, a tool that typically indexes websites for search engines, to automatically collect the information he wanted. He only needed the right logins to bypass what internal defenses were in place.
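For readers unfamiliar with the technique, a crawler of the kind described is conceptually simple: starting from one page, it extracts every link, fetches each linked page, and repeats until nothing new is reachable. The sketch below is purely illustrative (it is not Snowden's actual tool, whose details are unknown); it uses only the Python standard library, and the in-memory `site` dictionary is a hypothetical stand-in for an internal document store.

```python
from collections import deque
from html.parser import HTMLParser


class LinkExtractor(HTMLParser):
    """Collects href targets from <a> tags in an HTML document."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(fetch, start):
    """Breadth-first crawl: fetch(url) returns HTML (or None).

    Every reachable page is fetched exactly once; discovered pages
    are returned as a {url: html} mapping.
    """
    seen, queue, pages = {start}, deque([start]), {}
    while queue:
        url = queue.popleft()
        html = fetch(url)
        if html is None:
            continue
        pages[url] = html
        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return pages


# Hypothetical in-memory "site" standing in for an internal network share.
site = {
    "/index": '<a href="/a">A</a> <a href="/b">B</a>',
    "/a": '<a href="/b">B</a>',
    "/b": "no links here",
}
collected = crawl(site.get, "/index")
print(sorted(collected))  # prints ['/a', '/b', '/index']
```

The key point the officials made is that nothing here is exotic: pointed at an internal network with valid credentials, a loop this simple can sweep up everything the credentials can reach.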
