Edward Snowden used automated web search tools to collect NSA data


It’s tempting to imagine that Edward Snowden obtained NSA data through a daring Mission Impossible-style raid, but it now appears that he didn’t have to put in much effort.

Intelligence officials speaking to the New York Times say that Snowden used a standard web crawler, the kind of tool that typically indexes websites for search engines, to automatically collect the data he wanted. All he needed were the right logins to get past what internal defenses were in place.
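The report doesn't describe the actual software Snowden ran, so purely as an illustration of what a basic crawler does, here is a minimal sketch: it fetches a page with an authenticated session, saves the content, follows every link it finds on the same host, and repeats. The URL, the placeholder credentials, and the choice of the requests and BeautifulSoup libraries are assumptions for the example, not details from the article.

```python
import urllib.parse
from collections import deque

import requests
from bs4 import BeautifulSoup


def crawl(start_url, session, max_pages=100):
    """Breadth-first crawl: fetch a page, store it, queue every same-host link."""
    seen = {start_url}
    queue = deque([start_url])
    pages = {}
    host = urllib.parse.urlparse(start_url).netloc

    while queue and len(pages) < max_pages:
        url = queue.popleft()
        resp = session.get(url, timeout=10)
        if resp.status_code != 200:
            continue
        pages[url] = resp.text  # keep a copy of everything the crawler can reach

        # Extract links from the page and stay on the same host.
        soup = BeautifulSoup(resp.text, "html.parser")
        for anchor in soup.find_all("a", href=True):
            link = urllib.parse.urljoin(url, anchor["href"])
            if urllib.parse.urlparse(link).netloc == host and link not in seen:
                seen.add(link)
                queue.append(link)

    return pages


# An authenticated session stands in for "the right logins" mentioned above.
session = requests.Session()
session.auth = ("username", "password")  # placeholder credentials, not real ones
collected = crawl("https://intranet.example.internal/", session)
print(f"Collected {len(collected)} pages")
```

The point of the sketch is how little is involved: once the session is logged in, nothing in the loop distinguishes an index-building crawler from a bulk-collection one.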

Since the NSA wasn’t walling off content to prevent theft by insiders, the crawler could collect seemingly anything — and Snowden’s Hawaii bureau didn’t have activity monitors that would have caught his bot in the act.

Whether or not you believe the NSA’s intelligence gathering policies justified a leak, it’s clear that the agency was partly to blame for its own misfortune.

 

Source: Engadget

Author: Daily Tech Whip

