A quick explanation of GCHQ and its internet surveillance

Since the story of how government agencies have been tracking information through major technology companies came out, privacy has become a major issue on the web. While the entire world has been in a frenzy, the United States and the United Kingdom, the two countries most affected, have been especially concerned. Most worrying is the routine mining of data by agencies such as the UK's Government Communications Headquarters (GCHQ).

For those who are not from the UK, or simply don't know what it is, GCHQ is the British government agency responsible for gathering intelligence for both the government and the military. All of this is done through communication and information networks, the biggest being the Internet itself.

Working in association with the United States' National Security Agency (NSA), GCHQ uses what is called an internet buffer to filter through mass amounts of information. A large share of communications is intercepted this way, most of it simply passed along. A small portion of the collected data is kept, however, and stored for between three and thirty days depending on the nature of the content. Metadata, the information surrounding content such as file names and sizes, is stored the longest.
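To make that tiered retention idea concrete, here is a minimal sketch in Python of how a buffer with different retention windows for content and metadata might look. The class and function names, the `is_metadata` flag, and the exact windows (pegged to the three- and thirty-day figures in the reporting) are all illustrative assumptions, not a description of GCHQ's actual systems.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Illustrative retention windows based on the reported figures:
# intercepted content is held only briefly, metadata much longer.
CONTENT_RETENTION = timedelta(days=3)
METADATA_RETENTION = timedelta(days=30)

@dataclass
class InterceptedRecord:
    captured_at: datetime
    is_metadata: bool  # True for data *about* a communication (names, sizes, timestamps)

    def expires_at(self) -> datetime:
        """Metadata is retained far longer than full content."""
        window = METADATA_RETENTION if self.is_metadata else CONTENT_RETENTION
        return self.captured_at + window

def purge_expired(buffer: list[InterceptedRecord], now: datetime) -> list[InterceptedRecord]:
    """Keep only records whose retention window has not yet elapsed."""
    return [record for record in buffer if record.expires_at() > now]
```

The point of the sketch is simply that retention is decided per record by the kind of data it holds, which is why metadata, being cheap to store and easy to search, can outlive the content it describes.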

This is a massive undertaking, and it shows the reach GCHQ has managed to achieve. Its filtering software is already attached to 200 fiber optic cables, with plans to expand to 400 in the future. That means the agency already has access to an enormous amount of information, filtering through a large chunk of the country's daily internet traffic. Doubling the number of cables would bump this up to a quarter of all traffic; if coverage scales with the number of cables, that puts the current share at roughly an eighth.

But why is this so worrying? After all, they have to have a good reason to actually look at the data they collect, don't they? In theory, yes: according to reports, analysts at both GCHQ and the NSA are required to justify their reasoning each time they view or store either content or metadata.

What is frightening is that no actual warrant is needed for these searches. That is what much of the controversy over surveillance has been about, after it became known that agents need only be "51% sure that the source of the data is foreign" to treat it as a security issue. Presumably the same standard is being applied here, and that is definitely not alright.

Source: The Guardian
