"Have nothing in your houses that you do not know to be useful or believe to be beautiful." - William Morris

Big Brother’s Data Acquisition Filters

Posted: June 19th, 2008 | Filed under: Privacy & Security

Cory Doctorow has another interesting piece in the Guardian today about how Britain is collecting too much surveillance data to be useful. His argument is that collecting information about everything and trying to make predictions from it, be it weather forecasts or terrorist events, will not succeed, mainly because of the computing power required to process such volumes of information.

Although I agree with his stance that there is too much surveillance in British society, something that could well be damaging our population as a whole by removing our need to take personal responsibility for our actions, I disagree with his overall argument. Yes, collecting the movement of every butterfly in the world to predict the weather would be ridiculous and impossible to deal with, but in the world of more directed, human activities, information can be much more revealing. Right now we may not know which behaviours indicate a potential attack, but over time, with the right data and good analysis, it is conceivable that these behaviours could be isolated, and hence our limited police resources deployed more effectively.

We’re talking about artificial augmentation of our senses here. Right now, watching thousands of CCTV cameras is largely ineffective at prevention, but automated matching technology could potentially draw attention to those who look suspicious, at a level a computer can recognise, leaving a human to do the final filtering. No team, however big, can watch every person in London, in the same way that no person could do what Google does every time we run a search.
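To make that triage idea concrete, here is a minimal sketch in Python of the kind of human-in-the-loop filtering described above. It assumes some automated matcher already assigns each captured event a ‘suspicion score’ (the genuinely hard computer-vision part, which is glossed over entirely); the class, field names, scores and threshold are all hypothetical.

```python
from dataclasses import dataclass
import random


@dataclass
class Observation:
    camera_id: str
    description: str
    suspicion_score: float  # assumed output of some automated matcher, 0.0 to 1.0


def triage(observations, threshold=0.8):
    """Automated first pass: keep only the events the matcher scores highly,
    so a human reviewer sees a handful of clips instead of thousands."""
    return [o for o in observations if o.suspicion_score >= threshold]


if __name__ == "__main__":
    random.seed(0)
    # Simulated feed: thousands of events, almost all routine, with random scores.
    feed = [
        Observation(
            camera_id=f"cam-{i % 50:02d}",
            description="unattended bag" if random.random() < 0.01 else "routine",
            suspicion_score=random.random(),
        )
        for i in range(10_000)
    ]
    for_review = triage(feed, threshold=0.95)
    print(f"{len(feed)} events captured, {len(for_review)} forwarded for human review")
```

The point is simply the shape of the pipeline: the machine discards the overwhelming majority of routine events, and scarce human attention is spent only on the handful that clear the bar.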

Augmentation and filtering based on our needs is a constantly expanding field as we move past the information age and into the ‘Knowledge Age’, whether for security or personal goals. Having a large pool of, ideally non-identifying, data from which to build these filters is fundamental to success. But as each level of filtering is deployed and improved, someone will game the system and find a way round it; for comparison, spam blogs didn’t exist a few years ago, and now they are everywhere. That said, should we be trying to create an all-seeing ‘Big Brother’ computer system that watches us at all times? That’s an ethical and moral question that is harder to answer.
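As a toy illustration of the ‘gaming the system’ point above, here is a sketch, in Python, of the sort of naive keyword filter that once caught spam, and how trivially it can be routed around. The blocklist and messages are invented purely for the example.

```python
# A deliberately naive keyword filter of the sort that caught early spam.
# The blocklist and the messages below are made up for illustration only.
BLOCKLIST = {"cheap pills", "free money"}


def is_spam(message: str) -> bool:
    text = message.lower()
    return any(phrase in text for phrase in BLOCKLIST)


print(is_spam("Minutes from Tuesday's meeting attached."))  # False
print(is_spam("FREE MONEY - click here!"))                  # True
print(is_spam("Fr3e m0ney - click here!"))                  # False: the filter has been gamed
```

Every improvement to the filter simply restarts the cycle, which is exactly why the pool of data to learn from has to keep growing.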