Those Pesky Algorithms


Rod Rees writes: In researching my book Invent-10n it quickly became apparent that it isn't the surveillance side of State intervention in our lives – the use of cameras and digital-communication intercepts to collect data about us – that we should be worried about, but the use that is made of that data. And here we enter the decidedly creepy world of the algorithm.

I guess (and it is just a guess) that most people, if they think about surveillance at all, see the spread of CCTV cameras and the like as really quite benign. This is reflected in the slogan used by the fictional National Protection Agency in Invent-10n: its surveillance paraphernalia exists to ‘watch out for the good guys by watching out for the bad guys’, the idea being to emphasise the ‘watching’ aspect of surveillance. But that’s not the most important aspect … that’s the ‘collecting, storage and analysis’ part. Surveillance captures simply HUGE amounts of data, and to store it all the security agencies are spending a fortune. The Utah Data Center built for the National Security Agency has been dubbed ‘the second Manhattan Project’, which gives some idea as to both its importance and its cost. James Bamford, in his article for Wired magazine ‘The NSA Is Building the Country’s Biggest Spy Center (Watch What You Say)’ (Wired, 15th March 2012), advises:

‘Flowing through its servers and routers and stored in near-bottomless databases will be all forms of communication, including the complete contents of private emails, cell phone calls, and Google searches, as well as all sorts of personal data trails—parking receipts, travel itineraries, bookstore purchases, and other digital “pocket litter.”’

Now why, you might ask, are they – and the security agencies of many other countries – going to all this trouble? The answer lies in algorithms. Algorithms are basically enormously complex decision trees which can be used to solve breathtakingly difficult problems by breaking those problems down into a long string of binary choices. They operate much like the neurons powering our brain, which is a good analogy given that they have become so damned sophisticated that they can now imitate thought processes. This, combined with an ability to process and analyse a prodigious amount of data, means that they can find associations between seemingly disconnected pieces of the chaff of human existence (like that being collected by the Utah Data Center) and by doing so come to some very accurate and very disturbing conclusions. Which is why algorithms (and the computers that platform them) have such a voracious appetite for information (and why the surveillance systems just keep on growing).
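To make that ‘string of binary choices’ idea concrete, here is a minimal, purely illustrative sketch in Python: a hand-built decision tree over invented ‘pocket litter’ fields. The field names, thresholds and the very notion of a ‘flag for review’ rule are my assumptions for the sake of the example, not anything drawn from a real system:

```python
# A toy illustration (not any agency's real logic): a tiny decision tree
# that reduces a "flag for review?" question to a chain of yes/no splits
# over hypothetical metadata fields.

def flag_for_review(record):
    """Walk a hand-built decision tree; each branch is a binary choice."""
    if record["late_night_calls"] > 10:           # binary choice 1
        if record["new_contacts_this_week"] > 5:  # binary choice 2
            return True
        return record["travel_bookings"] > 2      # binary choice 3
    return False

# An example "pocket litter" record with made-up fields and values.
sample = {"late_night_calls": 14, "new_contacts_this_week": 7, "travel_bookings": 0}
print(flag_for_review(sample))  # -> True
```

A real system would of course learn its splits from millions of records rather than have them written in by hand, but the shape of the thing – question after question, each with only two answers – is the same.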

Algorithms have been with us a while now, their use being especially prevalent in the world of banking and finance, where their ability to grunt through an amazing amount of data in real time makes them the tools par excellence for discerning trends and making correct buy/sell decisions. In the US the ‘high-frequency trading’ firms utilising algorithms account for at least half of equity trading volume. But it isn’t just in finance where they’re making their presence felt. They’re being increasingly used as diagnostic helpmates in medicine, in the interview process (remember that funny online test your company made you take?), in traffic management, in optimising the deliveries that Tesco et al make to their supermarkets, and in the area of law-enforcement.
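For a flavour of what ‘discerning trends and making buy/sell decisions’ can mean at its very simplest, here is a sketch of a moving-average crossover rule. The prices, window lengths and the rule itself are invented for illustration; real trading systems are vastly more elaborate:

```python
# A minimal trend-following sketch: compare a short and a long moving
# average of recent prices and emit a buy/sell/hold signal.

def moving_average(prices, window):
    return sum(prices[-window:]) / window

def signal(prices, short=5, long=20):
    if len(prices) < long:
        return "hold"                       # not enough history yet
    fast = moving_average(prices, short)    # recent trend
    slow = moving_average(prices, long)     # longer-term trend
    if fast > slow:
        return "buy"
    if fast < slow:
        return "sell"
    return "hold"

prices = [100 + 0.3 * i for i in range(30)]  # a gently rising, made-up series
print(signal(prices))  # -> "buy"
```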

This latter area I find particularly interesting as it gives (I believe) an indication of the shape of things to come. In a terrific article (‘Penal Code’, New Scientist, 7th September 2013) Katia Moskvitch opines that automated, algorithm-directed policing ‘could lead to a world akin to Kafka’s novel The Trial, in which a man stands accused but has no opportunity to defend himself’. Worse, it is a world akin to that envisaged by Philip K. Dick in The Minority Report: a world beset by predictive law-enforcement.

For a security service the crock of gold at the end of the algorithmic rainbow is the ability to predict what people will do (and not just terrorists: the guys who pay the salaries of those working in GCHQ and the NSA – the politicians – are more than a little interested in monitoring the mindset of the whole community). À la Gottfried Leibniz, those designing and operating the computers that run the security-orientated data-mining systems, and the algorithms that direct them, believe that the future of human beings can be predicted by the forensic examination of the minutiae of our lives. By knowing (and analysing) a person’s DNA, the details of their upbringing, what they say, what they read and listen to, how they think, who they talk with … all this makes it easy to predict exactly what they’ll be up to in the future … and how those they interact with will act.
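Stripped of all its sophistication, this kind of prediction boils down to scoring a person’s data trail against patterns mined from everyone else’s. The sketch below is a deliberately crude caricature of that idea: the feature names, weights and threshold are all invented for illustration, and a real system would learn them from historical data rather than have them typed in:

```python
# A caricature of "predict the person from their data trail": score a
# hypothetical profile against hand-picked weights. Everything here is
# made up for illustration.

WEIGHTS = {
    "reads_radical_forums": 2.5,
    "recent_overseas_travel": 1.0,
    "contacts_on_watchlist": 3.0,
}

def risk_score(profile):
    """Sum weighted features; a higher score means higher predicted 'interest'."""
    return sum(WEIGHTS[k] * v for k, v in profile.items() if k in WEIGHTS)

profile = {"reads_radical_forums": 1, "recent_overseas_travel": 0, "contacts_on_watchlist": 2}
print(risk_score(profile))  # -> 8.5
```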

I have to admit that I found it difficult to get my head around a machine predicting what I would do in any given circumstance, having always believed that human actions, being so whimsical and driven by emotion, are impossible to predict. But non-linear or not, given enough data (hence the growth of the surveillance culture), it can be done. And don’t think this is something for the future. Bruce Bueno de Mesquita uses algorithmic analysis combined with game theory as a template to predict political events (the development/non-development of the Iranian nuclear bomb is the one most usually cited) and, apparently, he’s been so successful that he now advises the US government on international policy.

There’s a great line in Ian Ayres’ book ‘Super Crunchers’ (highly recommended, a fascinating read) where he says ‘We may have free will, but data mining can let business emulate a kind of aggregate omniscience’. I would only add that for ‘business’ we should now read ‘security services’.

As Jenni-Fur says in Invent-10n: ‘Then all of us will be reduced to remote-controlled puppets, and there will be no chance of being able to zig when they say zag, or to beep when they say bop. Post-Patriot we will not be able to think, to act, to speak or to move without the spirit-sapping realisation that the ChumBots know everything’.
