The nature of privacy in the modern world is very much in the public mind thanks to the Leveson Inquiry into alleged phone hacking by unscrupulous journalists. While many of us may not have been over-exercised by the mechanisms that kept us up to date with Sienna Miller’s love life, the Milly Dowler revelations soon made the reality of the situation clear. The more of our lives we digitise, and the more attached we become to our various gadgets, the more invasive the ability to distil information from them becomes.

Until recently, the rate at which this data was generated far outstripped our ability to process it. The fact that there are around half a million CCTV cameras in London alone has worried privacy campaigners for years, but we haven’t even begun to see the implications. Most of these cameras have had nobody to watch them; billions of hours of footage are created but never viewed. That is going to change. As computers increasingly understand the world around them, they can look for notable images or sounds, or specific patterns of behaviour. On one hand this is rather exciting, as it lets us harness the vast majority of previously untapped data: CCTV on the Underground could pre-emptively spot likely suicide jumpers, and cameras on the high street could detect the sound of a shop window breaking and call the police. But the ability of computers to understand human information is a double-edged sword, because doing anything without it being known will become increasingly difficult.

This is even more significant when we consider the proliferation of the devices we carry around with us. Our phones and tablets can work out exactly where they (and, by extension, you) are, but the big breakthrough is that they can now understand what they see and hear. A team of computer scientists at Georgia Tech reportedly wrote a program that uses the accelerometers in a smartphone sitting on a desk to track the keystrokes on a nearby computer, capturing usernames, passwords, bank details and anything else entered on the keyboard. Those of you reading this at your desks may now be giving your phones suspicious glances.

The real change comes when we combine the ability of machines to understand with social technologies. Given the increasing power and convenience of our devices and the network effect of social media, such surveillance would not require the resources of a government or corporation; it could be carried out by any arbitrary group of people. All a group of like-minded individuals would have to do is each download an app that uses their phone’s camera to read the licence plates of passing cars, and suddenly they can effectively track people across the city. If they use social networking tools to coordinate with similar groups, they can track people across the country. Imagine holding a smartphone up to anyone entering a building and knowing who they are.
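To give a sense of how little engineering that scenario would actually take, here is a minimal sketch of the phone-side logic, written in Python as a stand-in for a mobile app. It assumes the number-plate Haar cascade that ships with OpenCV and the Tesseract OCR engine are available, and it posts sightings to a shared feed at a placeholder address; the detection approach, the endpoint and the field names are all illustrative assumptions, not a description of any real app.

```python
# Illustrative sketch only: crowdsourced number-plate spotting.
# Assumes OpenCV (cv2), pytesseract and requests are installed; the
# shared-feed URL below is a hypothetical placeholder, not a real service.
import time

import cv2
import pytesseract
import requests

FEED_URL = "https://example.com/plate-sightings"  # hypothetical shared feed
LOCATION = (51.5074, -0.1278)                     # would come from the phone's GPS

# OpenCV ships a Haar cascade trained on number plates; good enough for a sketch.
plate_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_russian_plate_number.xml"
)

def spot_plates(frame):
    """Return OCR'd text for every plate-like region in a single camera frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    plates = plate_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=4)
    readings = []
    for (x, y, w, h) in plates:
        roi = gray[y:y + h, x:x + w]
        text = pytesseract.image_to_string(roi, config="--psm 7").strip()
        if text:
            readings.append(text)
    return readings

def main():
    camera = cv2.VideoCapture(0)  # stand-in for the phone's camera
    while True:
        ok, frame = camera.read()
        if not ok:
            break
        for plate in spot_plates(frame):
            # Each sighting is shared with everyone else running the same app.
            requests.post(FEED_URL, json={
                "plate": plate,
                "time": time.time(),
                "location": LOCATION,
            })

if __name__ == "__main__":
    main()
```

A handful of phones running something like this, coordinated through nothing more sophisticated than a group chat or a shared feed, is essentially all the infrastructure the scenario above would need.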
Now, a little transparency can be a very good thing, but it’s also important to realise that the ability to hide things from one another is fairly critical to the smooth functioning of human society. Think about the relative peace achieved in Northern Ireland over the last decade: such an agreement could not have happened if it had been publicly known that the two sides were in talks. It is one of many cases in which a great deal of good has come from a lack of transparency. To solve some of the world’s problems, we need to be able to get things done behind the scenes.

I cannot see how privacy can survive the growing ability of machines to understand the data they gather, but we’re not going to throw out the devices we rely on overnight. What has to happen is for the concept of privacy itself to alter. Most of us, at some level, do things we know we’re not supposed to, and the idea of judging someone for doing so has got to change.

Improving the ability of our machines to understand the human world is empowering, and there is huge potential to do far more with what we already have. But that empowerment cuts both ways, and we have to be prepared for it. In an age where everyone has a smartphone and any group can connect via social media, perhaps we have more to fear from Little Brother than from Big Brother.