(Over)sharing is caring

The Church of Jesus Christ of Latter-day Saints holds the view that we are all descended from Adam, one way or another. This belief has turned Mormons into avid genealogists, and therefore keen innovators, at the forefront of technology to help people trace their ancestors. As far back as 1996, for example, the church was already organising DNA testing of its members. At one such initiative, in the state of Mississippi, a willing donor was the father of film-maker Michael Usry.

Some years later, the church chose to sell its database of genetic markers to the genealogy web site Ancestry.com, which later opened up access to the data to the general public. And, it transpired, to law enforcement agencies investigating a 1998 murder. Whether or not their intent was pure, the consequence1 for Michael Usry was to be held for 33 days as a potential suspect in the case. Due to this, and other such situations, public access to the data was removed. “The data was used for reasons that were not intended,” said the site. All the same, the terms and conditions of many such sites still allow for subpoenaed access by law enforcement agencies.

A corollary to the Law of Diminishing Thresholds is the Law of Unintended Consequences. Michael Usry’s father had no idea how his data might be used, nor of the technological advances that would make such a DNA comparison possible, nor how law enforcement would still act upon inaccurate information. But the genie was already out of the bottle. As the policies of UK testing site BritainsDNA note, “Once you get any part of your genetic information, it cannot be taken back.” It’s not just investigators we need to worry about; even more important is how we ourselves might act given new information about our heritage or our health. Says BritainsDNA, “You may learn information about yourself that you do not anticipate. This information may evoke strong emotions and have the potential to alter your life and worldview. You may discover things about yourself that trouble you and that you may not have the ability to control or change (e.g., surprising facts related to your ancestry).”

Perhaps, like Julie M. Green2, you would rather know that you had a risk of a degenerative condition. Or perhaps, like Alasdair Palmer3, you would not. But you may not know the answer to this question in advance. Equally, you might want to think through the consequences of discovering that the person you have known for 30 years as your father turns out not to be. It’s not hard to imagine the potential for anguish, nor indeed the possibility of being cut out of a will, or of causing a marital break-up between the people you thought of as parents.

Despite examples such as this, we find it impossible to stop sharing our information. The majority of consumerist Westerners will have ‘form’ in giving up data to purveyors of financial services and consumer products, for example. Few people systematically erase their financial and spending trails as they go: paying by cash, withholding an address, checking the ‘no marketing’ box and so on. In many cases we accept recompense for giving up elements of our privacy, such as with retail loyalty cards. We know that we and our shopping habits are being scrutinised, like “transient creatures that swarm and multiply in a drop of water.” But, to our delight, we receive points, or vouchers, without worrying whether we’ve got a decent return on our investment.

Social networking is also enticing, but we know it comes at the expense of personal privacy. We share our stats, personal views and habits via Facebook, Google and Twitter, deliberately blasé about how the information is being interpreted and used by advertisers. “If you’re not paying, you are the product,” goes the often-quoted, but generally ignored adage. This is despite the evidence: when Facebook launched its Graph Search algorithm on January 15th 2013, sites sprang up4 to demonstrate how you could hunt for Tesco employees who like horses, or Italian Catholic mothers who like condoms. Facebook is now embedded in the majority of sites we use: when we log in to a third-party site using a Facebook login, to avoid all that rigmarole involved in remembering usernames and passwords, we entirely give over any rights we might have had over the data, or the conclusions that could be drawn from it.
However clunky today’s algorithms appear to be, every purchase is being logged, filed and catalogued. The Internet of Things is making it worse: for example, we confirm our sleep patterns or monitor our heart rates by checking our Fitbits or Jawbones, uploading data via mobile apps to servers somewhere on the globe, in the cloud. Of course, what is there to be read from sleep patterns? It’s not as if we’re talking about drinking habits or driving skills, is it?

Every act, every click and even every hover over a picture or video results in a few more bytes of data being logged about us and our habits. It seems such a small difference between buying a paperback from Amazon and paying in cash for the same book at the local shop; but one purchase will remain forever, indelibly associated with your name. And even our disagreements are logged: “No, thank you” responses are stored against our identities, or if not, our machines, or web browser identifiers. And what about other mechanisms that less scrupulous advertisers use to identify computer users, such as AddThis5, which draws a hidden picture on your screen and uses subtle differences in how your machine renders it to fingerprint you? Even now, mechanisms are being developed which look to stay the right side of the increasingly weak legal frameworks we have, all the while slurping as much data as they can about us.
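The fingerprinting trick mentioned above, often called canvas fingerprinting, works because the same drawing instructions produce subtly different pixel data on different machines, and hashing that data yields a near-unique identifier. Here is a toy sketch in Python, with the pixel buffers simulated rather than actually rendered; the `fingerprint` helper and the sample byte strings are purely illustrative, not anyone’s actual tracking code:

```python
import hashlib

def fingerprint(pixel_bytes: bytes) -> str:
    # Hash the rendered pixel data down to a short, stable identifier.
    return hashlib.sha256(pixel_bytes).hexdigest()[:16]

# Two hypothetical machines render the same hidden image; tiny
# anti-aliasing differences in their font and graphics stacks
# change just a few bytes of the pixel data.
machine_a = bytes([16, 32, 48]) * 100 + bytes([65])
machine_b = bytes([16, 32, 48]) * 100 + bytes([66])

print(fingerprint(machine_a))  # stable across visits on machine A
print(fingerprint(machine_b))  # different on machine B
```

Because the hash is derived from the machine itself rather than from anything stored on it, clearing cookies does nothing to reset it.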

Outside the consumer world, today’s technology advances inevitably result in whole new methods of surveillance. London has become the surveillance capital of the world, according to CCTV figures. At the other end of the spectrum are ourselves, of course: today we carry around powerful recording and sensor-laden devices, in the form of smartphones and, increasingly, smart watches.

Right now, our culture is evolving towards a state where our actions and behaviours are increasingly documented, by individuals, institutions and corporations, all of whom are thinking about how to make the most of these pools of data. They’re all at it: any organisation that has access to information is trying to get more of it, whether or not it knows what to do with it yet. Consider, for example, the announcement that both Visa and MasterCard are looking to sell customer data to advertising companies. For the time being, simple transfer of data — data brokerage — is becoming big business. Even public bodies are getting in on the act: witness the selling of data by councils, the UK vehicle licensing authority and indeed hospitals — at least in trials.

How’s it all being funded? Enter online advertising, itself subject to regulation in various forms, including the UK Data Protection Act (DPA). It is unlikely that the Web would exist in its current form without the monies derived from advertising, from click-throughs and mouse-overs to ‘remarketing’ with tracking cookies and web beacons. Advertising is the new military or porn industry, pushing the boundaries of innovation and achieving great things, despite a majority saying they would rather it wasn’t there.

Corporate use of data frequently sails close to the wind, as corporations are not necessarily acting in the interests of the people whose data they collect. We’re seeing examples of malpractice, even if within the letter of the law: in May 2014, data brokers such as Acxiom and Corelogic were taken to task6 by the US Federal Trade Commission for their data-gathering zeal and lack of transparency. Every now and then, an alarm bell goes off. For example, while the Livescribe pen has been around for a good few years, you’ve got to hand it to whoever decided to use ‘spy-pen’ to describe the device that led to the resignation of the chairman of a Scottish college. The term has everything: popular relevance, gadget credibility and just that frisson of edgy uncertainty. The trouble is, the device at the centre of the controversy is no such thing. Yes, it can act as a notes and audio-capture device, in conjunction with special sheets of paper. But calling it a spy-pen is tantamount to calling the average tablet device a spy-pad. “It’s quite a clunky kind of thing — not the sort of thing you can use without folk knowing,” said Kirk Ramsay, the chairman in question, to The Scotsman. “I have had it for three and a half to four years — you can buy it on Amazon.”

Fortunately, because of the coastline paradox, such organisations have largely been unable to really get to the bottom of the data they hold. For now. We are accumulating so much information — none of it is being thrown away — about so many topics that the issue becomes less and less about our own digital footprints, however carelessly left.

Looming without shape and form — yet — are the digital shadows cast by the analysis of such vast pools of data. A greater challenge is aggregation: the ability to draw together data from multiple sources and reach conclusions that none of them could support alone. Profiling and other analysis techniques are being used by marketers and governments, as well as in health, economic and demographic research fields. The point is we don’t yet know what insights these may bring, nor whether they might be fantastically good for the human race or downright scary. The kinds of data that can be processed, such as facial, location and sentiment information, may reveal more than people intended, or indeed ever expected. All might have been OK if it wasn’t for the Law of Unintended Consequences. Purposes change, and so do businesses, and as we have seen, it isn’t all that easy to map original intentions against new possibilities. For example, what if family history information could be mined to determine, and even predict, causes of death? What would your insurer do with such information? Or your housing association? Or your travel agent?

And that’s just taking the ‘good guys’ into account; every now and then, someone sees sense in breaking down the barriers to private information, for reasons legion. Consider, for example, when the web site AshleyMadison was hacked in July 2015, despite being called “the last truly secure space on the Internet,” according to an email sent to blogger Robert Scoble. The reason: the site was seen as immoral. Which it most certainly will be, to some. At a broader level, this example pitted data transparency against some of our oldest behavioural traits, which themselves rely on keeping secrets. Love affairs are possibly one of the oldest of our behaviours, their very nature filled with contradictions and a spectrum of moral judgements. Whatever the rights and wrongs, rare would be the relationship that continued, start to finish, in absolute certainty, without a glance elsewhere. But the leaky vessel that is technology leaves gaping holes open to so-called ‘hacktivists’. The consequences of the affair continued to unfold for several months: the first lawyers instructed, the first big names knocked off their pedestals, not only husbands contacted but wives7.

And consider the fitness device being used in court as evidence, or the increasing use of facial recognition. Soon it will be impossible to deny a visit to a certain bar, or indeed to say, “I was at home all the time.” As Autonomy founder Mike Lynch remarked, “In an age of perfect information… we’re going to have to deal with a fundamental human trait, hypocrisy.”

Even if we have not been up to such high jinks, our online identity may be very difficult to dispense with. And meanwhile our governance frameworks and psychological framing mechanisms are falling ever further behind. There was a certain gentleperson’s agreement in place at the outset, given that nobody ever reads the Ts and Cs of these things: namely, that the data would only be used for the purpose for which it was originally intended. Indeed, such principles are enshrined in laws such as the UK Data Protection Act (DPA).

Ongoing initiatives are fragmented and dispersed across sectors, geographies and types of institution. The UK’s Data Protection Act, international laws around cybercrime, even areas such as intellectual property and ‘digital rights’ were all created in an age when digital information was something separate from everything else. Meanwhile the UN’s resolution on “The right to privacy in the digital age” overlaps with proposed amendments to the US ‘Do Not Track’ laws, as well as Europe’s proposed ‘right to be forgotten’ (which has already evolved into a ‘right to erasure’) rules. All suggest that such an option is even possible, even as it becomes increasingly difficult to hide. In such a fast-changing environment, framing the broader issues of data protection is hugely complicated; the complicity of data subjects, their friends and colleagues is only one aspect of our current journey into the unknown. Celebrities are already feeling this, as all-too-frequent examples of online and offline stalking show. But how precisely can you ensure that your records are wiped, and is it actually beneficial to do so?

Perhaps the biggest issue with data protection law is that it still treats data in a one-dot-zero way — it is the perfect protection against the challenges we all faced ten years ago. Even as the need for legislation is debated (and there are no clear answers), organisations such as the Information Commissioner’s Office are relaxing8 the rules for ‘anonymised’ information sharing — in fairness, they have little choice. While we can be comforted that we live in a democracy which has the power to create laws to control any significant breaches of privacy or rights, it won’t be long before the data mountains around us can be mined on an industrial scale. Over the coming decades, we will discover things about ourselves and our environments that will beggar belief, and which will have an unimaginably profound impact on our existence. The trouble is, we don’t know what they are yet, so it seems impossible to legislate for them.

What we can know is that protecting the data is simply not enough — in a world where anything can be known about anyone and anything, we need to focus attention away from the data itself and towards the implications of living in an age of transparency. Consider, for example, the experience9 of people who happened to be located near the 2014 uprisings in Kiev, the capital of Ukraine. Based on the locations of their mobile phones, nearby citizens were sent text messages which simply said, “Dear subscriber, you are registered as a participant in a mass riot.” The local telephone company, MTS, denied all involvement. Such potential breaches of personal rights may already be covered under international law; if they are not, now could be a good moment to start addressing them.

Sun Microsystems founder and CEO Scott McNealy once famously said, “You have zero privacy anyway. Get over it.” Privacy may not be dead, but it is evolving. There are upsides, downsides and dark sides to living in a society where nothing can be hidden. Before we start to look at where things are going, let’s consider some of the (still) darker aspects.