
A new social contract for personal data

Posted on: Monday 29th of September 2014

This week marks the launch of Tim Berners-Lee’s new campaign for an internet ‘Magna Carta’. Speaking at the WebWeWant festival in London over the weekend, the ‘inventor of the web’ warned that the basic principles that originally defined the internet – such as being open and free – are now threatened by large corporations and governments which see the internet as a potentially powerful weapon of control.

Exactly what this new Magna Carta should say is being crowd-sourced through multiple consultations over the coming weeks and months.

Berners-Lee’s Magna Carta is not the only initiative now under way. MIT professor Sandy Pentland is proposing a ‘New Deal on Data’. In a big data world of ubiquitous data gathering, “the first step towards open information markets is to give people ownership of their data,” he argues. By ‘ownership’ he means rights to possession, use and disposal.

Debate about the future of personal data is also being triggered by the EU’s proposed revised data protection regulations, and by initiatives such as the World Economic Forum’s ‘Rethinking Personal Data’ project.

A new social contract for personal data

Separately and together these initiatives are evidence that our society’s ‘social contract’ around personal data is being renegotiated. As new technologies tear old certainties apart, the norms, rules and tools which define what we think is acceptable and unacceptable, or fair and unfair, when it comes to personal data are being re-examined from scratch.

Such a re-examination is well overdue, because many current debates are hampered by conceptual confusion.

What does ‘privacy’ actually mean?

Take the word ‘privacy’. What does it mean? Answer: pretty much anything, depending on who you are talking to.

For some, protecting privacy is about stopping mass surveillance by security services. For others, it’s about the way we are followed around on the web by ad tech companies, including Google and Facebook. For others still, it’s the scary – and perhaps unfair – ways in which companies are scooping up data to build profiles which then determine how they treat customers: profiles that may be intrusive, and may also be flawed, leading to flawed decisions.

Then there are data breaches. And small print issues – what terms and conditions and privacy policies say.

And so on.

Each of these issues helps define the agenda in a particular way. Is it about civil liberties, unfair or ‘creepy’ commercial arrangements, the security of the processes we use, or the legal niceties of our actions? How the problem is defined goes a long way towards framing the solutions we look for.

For all these reasons, we think words like ‘privacy’ should be avoided wherever possible. They obscure and confuse more than they clarify or enlighten.

Clearing the air

So what do we need to do to move forward on personal data’s new social contract?

First, we need clarity: a clear and concise list of all the key issues and their possible solutions. Mass online surveillance by national security services and behaviourally targeted ads that follow you around online are different issues, requiring different solutions. It’s not helpful to lump them together under a blanket term such as ‘privacy’.

Second, we need to distinguish between ‘stopping bad things’ and ‘enabling good things’. The one doesn’t necessarily lead to the other.

The really big ‘good thing’ that could happen with personal data is that it becomes a tool in the hands of the individuals whose data it is. Empowering individuals with their own data opens up a whole new landscape of opportunities for new types of service. It cannot and will not solve every problem. But only by embracing positive empowerment opportunities, as well as fixing negatives, can a new social contract around data take root.