Posted on: Monday 15th of December 2014
Interesting conversation with a regulator this week about developments in the personal data market. The exam question: what should regulators be keeping an eye on and scrutinising in today’s fast-morphing personal data landscape?
Here are three mega issues that will continue to plague us until resolved.
The problem of consent
Current approaches to consent were supposed to empower consumers by giving them choice, but their real effect has been the opposite.
When we buy an item in a physical store we are not presented with a long and complicated contract laying out the details of all the ingredients used and requiring us to sign that we have agreed with these uses. Yet that’s effectively what happens when companies present consumers with long and complicated terms and conditions and privacy policies online – documents they cannot reasonably be expected to understand.
When we exchange money for an item in a physical store we are indeed ‘signing’ a ‘contract’, but when we do so both we and the shopkeeper know we are protected by a raft of rules and regulations relating to trades descriptions, weights and measures, safety rules and so on. We don’t have to negotiate a lengthy contract because a standard and standardised set of rules apply, making us confident that we can do business without untoward risk.
Yet when we do business online, when it comes to our personal data, almost anything goes – because of what is written into the terms and conditions that we have to agree to, and which we don’t read, don’t understand and cannot change anyway. A process originally designed to protect us and give us choice has ended up giving legal protection to unscrupulous data operators, who can always retort: “There is nothing you can do. You agreed to these terms.” This has created a race to the bottom, where unscrupulous players gain an unfair competitive advantage over companies that do wish to treat their customers fairly, and which choose not to take advantage of customers’ failure to read the small print.
There is a simple way for regulators to address this problem. When consumers do business with a company, they should ‘just know’ their data is safe, just as they ‘just know’ they are protected when they buy something in a physical store.
Practically speaking, this means they should not have to consent to anything. Standard rules should apply, de facto. They should ‘just know’ that any data they provide will only be used to provide the service they asked for, and for no other purpose; that this data won’t be sold or rented to unknown third parties; will only be kept for as long as needed, and so on. And they should ‘just know’ that any companies not keeping to these rules will be punished in law – with regulations enforced rigorously.
For any other, additional use, consent should be required, and this negotiation should be kept completely separate from the provision of the service in question. Consumers should not have to give up extra data or extra permissions in order to access the service they are buying.
The algorithm economy
A data economy quickly becomes an algorithm economy, where an increasing range and number of decisions and actions are carried out via automated if-then rules engines driven by data inputs. This is a huge opportunity. But it also brings huge risks, because algorithms are a potential source of unaccountable power: whoever specifies the rules that go into an algorithm makes all the important decisions. If those rules remain secret, that secrecy becomes an open invitation to abuse.
This is an incredibly difficult area to regulate but the underlying principle should be clear. The algorithms organisations write should be open to mechanisms of public scrutiny and accountability.
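To make the problem concrete, here is a minimal sketch of the kind of if-then rules engine described above. All rule names, fields and thresholds are hypothetical, invented for illustration – this is not any real system:

```python
# A toy if-then rules engine: (condition, decision) pairs checked in
# order, first match wins. All fields and thresholds are hypothetical.
RULES = [
    (lambda d: d["missed_payments"] > 2, "decline"),
    (lambda d: d["postcode"] in {"X1", "X2"}, "decline"),  # an opaque proxy rule
    (lambda d: d["income"] >= 20_000, "approve"),
]

def decide(applicant: dict) -> str:
    """Apply the rules in order and return the first matching decision."""
    for condition, decision in RULES:
        if condition(applicant):
            return decision
    return "refer"  # no rule matched: pass to a human

applicant = {"missed_payments": 0, "postcode": "X1", "income": 30_000}
print(decide(applicant))  # the postcode rule declines a solvent applicant
```

Unless the second rule is open to scrutiny, the applicant has no way of knowing why they were declined – which is exactly the accountability gap this section describes.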
The Big Data problem
When push comes to shove, much recent rhetoric about Big Data comes close to incitement to break data protection laws. This rhetoric is all about data maximisation – collect as much data as you can and hold it for as long as you can – because the more data you have, the more value you can mine from it. This flies in the face of the EU principle of data minimisation, implicit in the standard rules discussed above.
Some people argue that these data protection regulations are out of date and should now be set aside because the economic opportunities of Big Data – of data maximisation – are so great. The trouble is, the privacy and concentration-of-power implications of this approach to Big Data are truly frightening.
For regulators, there is a simple way through this quagmire. Big Data is great for statistical analysis of aggregated data sets to identify trends, patterns and so on. Organisations should be free to do such Big Data analyses so long as they are not using personally identifiable information and there is no ‘return path’ back to an identifiable individual. Sure, there may be some exceptions to this rule (in cases of medical research, for example) but these should be surrounded by rigorous regulatory safeguards.
The principle is quite simple. If it is to do with personal data, data protection regulations apply. If it is statistical and aggregated Big Data not using personally identifiable information, organisations should be free to mine away. Just don’t try mixing the two.
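The separation the principle demands can be sketched in a few lines. The records and field names below are invented for illustration; note too that in practice, stripping direct identifiers alone does not guarantee there is no ‘return path’ to an individual – the sketch only shows the two regimes being kept apart:

```python
from collections import Counter

# Hypothetical customer records containing personally identifiable fields.
records = [
    {"name": "A. Smith", "email": "a@example.com", "region": "North", "spend": 120},
    {"name": "B. Jones", "email": "b@example.com", "region": "North", "spend": 80},
    {"name": "C. Patel", "email": "c@example.com", "region": "South", "spend": 200},
]

# First regime: drop the personally identifiable fields...
anonymised = [{"region": r["region"], "spend": r["spend"]} for r in records]

# Second regime: mine away - aggregate statistics only, no identifiers.
totals = Counter()
for row in anonymised:
    totals[row["region"]] += row["spend"]

print(dict(totals))  # {'North': 200, 'South': 200}
```

The aggregate figures answer trend questions (“which region spends more?”) without carrying any individual’s identity – the ‘don’t mix the two’ rule in miniature.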
The bigger picture
One common theme unites these three mega-issues. Today’s policies and practices all rest on the organisation-centric assumption: that large organisations are the only entities collecting and using personal data, and that only large organisations can generate value from this data. In such a world, there is in-built and endemic tension between the requirements of privacy and the requirements of innovation and growth.
But the world is changing. As soon as we realise that personal data can also be a personal asset – that individuals collecting and using their own data can generate their own value from it – we can see new, win-win solutions to yesterday’s intractable impasses. Once individuals as well as organisations can participate in the data economy, the potential benefits are magnified for all concerned – organisations and individuals alike.
How can regulators help? First, by simply recognising and accepting this shift. Empowering individuals with their own data transforms the situation. Second, by positively encouraging this empowerment by, for example, spreading midata principles and practice (where organisations release data back to individuals so they can use this data for their own purposes) to every industry.