
Why is trust so hard to build?

Posted on: Monday 16th of November 2015

Consumer choice is becoming a key driver of change in the data-driven economy as consumers increasingly exercise a degree of choice over how much data they share with which brands.

Consumers are increasingly ‘going dark’ on brands they don’t trust. Growing numbers are opting out of consents and permissions, ‘dirtying’ the data they provide to companies (i.e. lying), or installing new services such as ad blockers. Alternatively, they have increasing means and ability to volunteer additional data if they want to (and this will increase massively under pending EU regulations on data portability).

The net result is that brands’ data relationship with customers is becoming a driver of competitive edge. Brands that have positive data relationships with customers have greater opportunity to access and use data to innovate and improve services, while those that have negative relationships risk being left out in the cold.


A new consensus?

So how can brands build these positive relationships? A consensus is now forming around what such a positive relationship looks like. If we take safety and security as a sine qua non, it boils down to building trust via three things: transparency, control and value. Key point: this takes us far beyond regulatory compliance to involve the entire business. For example, it involves:

  • Understanding customers’ expectations (which may not be the same as what rules and regulations permit)
  • Communicating and demonstrating benign intentions (which includes the content of data policies and terms and conditions, communication content and style, plus the processes and infrastructure to walk the talk)
  • Demonstrating value via services and experiences where customers can see the benefit of sharing data.


Beware superficial solutions

Trouble is, transparency, control and value are fine as ideals, but very hard to implement in practice.

  • Transparency can be a positive turn-off for customers, especially when they discover things they find disconcerting. It also raises questions such as: how much transparency, about what things, using what mechanisms and channels? Transparency can also impose an unwelcome cognitive load on users, who have to get their heads around what is being communicated.
  • Likewise with control. Done the wrong way, it could result in even more cognitive load plus additional administrative hassle. And what about the details? How much control, over what exactly, using what processes? What is the trade-off between the ideal of control and a seamless, easy user experience?
  • With value, much is made of the supposed ‘value exchange’ of, say, consumers providing data for free services. But do customers and providers really see eye to eye on this value exchange? Is today’s value exchange ‘fair’ – and who decides what ‘fair’ looks like? When consumers hand over data, is it really a value exchange or, as some recent research suggests, simply resignation?

In short, poorly conceived or badly executed attempts to build trust around personal data can be counter-productive. On the other hand, there are powerful, innovative answers to these questions, most of which revolve around unlocking new value by recognising that data has value to individuals as well as to organisations.

Well-designed services and tools that work on this basis address the ‘problems’ of transparency and control almost as a natural by-product of how they operate.

But it is a steep learning curve. That’s one of the purposes of our Growth Through Trust conference on December 8. It creates a forum where pioneers and peers can share their learnings and help chart a way forward that really does work in practice.