The privacy pact: setting the boundaries for data analysis

What do you think of when you hear the word ‘privacy’? The NSA? Data breaches? Chances are the associations aren’t positive. Privacy as a concept is intrinsically a good thing, but an exaggerated sense of paranoia around the subject means we now associate the term with spying, creepiness and intrusion.

Research from Mindshare uncovered the key reasons consumers are suspicious of mobile monitoring. 59% of respondents said they were concerned because privacy is a fundamental human right; 54% cited ‘general creepiness’; and nearly half (49%) assumed the business would use the data to make another sale.

These suspicions aren’t misguided. According to MIT, 73% of Android apps and nearly half (47%) of iOS apps share personal or location data with third parties.

Concerns about privacy and the protection of personal data have been of steady importance to the UK media, with spikes of interest coinciding with high-profile personal data losses, such as Ashley Madison and Sony. With so much negative press, it’s time to look closer, and redefine privacy.

The best of both worlds

With social media and a proliferation of portable devices in the market, we are all creating more information about ourselves – and making it more widely available than ever before. There’s a tension between the idea of data analysis and the notion of privacy, but if used in the right way, data should have a beneficial impact on the consumer and add value without impinging on their private lives.

A waiter in a good restaurant might recommend a bottle of wine to go with the meal a customer has chosen, or suggest certain dishes based on what the customer says they like. This is a perfect example of the value consumers expect in exchange for offering insight into their private preferences.

Eurobarometer research found that 74% of those surveyed see having to disclose personal data as an increasing part of modern life. Consumers are increasingly comfortable with the data-for-service trade-off.

Privacy and personalisation go hand in hand, but the lines are often blurred. People are ambivalent about how their data is used – until they see headlines about a large-scale hack, that is.

When you download an app, it will often ask for permission to access certain information on your device. This is vital for consumers, as it’s a reminder of the agreement we make with companies: “I want to use your bespoke services, and I’m happy to give you access to this information in return”.

Data is currency, but that doesn’t mean it’s an alternative to traditional payment; it’s something the consumer exchanges for tailored content and a superior user experience. What the brand offers up in return has to be worth it – as with any transaction. But what happens to that data next?

Time to get personal

Permissions matter most to the relationship consumers have with companies when the information accessed is used to create a better customer experience. Targeted ads are nothing new; third-party data is often collected to sort people into generalised boxes. But no two people have exactly the same preferences. How can we possibly expect to receive the best personally curated customer experience if the information used to understand us is so simplified and generic?

With research showing the average person checks their smartphone 85 times a day – about a third of the time we are awake – it’s unsurprising that these devices hold a wealth of information about us. This data is invaluable for creating personal profiles, and this is undoubtedly where the future of personalised brand experiences lies. But in exchange for access to this information, brands must ensure it’s used to benefit the individual with truly bespoke suggestions; only then will they build trust. It’s like walking into your favourite shop and the staff knowing your needs and desires: “We’ve just got this in stock, and think you’ll absolutely love it”. They know to make the recommendation because of previous conversations.

In essence, this is what some companies are now aspiring to do: recreating the customer/shop-assistant relationship in the digital age. But to build trust, there needs to be a pact between the user and the app or website – one that brands mustn’t violate by selling data on to third parties. Some companies might offer the consumer a cut of the profits from the sale of their data, but that merely puts a plaster over the problem rather than solving it. “We’ll sell your liver, but you’ll get some of the money back in return.”

Only by honouring this agreement and making the exchange worthwhile will brands build long-lasting, loyal relationships in which both parties benefit.

By Ofri Ben Porat, co-founder and CEO, Pixoneye
