Privacy is dead; long live privacy

Over two decades ago – on August 25, 1997 to be exact – the front cover of Time magazine declared the Death of Privacy.  

The cover story came a single year before Google became the world’s search engine, seven years before Facebook emerged to connect over two billion people and nine years before Twitter exploded with 500 million tweets a day.

Following in Time’s footsteps, 17 years later Forbes declared - perhaps even more dramatically - Privacy is Completely and Utterly Dead, and We Killed It.

Forbes laid the blame squarely at the feet of us, the consumers, saying we killed privacy with our insatiable searching, our habit of responding to every online survey or competition with a ‘free’ prize, and our non-stop online shopping addiction.

For some, handing over personal information in exchange for the privilege of buying coffee with a swipe of a phone, or receiving alerts for cheaper flights, is a worthwhile trade. To these consumers, losing a little bit of privacy for the sake of convenience is reasonable.

For those with a keen sense of privacy, it isn’t easy to wade through - let alone understand - pages of legal-speak privacy statements or figure out how to manage the settings on all their apps.

Organisations have not made it simple for consumers to control their preferred level of privacy, and the consumer voice has traditionally not been loud enough to trigger much investment in data control beyond what local legislation requires.

In March 2018, the game changed. The Facebook and Cambridge Analytica scandal drew global attention to the power of data. Through the use of personal information gleaned from Facebook, data scientists were able to obtain a picture of a person’s character better than those closest to them - and in some circumstances better than their spouse.

Millions of people impacted in the scandal were largely unaware their personal information had been retained, was being used to produce psychometric profiles of them and, in some cases, was being used to manipulate their emotions.

Facebook users provided their data to a third-party app for the primary purpose of being provided with a point-in-time profile. But information was kept and used for other commercial purposes which were not transparent to the user.

The psychometrics highlighted to Cambridge Analytica the emotional vulnerabilities of each person profiled, and those vulnerabilities were exploited through non-transparent means - such as unverified news items and online articles designed to swing their voting preferences.

It did not pass the ‘reasonable use’ test in the eyes of those impacted, nor to the rest of the world.

I’m back

With Facebook CEO Mark Zuckerberg recently enduring 10 hours of grilling by the US Senate and House Representatives, the privacy debate has been globally resurrected.

Canadian, Australian, Irish, and US state and federal agencies, as well as EU privacy commissioners, have all launched investigations, and the new EU General Data Protection Regulation laws came into effect on May 25.

Privacy is back haunting debates in legislative chambers, senate-floor discussions, media and consumer forums because data was allegedly used to manipulate people, pushing their emotional buttons to change the way they think and feel - and ultimately to influence society. All of this shows how powerful data can be and why the privacy debate can never die.

On a practical level, the Facebook scandal has created a surge of people asking for understandable terms and conditions, changing their privacy settings (when they can find them), and asking for details about when, where and how their personal information is being shared with third parties.

It has taken a worldwide breach of trust to arouse privacy from its slumber. For business, it’s also a huge wake-up call – but there are positives.

There is a big opportunity for organisations which capture, manage and use personal information - like banks - to get on the front foot: improve their privacy settings, provide more-transparent and nuanced consent-management frameworks than the current binary opt-in and opt-out systems, and take a more consumer-centric view of privacy. It isn’t just a compliance or legislative requirement - it is the right thing to do.

Indeed, from a compliance perspective organisations may not have a choice. Given almost all global corporations buy, sell or use personal data, the new EU legislation - the GDPR - legislates for a stronger, pro-consumer data protection regime.

This means greater demands for transparency on how and where personal information is being used, simpler consent offerings so consumers can more easily object to direct marketing and profiling, 72-hour data-breach notifications, and the right to have personal information deleted.
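By way of illustration only - the GDPR itself prescribes legal obligations, not a data model - the 72-hour breach-notification window can be sketched as a simple deadline check. All function and variable names here are hypothetical:

```python
from datetime import datetime, timedelta

# GDPR Article 33: notify the supervisory authority within 72 hours
# of becoming aware of a personal data breach.
BREACH_NOTIFICATION_WINDOW = timedelta(hours=72)

def notification_deadline(breach_detected_at: datetime) -> datetime:
    """Latest time by which the supervisory authority must be notified."""
    return breach_detected_at + BREACH_NOTIFICATION_WINDOW

def is_overdue(breach_detected_at: datetime, now: datetime) -> bool:
    """True if the 72-hour notification window has already closed."""
    return now > notification_deadline(breach_detected_at)

# A breach detected at 09:00 on 1 June must be reported by 09:00 on 4 June.
detected = datetime(2018, 6, 1, 9, 0)
assert notification_deadline(detected) == datetime(2018, 6, 4, 9, 0)
assert not is_overdue(detected, datetime(2018, 6, 3, 9, 0))
assert is_overdue(detected, datetime(2018, 6, 4, 9, 1))
```

In practice the clock starts when the organisation becomes aware of the breach, which is why breach-detection processes matter as much as the notification itself.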

Source: Chris Kelly

Control

It’s time for organisations to provide tools which give customers more control. 

This includes the freedom to apply stricter or more relaxed privacy and security settings to their personal information depending on what they are doing online - that is, depending on their various digital personas.

We have various online personas: one for digital banking, one for sharing family photos (think Instagram), one for tracking our health - sleep, exercise, eating, medication, prescription renewals (think the Apple Health app) - and even one for our careers: think about what you share on LinkedIn compared with what you share on Facebook.

We want the option of applying different privacy settings to our different personas: extra security and privacy for family photos on Instagram, less for tracking our latest success with the Couch to 5K app.

The challenge is we have traditionally designed a one-size-fits-all privacy approach which is compliance-based rather than proactive. This approach has tried to retrofit privacy principles and standards into new digital services and offerings after these new capabilities have been operationalised.

We need to adopt a privacy-by-design approach which embeds privacy and ethical standards into all initiatives that collect, manage and use data to proactively protect customer rights. We need to design privacy services which enable people to choose their privacy preferences depending on their varied digital personas.

Give consumers more options than the binary ‘opt in or opt out’. Make privacy settings easy to change, as Facebook recently did by combining them all on a single page. Explain who your third parties are and what they will use customers’ information for, then offer consumers the right to say no. Write terms and conditions in jargon-free, non-legal language.
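To make the idea concrete, a consent framework that goes beyond binary opt-in/opt-out might record consent per persona and per purpose. The sketch below is hypothetical - it represents no particular vendor's API, and every class, field and purpose name is an assumption for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class PersonaPrivacy:
    """Privacy preferences for one digital persona (banking, health, social...)."""
    persona: str
    # Consent is recorded per purpose, not as a single yes/no for everything.
    purposes: dict = field(default_factory=dict)  # purpose name -> bool

    def allow(self, purpose: str) -> None:
        self.purposes[purpose] = True

    def deny(self, purpose: str) -> None:
        self.purposes[purpose] = False

    def is_allowed(self, purpose: str) -> bool:
        # Privacy by design: anything not explicitly consented to is denied.
        return self.purposes.get(purpose, False)

# Different settings for different personas, as the article suggests.
photos = PersonaPrivacy("family-photos")
photos.deny("third-party-sharing")

running = PersonaPrivacy("fitness-tracking")
running.allow("direct-marketing")

assert not photos.is_allowed("third-party-sharing")
assert not photos.is_allowed("profiling")          # never asked, so default-deny
assert running.is_allowed("direct-marketing")
```

The design choice worth noting is the default-deny in `is_allowed`: a purpose the customer was never asked about is treated as refused, which is the privacy-by-design posture rather than the opt-out-by-default one.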

It’s time to reconsider your organisation’s consent framework, your privacy and security rules and pay more attention to your Ts & Cs.

Privacy settings, data protection rules, cookies, APIs - if you can’t explain them to your 83-year-old Auntie Doris, it’s unlikely you are doing a good-enough job.

Michelle Pinheiro is Head of Enterprise Data Governance at ANZ

Illustrations provided by Chris Kelly, corporate caricatures & illustrations.

The views and opinions expressed in this communication are those of the author and may not necessarily state or reflect those of ANZ.
