Privacy by Design: An idea whose time has come
Can privacy be dialled into the digital world? John Leonard investigates the resurgence of a concept and why it is happening now
This article is the first in a series of pieces about Privacy by Design. More will follow over the coming weeks.
There was a lot of excitement four or five years ago about a new concept called Privacy by Design (PbD). Papers were written, conferences were organised and then, well, not much happened.
But NSA whistleblower Edward Snowden has given the concept a renewed lease of life. Ann Cavoukian (main picture), originator of the term and executive director of the Privacy and Big Data Institute at Ryerson University in Toronto, says that she has seen a real change.
"I believe we owe Mr Snowden such a debt of gratitude," she tells Computing.
"Certainly my job has been made much easier. I never have a problem convincing a company or a government entity of going in the direction of Privacy by Design. Since Mr Snowden I don't even have to make the case any more," she says.
Companies and governments alike are facing a crisis of trust.
"They are coming to me and saying 'tell us how to do this we need to ensure we can do this to engage the trust of our customers and citizens. We no longer have that'."
Snowden's leaking of NSA documents started a conversation around personal data and surveillance. It meant that out of sight was no longer out of mind, and that concern about one's online privacy need no longer be equated with having something to hide. Once people discovered what the spooks had been up to, they found it easier to express the sense of outrage felt by many about encroaching surveillance and the trade in our online selves without becoming the butt of "tin foil hat" jibes.
While efforts to give people more control of their personal data are many, they are scattered and disconnected. The umbrella of PbD has the potential to unify them to serve a common goal.
PbD is now referenced by UK data protection watchdog the Information Commissioner's Office (ICO). Most importantly, the phrase "Privacy by Design" will be included in the forthcoming EU General Data Protection Regulation (see Article 23), the first time it has appeared in a legal document. Slowly but surely the big political beasts are shifting ground. But for the average punter the most obvious signal of the way things are moving has been Apple's approach with the iPhone.
Not only are iPhones and iMessage now encrypted by default, but Apple does not have the encryption key, meaning it can't hand it over to the authorities. Apple's CEO Tim Cook has also been severely critical of other tech firms, which he says are dishonest in the way they mine their users' data for profit.
But even Facebook - one of the companies in Cook's sights - is in on the PbD act.
"Core to Facebook is trust", says chief privacy officer Erin Egan. "We understand that if people don't trust us then they're not going to continue to use Facebook. We are designing the principles of PbD into every product, every feature and every update."
What is Privacy by Design?
Cavoukian's framework is built on seven foundational principles:
1) Being proactive rather than reactive;
2) having privacy as the default setting;
3) having privacy embedded into design;
4) avoiding the pretence of false dichotomies, such as privacy vs. security;
5) providing full life-cycle management of data;
6) ensuring visibility and transparency of data; and
7) being user-centric.
The simplicity of the foundational principles makes them easy to understand and to translate into other languages. But it can make them ambiguous and hard to apply without being caveated to death with the legalese that is the very enemy of online privacy and transparency (that's why only 1 in 20 of us read the Ts&Cs before clicking 'accept').
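To see what the second principle, privacy as the default setting, might mean in practice, here is a minimal sketch in Python. It is illustrative only: the class and field names are hypothetical, not drawn from any real product.

```python
from dataclasses import dataclass

@dataclass
class SharingSettings:
    """Hypothetical user settings in which every field defaults to the
    most private option (PbD principle 2): sharing more widely requires
    an explicit opt-in, and inaction means nothing leaves the profile."""
    post_audience: str = "only_me"          # not "public"
    searchable_by_email: bool = False
    location_history_enabled: bool = False
    ad_tracking_opt_in: bool = False        # opt-in, never opt-out

def widen_audience(settings: SharingSettings, audience: str) -> None:
    """Widening the audience is only ever a deliberate user action."""
    settings.post_audience = audience
```

The point of the design is that a user who never touches the settings gets full privacy automatically, which is exactly what the second principle asks for.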
So before getting into the detail, it's helpful to consider what we mean by privacy in the digital age, why we care about it (if indeed we do) and how we are trying to design for it.
Privacy means different things to different people. One way to get a grasp on it is to consider whose noses we want to keep out of our private business. Here's a simple list:
1) People
2) Private companies
3) Government and public sector
There are things we share with entities in one category that we don't want to share with another. And within each category there are subdivisions too. We don't share the same things with a taxi driver or someone we meet at a party as we do with our families and friends. We treat each person differently depending on our relationship.
We share an awful lot with internet companies without even realising (have a look at Google Takeout if you want to find out just how much). In many ways they know more about us than we know about ourselves, human memory being frail in a way that hard drives are not, and we need to trust that they are not using our data to our detriment. And what about financial companies? How much does Visa know about your life compared with how much you know about Visa? And your supermarket?
In the public space we share intimate personal information voluntarily with our NHS doctor, trusting that it will go no further than is necessary to treat us; sharing the same information involuntarily with GCHQ is another matter entirely. Likewise with a private insurer.
Nothing to hide
The medium for our communications may be digital these days, but psychologically humans have barely left the Stone Age. We'd be an extreme psychological oddity if we were happy to share the same things with some anonymous spook in Cheltenham as with our friends. But we do that every day online. Whether the recipient is another human being or an algorithm really doesn't matter. The information is out there, potentially forever, and can be used in ways we can't control years into the future.
There are many, many reasons why those of us with "nothing to hide" would nevertheless rather not have to stand naked before the world in order to prove that point. Then there is the imbalance of power that the current situation brings about. Those insisting that we incrementally give up more and more data about our online comings and goings for the sake of "security" or even "extra features" tend to be notably protective of their own privacy, for example. What does this imbalance mean for democracy?
So, the "whys" behind Privacy by Design are many. They are also, by and large, uncontroversial. What has brought them into sharp focus now (as well as Snowden) is the final stage of negotiations over the GDPR. The tech giants are mostly American; the privacy laws in Europe are stronger than those in the US and very likely to get stronger still, despite the efforts of business lobbyists to defang them. If these companies are going to continue with their data-fuelled global businesses in one of the world's biggest markets, they are going to have to get a lot more clever about how they treat that data - either that or play hardball and be prepared to lose a lot of trust.
A positive-sum game
It doesn't have to be an either-or game, says Ann Cavoukian.
"The whole theme of privacy by design is not a zero-sum. It's not privacy versus security or privacy versus business interests, its privacy and those things. How you do both? How do you have multiple interests in case at the same time and make it a positive-sum? It is eminently doable but we have to be really smart about it and innovative and that's the challenge."
For some time the tide has been moving away from the assumption that people don't care about privacy any more. That was probably never actually true - more a brushing under the carpet of the dirtier aspects of the shiny digital age - and it certainly isn't true now.
A recent survey of 4,005 UK consumers commissioned by Digital Catapult found that 60 per cent were uncomfortable sharing personal data, with 14 per cent saying they refuse to share any personal data at all (good luck with that!). Rather than being worried about the NSA, governments or private companies in particular, by far the largest number (76 per cent) said they were simply concerned about having "no control over how their data is shared or who it is shared with".
Facebook, which didn't get where it is today by ignoring the way the wind is blowing, has latched on to the fact that a lack of control is people's biggest concern. A privacy and data security lawyer and a practitioner of PbD, Erin Egan was drafted in by the social media giant four years ago to apply its concepts to Facebook's services. She is particularly strong on the theme of control.
"We enable people to control the audience with whom they share their postings," she tells Computing.
"There are so many tools that enable people to control the data they have on Facebook. You can change the audience. We have a tool where you can take anything that you share publicly with one click and ratchet it down to share with only friends."
"People can delete any data they put up on Facebook; we are very happy to delete their data. We delete it from our servers within 60 days. It's gone. If people want to delete their data, we delete their data. Then there's the Download Your Information tool. They can download everything they've put on Facebook, close their account, and that's it."
Facebook also allows users to control what data goes to an application, she continues, and to control the type of ads they see through the ad preferences service.
"It's putting people at the centre. It's all about engendering control and building the trust." she says.
Fair enough. Certainly Facebook has introduced clearer, easier-to-understand privacy policies, and it is trying to educate users as to their options and to design the controls to make them more intuitive - all very much in the spirit of PbD. In terms of the "people" aspect of privacy from the list above (allowing users to control what other people see) it certainly seems to be making progress. There has also been some public pushback from the firm on the "government" side - as part of the Reform Government Surveillance coalition it has joined forces with other tech companies against the intelligence services' piggy-backing on its infrastructure (more about that in a future article).
But what about the second item on the list above, "private companies"? As an enormous processor of personal data, how much control does Facebook give its users over what it knows about them? On this the company is much less forthcoming. Even making its new tool, which lets users choose whether tracking data from "Likes" is used to serve them ads, opt-in (i.e. consistent with PbD's second principle) rather than opt-out seems to be a step too far.
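The opt-in/opt-out distinction matters most at the point where the data is actually used. Here is a hypothetical sketch, again in Python, of what an opt-in gate might look like; the User class, serve_ads function and tuple-based ad inventory are invented for illustration and bear no relation to Facebook's actual code.

```python
class User:
    def __init__(self, interests=None, ad_tracking_opt_in=False):
        # Privacy as the default: tracking is off until the user turns it on.
        self.interests = set(interests or [])
        self.ad_tracking_opt_in = ad_tracking_opt_in

def serve_ads(user, inventory):
    """inventory is a list of (ad_text, topic) pairs.

    With explicit consent, ads matching the user's interests are ranked
    first; without it, no interest data is consulted at all.
    """
    if user.ad_tracking_opt_in:
        return sorted(inventory, key=lambda ad: ad[1] not in user.interests)
    return list(inventory)  # no consent: untargeted, order unchanged
```

Under an opt-out model the constructor's default would simply be True, and that one-line inversion is precisely what PbD's second principle rules out.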
A different path
Could Facebook continue to innovate and grow with a different model, one that gives users more choice over what they share with the company? Will the new regulations coming out of Europe force companies whose business model depends on consuming ever more personal data to adopt a different path? There are people in the industry who believe they can and that they will. What's more, they believe they should.
There is a whole ecosystem of start-ups and academics bristling with ideas about how this could work - although many of these ideas are still at an early stage.
We'll look in more detail at the hows and the whos of PbD in a future article.
Privacy will also be a topic of discussion at our Enterprise Security & Risk Management Summit on 26 November. Registration is free for most delegates.