The Seven Core Principles of Privacy by Design



February 6, 2018 by Felix Bauer

As both the risks and opportunities of data increase, there is a corresponding increase in pressure on companies to build what’s known as “Privacy by Design” or PbD into their data practices.

This idea comes from Ann Cavoukian, the former Information & Privacy Commissioner of Ontario. With the notorious GDPR set to take effect in May 2018, privacy by design will become a legal requirement (rather than a nice-to-have) for any business subject to the new regulation.

That’s because PbD is built into the very language of the GDPR. Article 25 of the GDPR, “Data protection by design and by default”, states that:

…The controller shall, both at the time of the determination of the means for processing and at the time of the processing itself, implement appropriate technical and organisational measures, such as pseudonymisation, which are designed to implement data-protection principles, such as data minimisation, in an effective manner and to integrate the necessary safeguards into the processing in order to meet the requirements of this Regulation and protect the rights of data subjects.
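
To make that language a little more concrete, here is a minimal Python sketch of the two measures Article 25 names as examples: pseudonymisation and data minimisation. The field names, the HMAC-based pseudonym, and the key handling are illustrative assumptions on our part, not a compliance recipe and not a description of how Aircloak Insights works internally.

```python
# Illustrative sketch only: pseudonymisation via keyed hashing (HMAC-SHA256)
# and data minimisation (keeping only the fields a given purpose requires).
import hmac
import hashlib

SECRET_KEY = b"store-me-in-a-key-management-system"  # never hard-code a key in practice


def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a stable, keyed pseudonym."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()


def minimise(record: dict, needed_fields: set) -> dict:
    """Drop every attribute the processing purpose does not require."""
    return {k: v for k, v in record.items() if k in needed_fields}


raw = {"email": "jane@example.com", "age": 34, "city": "Berlin", "favourite_colour": "green"}

processed = minimise(raw, needed_fields={"age", "city"})
processed["user_pseudonym"] = pseudonymise(raw["email"])
print(processed)  # {'age': 34, 'city': 'Berlin', 'user_pseudonym': '...'}
```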

There has been some lack of clarity about what exactly “Privacy by Design” means. So to help clarify, we return to the words of Cavoukian. In defining the concept, she set forth seven key principles that make up the philosophical heart of PbD. Here they are, each accompanied by a brief explanation of how Aircloak Insights helps implement that principle for its customers.

Proactive not Reactive; Preventative not Remedial. The time to think about privacy is at the beginning of the design process, not after a breach. Since Aircloak Insights was created with privacy protection as its utmost goal, we have spent a lot of time building this principle into the core of everything we do for our clients.

Privacy as the Default. Giving customers maximum privacy should be a baseline part of your offerings. This might include things like explicit opt-ins, consumer data safeguards, restricted sharing, minimized data collection, and clear retention policies.

Privacy Embedded into Design. Software developers are often tempted to put security features on the back burner in their rush to implement core functionality, and testing for common vulnerabilities often falls through the cracks as well. PbD dictates that privacy should be seen as core functionality and prioritized appropriately.

Full Functionality — Positive-Sum not Zero-Sum. This is all about creating a PbD culture, one where privacy policies drive rather than hinder revenue and growth.

End-to-End Security — Full Life Cycle Protection. PbD privacy protections should follow your data wherever it goes, from the moment it is created, through every time it is shared, to the point where it is finally archived.

Visibility and Transparency. This principle builds consumer trust, a significant competitive advantage. Privacy disclosures should be easily available and written in non-legalese. Customers need a clear redress mechanism and lines of responsibility need to be clearly defined.

Respect for User Privacy. PbD makes it clear that users are the owners of their data. Users must be given the ability to update and correct any data held about them. They must also be the only ones who can grant and revoke access to their data.

How Aircloak Implements Privacy by Design

Here at Aircloak, we live and breathe Privacy by Design. It’s what we spend all day obsessing over: using privacy protection to drive value. And knowing the vital responsibility we have for the data we are entrusted with, we never take shortcuts.

For example, Insights regards all user data as potentially identifying, and every system is set up by default with noise and aggregation levels that fully anonymize. Raw data can be edited and even deleted at any point, and with the right structures in place, end users can have full control over their data.
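
To give a rough sense of what noise and aggregation mean in practice, here is a small Python sketch of an aggregate-only query layer that suppresses small groups and perturbs each count with random noise. The threshold and noise parameters are invented for the example; this is not the actual Aircloak Insights algorithm, whose details are described in our peer-reviewed papers.

```python
# Illustrative sketch only: return per-group counts, suppress rare groups,
# and add random noise to each reported count. Parameters are hypothetical.
import random
from collections import Counter

MIN_GROUP_SIZE = 5   # hypothetical suppression threshold
NOISE_STDDEV = 2.0   # hypothetical noise level


def noisy_group_counts(rows: list, group_by: str) -> dict:
    """Count users per group, dropping small groups and noising the rest."""
    counts = Counter(row[group_by] for row in rows)
    result = {}
    for group, count in counts.items():
        if count < MIN_GROUP_SIZE:
            continue  # too few users: report nothing rather than risk re-identification
        result[group] = max(0, round(count + random.gauss(0, NOISE_STDDEV)))
    return result


rows = [{"city": "Berlin"}] * 12 + [{"city": "Kaiserslautern"}] * 3
print(noisy_group_counts(rows, "city"))  # e.g. {'Berlin': 11}; the small group is suppressed
```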

But you don’t have to take our word for it. Aircloak’s approach is one of the few to be fully documented in peer-reviewed papers. We’ve even created a bounty program challenging anyone to try to de-anonymize data processed by Insights. We’re as transparent as it gets in anonymization.

Interested in learning more? Get in touch.

