PETs: the technologies organizations should consider adopting

In this article, we cover:
  • some examples of PETs and their applications;
  • the framework of Privacy by Design in which PETs exist;
  • the opportunities these technologies hold, besides compliance.

Privacy Enhancing Technologies

Under this category, we find a collection of tools and methods. The underlying technologies are quite diverse, but they all share one core purpose: mitigating privacy risk and protecting personal data.

PET categories & the privacy risk taxonomy

It’s hard to be exhaustive when listing PETs: they comprise a diverse set of tools and approaches. Just as there is no single definition for them, there is no commonly accepted taxonomy.


Figure: Taxonomy for PETs suggested in a study for the Danish Government (Meta Group, 2005)

Figure: A simple representation of the data lifecycle

Figure: Various risks from Solove’s taxonomy of privacy threats


Examples of PETs

1) Trusted Execution Environments (TEE)

TEEs are a hardware technology focused on securing the processing of sensitive data. A TEE is a secure, isolated area inside a processor that keeps data protected and inaccessible to the rest of the system while it is being processed.

2) Zero-Knowledge Proofs (ZKP)

ZKP is a set of methods for proving a statement without revealing the sensitive data behind it. A ZKP involves two parties: the prover and the verifier. The prover convinces the verifier that it possesses a piece of information without ever having to expose that information.
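To make the prover/verifier interaction concrete, here is a minimal, non-production sketch of a Schnorr-style proof of knowledge in Python. The prime p and generator g are toy values assumed for this example only; real deployments use standardized groups and audited cryptographic libraries.

```python
# Minimal Schnorr-style proof-of-knowledge sketch (illustration only).
import secrets

p = 2**127 - 1   # toy modulus (a Mersenne prime)
g = 3            # toy generator

# The prover's secret x and the public value y = g^x mod p
x = secrets.randbelow(p - 1)
y = pow(g, x, p)

# 1. Commitment: the prover picks a random nonce r and sends t = g^r mod p
r = secrets.randbelow(p - 1)
t = pow(g, r, p)

# 2. Challenge: the verifier replies with a random challenge c
c = secrets.randbelow(p - 1)

# 3. Response: the prover sends s = r + c*x. (Real Schnorr reduces s modulo
#    the group order; the unreduced form here just keeps the arithmetic simple.)
s = r + c * x

# 4. Verification: g^s must equal t * y^c (mod p), which only holds if the
#    prover really knows x -- yet x itself is never transmitted.
assert pow(g, s, p) == (t * pow(y, c, p)) % p
print("Verifier accepts: the prover knows x without revealing it.")
```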

3) Secure multi-party computation (MPC)

MPC is a collection of protocols for processing data from many sources without revealing any source’s input to the others. You combine sensitive data for analysis and get access to the results, but you never see the other parties’ inputs. MPC is closely related to ZKP, and zero-knowledge techniques are often used as building blocks inside MPC protocols. A toy sketch follows below.
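The sketch below illustrates one MPC building block, additive secret sharing: three hypothetical parties learn only the total of their private values. The names and numbers are made up; real MPC frameworks add networking, protections against malicious parties, and much richer computations.

```python
# Toy additive secret sharing, one building block of MPC (illustration only).
import secrets

MODULUS = 2**61 - 1   # all arithmetic happens modulo a large prime

def share(value, n_parties):
    """Split a private value into n additive shares that each look random."""
    shares = [secrets.randbelow(MODULUS) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % MODULUS)
    return shares

# Each party's private input, never revealed in the clear
private_inputs = {"alice": 52_000, "bob": 61_000, "carol": 48_000}

# Every party splits its input and hands one share to each participant
all_shares = {name: share(v, 3) for name, v in private_inputs.items()}

# Party i locally adds up the i-th share of every input...
partial_sums = [sum(all_shares[name][i] for name in private_inputs) % MODULUS
                for i in range(3)]

# ...and only the combined total is ever reconstructed
total = sum(partial_sums) % MODULUS
print("Joint total:", total)   # 161000, with no individual salary exposed
```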

4) Differential privacy

Differential Privacy is an approach that preserves privacy while processing data. You do so by adding calibrated noise to the results of queries on a dataset. The noise bounds how much any single individual can influence the output, which makes it very hard to identify anyone in the data while keeping the aggregate statistics useful.
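Here is a minimal sketch of the most common mechanism, Laplace noise added to a counting query. The dataset, the predicate, and the epsilon value are illustrative choices only; production systems also track the overall privacy budget across queries.

```python
# Toy Laplace-mechanism count query (illustration only).
import numpy as np

rng = np.random.default_rng(seed=0)

ages = np.array([34, 45, 29, 61, 52, 38, 47, 55])   # toy dataset

def dp_count(data, predicate, epsilon):
    """Return a differentially private count of records matching a predicate."""
    true_count = int(np.sum(predicate(data)))
    sensitivity = 1   # adding or removing one person changes a count by at most 1
    noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon)
    return true_count + noise

# "How many people are over 40?" -- the noisy answer shields individuals
# while staying close to the true aggregate.
print(dp_count(ages, lambda d: d > 40, epsilon=1.0))
```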

5) Homomorphic Encryption (HE)

HE is a set of methods for processing data without ever exposing it. It allows running operations directly on an encrypted form of the sensitive data: you obtain the results, which only the key holder can decrypt, without ever being exposed to the raw data itself.
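As a small example, the snippet below uses the open-source python-paillier (`phe`) package, which implements the Paillier scheme, an additively homomorphic flavour of HE (fully homomorphic schemes support arbitrary computations). The salary figures are made up for illustration.

```python
# Additively homomorphic encryption with python-paillier (pip install phe).
from phe import paillier

public_key, private_key = paillier.generate_paillier_keypair()

# A third party only ever sees ciphertexts...
enc_a = public_key.encrypt(52_000)
enc_b = public_key.encrypt(61_000)

# ...yet can still compute on them: Paillier supports addition of ciphertexts
# and multiplication by a plaintext constant.
enc_total = enc_a + enc_b
enc_tripled = enc_a * 3

# Only the private-key holder can read the results
print(private_key.decrypt(enc_total))    # 113000
print(private_key.decrypt(enc_tripled))  # 156000
```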

6) Synthetic data

Data synthesis is also an approach designed to help protect data privacy. Instead of working directly with sensitive data, you generate synthetic data that holds the same overall statistical properties as the original. Synthetic data points are less likely to trace back to the original records, which helps protect the privacy of the people behind them.
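A deliberately simple sketch of the idea: fit a basic generative model (a multivariate Gaussian) to stand-in data and sample new records from it. Real synthetic-data tools use far richer generative models and add privacy evaluations on top.

```python
# Toy synthetic-data generation (illustration only).
import numpy as np

rng = np.random.default_rng(seed=42)

# Stand-in for the original sensitive data: two numeric columns (e.g. age, income)
original = np.column_stack([
    rng.normal(40, 10, size=500),
    rng.normal(55_000, 12_000, size=500),
])

# "Train" the model: estimate the mean vector and covariance matrix
mean = original.mean(axis=0)
cov = np.cov(original, rowvar=False)

# Sample synthetic records that preserve the overall statistical structure
synthetic = rng.multivariate_normal(mean, cov, size=500)

print("Original  means:", original.mean(axis=0).round(1))
print("Synthetic means:", synthetic.mean(axis=0).round(1))
```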

PETs and the concept of Privacy by Design

Many regulatory frameworks mention the concept of Privacy by Design. The same discussion that introduced PETs as a means of protecting privacy 25 years ago also introduced the concept of Privacy by Design.


Figure: The 7 Foundational Principles of the Privacy by Design framework (for the complete description, see the report)

From ‘Privacy’ to ‘Data Protection’ by Design

2016 was a big year for PETs. With Article 25, the GDPR introduced “Data Protection by Design” into the regulatory landscape.

Why PETs matter to businesses

Besides compliance with data protection laws, PETs present several advantages:




