In our first issue of 2022 we delve deeper into the world of Privacy Enhancing Technologies, otherwise known as PETs.

Our main story covers a recent workshop involving Meta’s Privacy & Data Public Policy Team and external experts. It introduces the concept behind PETs: what they are and why they matter. Not only is it a great primer for those new to the space, but it also offers significant insight to anyone already knowledgeable about PETs.

We then move on to two real-world examples where PETs are being used. The first is healthcare, where privacy concerns are a key reason why only 3% of the sector's data is used. The second concerns Meta’s Data for Good programme and its response to COVID-19 – specifically balancing accurate data mapping with privacy protection. The latter also features an interview with Data for Good’s Alex Pompe, who provides an expert view.

We then round off with some research that explores how experts and non-experts view PETs, and what that means for UX designers and privacy policy professionals.

What’s New from TTC Labs

Workshop | Privacy Enhancing Technologies


In December 2021, Meta’s Privacy & Data Public Policy Team and external privacy experts from academia, industry and elsewhere workshopped some of the issues around PETs.

While the term is not new to many working in data privacy, there is still a great deal to learn about the transparency, accountability, and assurance that PETs offer. PETs are based on cryptographic and statistical techniques, and while there is currently no universally accepted definition of which technologies and governance methods constitute a PET, the three main methods are data-shielding, data-altering, and computation-altering.
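As a minimal, hypothetical sketch of the data-altering category (our illustration, not something discussed at the workshop), one widely used technique is replacing direct identifiers with salted hashes before a dataset is shared:

```python
import hashlib

def pseudonymise(user_id: str, salt: str) -> str:
    """Replace a direct identifier with a salted SHA-256 hash.

    A sketch of one data-altering technique only; real PET
    deployments typically use keyed constructions (e.g. HMAC)
    with rotated secrets rather than a plain salted hash.
    """
    return hashlib.sha256((salt + user_id).encode()).hexdigest()

# Hypothetical record from the workshop's train-company scenario
record = {"user_id": "alice@example.com", "station": "Kings Cross"}
record["user_id"] = pseudonymise(record["user_id"], salt="2022-q1")
```

The altered record can still support analysis such as counting journeys per station, while the original identifier never leaves the collection point.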

The Data Dialogue was a mixture of talks and interactive workshops, each aiming to tease out some of the tensions between extracting value from data and ensuring users are protected. This was done through two hypothetical scenarios involving a train company and a dating app.

There were a number of thought-provoking takeaways from the session, especially around how sensitive data can deliver the highest value when planning a new train service, but also carries the most risk when used. In the coming months, the participants will continue to look at how PETs can be used, which we will cover in future issues of this newsletter.

Insights

Transparency champions | PETs and Healthcare Innovation


While the healthcare sector is said to generate 30% of the world's data, only 3% of it is used. Privacy concerns are a major reason for this gap, which is unsurprising given how sensitive much of that data is. However, a range of PETs is emerging to close it. This piece from Codex discusses the five technologies central to helping researchers access this data, so they can begin solving some of the world’s most pressing healthcare concerns.

Inspiring ideas | Data for Good Help Combat COVID-19

Data for Good is a Meta programme designed to help health researchers and NGOs better combat the pandemic while protecting user privacy. It is built around population movement maps based on the data of mobile Facebook users. To ensure their privacy, all data is aggregated and protected with a differential privacy framework. The programme currently features an interactive map and dashboard showing COVID-19 and flu symptom rates across the world, as well as a Facebook Movement Range Data Map that helps healthcare professionals understand how people are responding to distancing measures.
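To give a flavour of what a differential privacy framework involves at its core (a hedged sketch, not Meta's actual implementation), the classic Laplace mechanism adds calibrated noise to each aggregated count before release:

```python
import math
import random

def laplace_noise(scale: float) -> float:
    # Inverse-CDF sampling from a Laplace(0, scale) distribution.
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def dp_count(true_count: int, epsilon: float, sensitivity: float = 1.0) -> float:
    """Release a count with epsilon-differential privacy.

    Noise scale = sensitivity / epsilon: a smaller epsilon means
    stronger privacy and a noisier output. A sketch only; production
    systems also clamp results, post-process, and track a privacy
    budget across releases.
    """
    return true_count + laplace_noise(sensitivity / epsilon)

# Hypothetical example: people moving between two map regions
noisy = dp_count(1280, epsilon=0.5)
```

Because the noise is calibrated to how much any one person can change the count, the released figure stays useful for mapping population movement while limiting what it reveals about any individual.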

Expert interview | Alex Pompe, Data for Good, Meta


In the first of a series of interviews with privacy experts, we speak to Alex Pompe, Public Policy Research Manager at Meta’s Data for Good team. Here he explains what the team does and why, and how the lessons they learnt while working on a malaria prevention campaign in Africa were used to help support the world’s COVID-19 response. Alex also details how he and his team work to support Meta’s R&D through emerging PETs.

Research | Understanding Privacy-Enhancing Technologies


To gain a better understanding of people’s experiences with privacy-enhancing, on-device learning, Meta’s UX Research Team surveyed 16 experts and 16 non-experts. They found that both groups view on-device learning as complex, with some of the language used to describe it seen as a potential blocker for non-experts. Both groups suggested that any organisation offering on-device learning should focus on explaining the likely outcomes of the technology, together with how it works. The research team was also able to identify five key pieces of information that non-experts would find helpful.