Expert Interview: Alex Pompe on Data for Good & PETs

Taja Naidoo
Privacy Public Policy Manager, TTC Labs, Meta

This is the first in a new series of interviews with privacy experts that we’ll be sharing on the TTC Labs website. The interviews will cover a range of topics at the intersection of privacy and design. Our first interviewee is Alex Pompe, Public Policy Research Manager, Data for Good at Meta. The interview was conducted in November 2021 by Denise Butler, Privacy Policy Manager at Meta. Our thanks to both Alex and Denise.

Welcome, Alex! Can you tell us a little about the “Data for Good” team at Meta and what you do?

Our job is to produce privacy-protecting data sets that can be shared externally and used by researchers and professionals in a variety of fields.

Once we have these data sets, they can go to NGOs, governments, and other partners. For example, we can share mobility data with governments responding to wildfires; it helps them know where to send firefighters and how best to allocate resources.

More recently, we’ve been looking at how human mobility changed because of pandemic lockdowns, and how human mobility affects the spread of pathogens. Using aggregated movement data, we are helping quantify the economic impact of these events on local businesses, and we are also looking at their long-term impact on climate change in terms of the distance and number of trips taken.

How did Data for Good get started?

In 2017, a data scientist noticed that the Canadian Red Cross was responding to wildfires in northern Alberta by looking at where users were reporting on the Facebook app. She was able to aggregate the relevant data and add noise (Differential Privacy), which protected the privacy of users, before sharing that information with first responders so they would know where to go.

Tell us about your COVID work?

We started tracking the impact of COVID-19 in January 2020 in much the same way as we had for other disaster responses. We were very quick to link up with our disaster response partners to see how we could help understand the spread of the disease using aggregated data that would reveal trends in movement patterns over time. We had already been working on these forms of aggregated mobility data for a malaria prevention campaign in Africa, and were able to pivot that work into a quick COVID-19 response.

How do you work with other teams at Meta?

I primarily collaborate with Data Scientists (mostly in Core Data Science). They have the expertise on a particular type of data captured in our products, and they bring proposals on where and when they want to share it.

At the R&D level, we work with an engineering and product team called Central Social Impact - Data and Transparency. They build the websites, the visualizations, and the surfaces that our external community interacts with. We also spend a ton of time working with the privacy-focused policy and product teams, sharing expertise on privacy-enhancing technologies (PETs).

We engage with Privacy Engineering on Privacy Preserving Machine Learning (PPML) and core PETs. We want to make use of those technologies and showcase examples that sit outside of an ads targeting model. We want a reservoir of approachable PET examples, and we want Meta to be an ethical contributor in the world.

Tell us about your work on PETs.

Most of the data sets that Data for Good builds have typical privacy protections (noise, minimum thresholds, etc.).

Before COVID-19, we wouldn’t release full data sets if there was any Meta user data involved, as we wanted to protect our users. During COVID-19, we didn't have the means to onboard every government in the world onto our trusted partner program, so we figured out how to release this data with stronger protections: Differential Privacy (DP), which quantifies the reidentification risk for an individual.
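To make the aggregation-plus-noise approach concrete, here is a minimal, hypothetical sketch of the Laplace mechanism applied to per-region counts, combined with the minimum-threshold suppression mentioned above. It is illustrative only and not Meta's implementation; the function name, parameters, and epsilon value are assumptions.

```python
import numpy as np

def dp_release(counts, epsilon=1.0, sensitivity=1.0, min_threshold=10):
    """Add Laplace noise to per-region counts, then drop small noisy cells.

    counts: dict mapping a region id to the number of people counted there.
    epsilon: privacy budget; smaller values add more noise (stronger privacy).
    sensitivity: how much one person can change any single count (1 here,
        assuming each person contributes to at most one region).
    min_threshold: suppress regions whose noisy count falls below this value.
    """
    scale = sensitivity / epsilon
    noisy = {}
    for region, count in counts.items():
        noisy_count = count + np.random.laplace(loc=0.0, scale=scale)
        if noisy_count >= min_threshold:
            noisy[region] = round(noisy_count)
    return noisy

# Example: per-tile counts of people, before sharing with first responders.
raw_counts = {"tile_a": 120, "tile_b": 4, "tile_c": 57}
print(dp_release(raw_counts, epsilon=0.5))
```

Smaller epsilon values add more noise and give stronger privacy guarantees, at some cost to the accuracy of the released counts.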

Other PETs we are working on include federated analytics with local Differential Privacy. Overall, we are working to implement federated analytics to reduce the amount of user-level location history data we collect. We are taking it one step at a time to ensure that we maintain accuracy compared to the traditional (centralized) way of collecting this data.

The first step involves using the federated analytics approach to collect location history data from a small sample of 1.5% of Meta users on Android, and we're very happy with the results: we can collect the location data we need without any user-level data, and we've actually seen some improvements in accuracy. Next steps will be to roll out the experiment to more than 1.5% of users and to begin exploring a client-side differential privacy implementation. In addition, we’re also building capabilities in MPC and homomorphic encryption, which allow health departments to run their models on our platform.
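To illustrate the client-side (local) differential privacy direction Alex describes for federated analytics, here is a minimal, hypothetical sketch using randomized response: each device perturbs its own answer before reporting, and the server debiases the aggregate. This is not Meta's implementation; the function names, parameters, and epsilon value are assumptions.

```python
import math
import random

def local_dp_report(visited: bool, epsilon: float = 1.0) -> bool:
    """Randomized response: keep the true answer with probability e^eps / (e^eps + 1)."""
    p_truth = math.exp(epsilon) / (math.exp(epsilon) + 1)
    return visited if random.random() < p_truth else not visited

def estimate_visits(reports, epsilon: float = 1.0) -> float:
    """Server-side debiasing: invert the randomization to estimate the true count."""
    p = math.exp(epsilon) / (math.exp(epsilon) + 1)
    n = len(reports)
    observed = sum(reports)  # True counts as 1, False as 0
    return (observed - n * (1 - p)) / (2 * p - 1)

# Simulated check: 10,000 devices, about 30% of which truly visited a region.
devices = [random.random() < 0.3 for _ in range(10_000)]
reports = [local_dp_report(v) for v in devices]
print(round(estimate_visits(reports)))  # close to 3,000 in expectation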

Almost all of the work on Data for Good, especially around enhancing technologies, started organically through conversations with both internal and external collaborators. I’d encourage anyone who has ideas on new datasets and privacy protections to reach out to our program through our website and share your ideas!

Thank you, Alex!
