Our team at Ethyca attended the PEPR 2022 Conference in Santa Monica, in person and virtually, on June 23rd and 24th. After two days of great presentations on the current state of privacy engineering and where the field is headed, we compiled our three main takeaways.
The heat was on at PEPR last week! The annual conference on Privacy Engineering Practice and Respect is one of the world’s premier gatherings for privacy engineers in industry and academia. Virtual and in-person participants presented for two days, addressing a variety of technical challenges in privacy, such as differential privacy and threat modeling, as well as more user-facing concerns like consent controls and privacy labels.
The conference affirmed that, more than ever, engineering plays a central role in mitigating privacy risks and building trustworthy technology. In this blog post, we reflect on the top three themes from PEPR 2022.
A common theme that presenters touched on was usability throughout the data lifecycle, from user experience upstream to software development. For example, Dr. Lorrie Cranor and her team’s research on cookie banners highlighted the pervasiveness of dark patterns in the privacy controls users encounter on websites. The ambiguity around the purpose of “performance” or “functional” cookies can confuse consumers and prevent them from exercising meaningful control over their data. Presenting users with more information does not by itself make a privacy control more effective; without attention to ease of use and readability, it can overwhelm them.
Research from Dr. Cranor’s team also cautions businesses against requiring users to go to a separate page to limit cookie options. In fact, the research finds that it’s more appropriate not to include cookie banners at all, as long as the website is legally in the clear to implement such a design.
Like with end-users, software developers need clearly defined controls around the privacy behaviors in their code. Talks by Dr. Vadym Doroshenko and Dr. Michael Hay addressed the importance of developer-friendly tools for implementing differential privacy. When privacy software developers are able to enforce sophisticated privacy guarantees on their data releases, the benefits carry downstream. For instance, the product team might benefit from privacy-preserving product analytics with less need for data suppression.
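To make this concrete, here is a minimal sketch of the kind of developer-facing primitive such tools provide: a differentially private mean built on the Laplace mechanism. The `laplace_noise` and `dp_mean` helpers are our own illustrative names, not APIs from the tools presented at PEPR.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Draw one sample from a zero-centered Laplace distribution via inverse CDF."""
    u = random.random() - 0.5
    return -scale * math.copysign(math.log(1 - 2 * abs(u)), u)

def dp_mean(values, epsilon: float, lo: float, hi: float) -> float:
    """Differentially private mean of values clamped to [lo, hi].

    Changing one of n bounded records moves the mean by at most (hi - lo) / n,
    so that quantity is the sensitivity, and the Laplace noise scale is
    sensitivity / epsilon.
    """
    clamped = [min(max(v, lo), hi) for v in values]
    n = len(clamped)
    sensitivity = (hi - lo) / n
    true_mean = sum(clamped) / n
    return true_mean + laplace_noise(sensitivity / epsilon)

random.seed(42)
result = dp_mean([0.2, 0.4, 0.6, 0.8] * 25, epsilon=1.0, lo=0.0, hi=1.0)
```

Because the sensitivity of a bounded mean shrinks as the dataset grows, the same privacy budget buys a more accurate answer on a larger dataset, which is exactly the downstream benefit for product analytics.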
A number of researchers presented on labeling systems for privacy behaviors, focusing on usability for developers. Marc-Antoine Paré shared a system for describing the wide variety of personal data collected by an autonomous vehicle’s data systems. Paré highlighted how a straightforward privacy labeling system enables a number of core privacy engineering tasks, such as ad-hoc data mapping and fulfilling subject access requests.
These talks on usability affirmed that intuitive and descriptive privacy tools will help expand the adoption of privacy engineering practices.
Another theme from the PEPR conference was the need to standardize privacy engineering best practices. Speakers from many different organizations emphasized the importance of establishing basic terms and definitions for the field.
For example, Cara Bloom presented a taxonomy for describing components of privacy threats in order to build a privacy threat model. This hierarchical ontology was created to help risk management teams and organizations better understand what they are defending against, and make more threat-informed decisions to protect users’ privacy.
Bloom’s privacy threat taxonomy was just one of the calls for privacy professionals to develop a standardized way to talk about privacy. Eivind Arvesen also brought up the need to establish the basics of privacy engineering in his talk about privacy design flaws. He argued that developers are still ill-informed and ill-equipped to design data architecture that enables user privacy.
“We need to empower and enable non-privacy experts with technical resources, and better communication between ourselves.”
Arvesen concluded that creating standards for privacy engineering best practices will help developers design more trustworthy data systems, and communicate better across teams.
PEPR presenters emphasized that a vocabulary of clearly defined terms and relationships for describing privacy behaviors is the basis for nuanced privacy engineering at scale. This mindset closely aligns with our own devtools approach to privacy engineering, built on the Fides taxonomy. Dr. Harshvardhan J. Pandit echoed the same sentiments in his talk about the Data Protection Vocabulary (DPV), which described a taxonomy for data modeling and data privacy, and outlined the benefits companies stand to gain from adopting it.
That so many academics and businesses are creating their own taxonomies suggests that taxonomy-based approaches are the best hope for addressing core privacy engineering issues. At Ethyca, we believe in this approach, too. That’s why we created our own Fides taxonomy to describe data types.
There are certain characteristics that we believe make the Fides taxonomy uniquely suited to practical implementation. Fides Lang is lightweight and extensible, so it lowers the barrier to entry for developers to start engineering privacy into their systems. What’s more, the Fides platform offers a suite of tools built on top of Fides Lang that developers can use to solve today’s pressing privacy issues, such as data mapping, data subject requests, and automated privacy checks in CI. This is just one example of how a taxonomy-based approach can serve as a foundation for solving privacy problems.
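As a toy illustration of why hierarchical taxonomy keys are useful: the dotted keys below imitate the style of the Fides taxonomy, but the schema and helper are hypothetical sketches, not Fides Lang itself.

```python
# Hypothetical sketch: annotate dataset fields with hierarchical
# data-category keys (styled after the Fides taxonomy), then use the
# hierarchy to answer a privacy-engineering question mechanically.
SCHEMA = {
    "email":      "user.contact.email",
    "street":     "user.contact.address.street",
    "session_id": "system.operations",
    "name":       "user.name",
}

def fields_in_category(schema: dict, category: str) -> list:
    """Return fields whose data category equals, or is nested under, `category`."""
    return sorted(
        field for field, key in schema.items()
        if key == category or key.startswith(category + ".")
    )

# A subject access request for all user-related data becomes a lookup:
user_fields = fields_in_category(SCHEMA, "user")             # ['email', 'name', 'street']
contact_fields = fields_in_category(SCHEMA, "user.contact")  # ['email', 'street']
```

Because the keys are hierarchical, a query for `user` automatically covers every nested category, which is what makes tasks like data mapping and subject access requests mechanical rather than manual.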
Data minimization was top-of-mind at PEPR: how to restrict data collection and processing only to necessary business activities. The concept is made all the more topical with its prominent place in the proposed American Data Privacy and Protection Act (ADPPA), which was formally introduced in the U.S. House of Representatives just as PEPR began. The privacy community has highlighted how the ADPPA terms on data minimization could transform expectations on how software gathers and processes personal data.
Several speakers referenced the tensions between data minimization and technical privacy guarantees made possible by privacy-enhancing technologies. For instance, differentially private analytics are relatively less noisy when the dataset is larger; there is a lot of math at work here, but the intuition is that it is easier to obscure an individual within a large group than within a small one. While more data would increase accuracy and reduce statistical noise, it also pushes against the principle of data minimization. This tension is especially relevant when, as was the case for some of the PEPR speakers, the data in question is involved in high-risk processing.
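The dataset-size intuition can be sketched numerically. Assuming the standard Laplace mechanism for a mean over values clamped to a known range, the noise scale is the query's sensitivity divided by the privacy budget epsilon:

```python
def laplace_scale_for_mean(n: int, epsilon: float, lo: float, hi: float) -> float:
    """Noise scale for a differentially private mean of n values in [lo, hi].

    Changing one record moves the mean by at most (hi - lo) / n, so the
    Laplace scale is that sensitivity divided by epsilon.
    """
    sensitivity = (hi - lo) / n
    return sensitivity / epsilon

# Same privacy budget, growing dataset: the noise needed shrinks linearly.
# e.g. n=100 gives scale 0.01; n=10_000 gives scale 0.0001.
scales = {n: laplace_scale_for_mean(n, epsilon=1.0, lo=0.0, hi=1.0)
          for n in (100, 1_000, 10_000)}
```

Holding epsilon fixed, ten times the data means one tenth the noise, which is precisely why accuracy goals pull toward collecting more data while minimization pulls the other way.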
If one privacy mechanism requires more-than-minimal data collection, how can privacy engineers and analysts balance the simultaneous needs? It’s an open question, and it often depends on the privacy needs of a given situation.
Implementing data minimization can also improve users’ experiences, as in the case of processing gender and other demographic data. To take the case of gender data: systems that collect or assume gender data could be gathering information that is inaccurate or misleading, or that poses greater privacy risk than the user might anticipate. Furthermore, a company might not have a legitimate business purpose to collect or process gender data in the first place. When such data is not necessary, halting its collection and processing is not only beneficial from a privacy risk standpoint; it also stands to reduce potential discrimination and make fairer tech.
That concludes this year’s PEPR Conference. Our team at Ethyca is honored to have once again sponsored such an incredible event in the service of promoting great work in privacy engineering. We saw an array of amazing talks about what is happening in the privacy engineering landscape today, and how things will change in the future. We Ethycans leave PEPR 2022 excited for the future of privacy engineering and Privacy-as-Code. Feel free to explore the full PEPR program to engage with the research yourself, and watch for the full recordings of PEPR talks on the USENIX website in the coming weeks.
A big final thank-you to all of the speakers and the event organizers!