
How NIST’s Privacy Engineering Objectives Can Help You Build A Privacy Engineering Program

To get started in privacy engineering, these three objectives can help you identify the first steps in embedding privacy and respect into your organization's tech stack.

Developing Your Privacy Engineering Roadmap

Your legal team has tapped you to start up a privacy engineering program. With mounting privacy risk and unscalable processes, they understand that the GRC function cannot manage privacy alone—engineers need to be a part of the picture.

You know the problems well. Data subject requests (DSRs) take days to fulfill. Privacy code reviews occur only ad hoc, often months after problematic code has been deployed to production. The authors of your company’s privacy policy and the authors of its codebase are two mutually exclusive groups. You envision a future state with seamless DSRs, privacy reviews embedded in the SDLC, and policies enforced in the codebase. But getting there, or just getting started, can be overwhelming.

In this post, we dive into the three privacy engineering objectives from the National Institute of Standards and Technology (NIST). Combined with your deep knowledge of your organization’s tech stack, these objectives can help you set your privacy engineering roadmap.

About NIST’s Privacy Engineering Report

NIST is a US federal agency within the Department of Commerce. Founded in 1901, it is one of the foremost authorities in setting benchmarks and standards for technical disciplines. In 2017, NIST released a publication titled “An Introduction to Privacy Engineering and Risk Management in Federal Systems.” The publication offers outcomes-based, high-level guidance for any organization building a privacy engineering program.

The authors propose three objectives for privacy engineering, which we paraphrase as:

  • Predictability: Eliminate moments where any stakeholder is caught off-guard by PII processing activities
  • Manageability: Implement fine-tuned controls for each stakeholder to exercise over data flows
  • Dissociability: Minimize access to PII so that the only stakeholders accessing it are the ones who need it

Like the Fair Information Practice Principles (FIPPs) or the Privacy by Design framework, this set of overarching principles can be a north star in the implementation of privacy engineering initiatives. As the authors note:

“A system should exhibit each objective in some degree to be considered a system that can support an agency’s privacy policies… System designers and engineers, working with policy teams, can use the objectives to help bridge the gap between high-level privacy principles and their implementation within systems.”

Next, we explore each privacy engineering objective.

Privacy Engineering Objective 1: Predictability

A predictable system does not throw surprises at anyone who uses it. With this informal definition, we can see three important questions to ask at the outset of building a privacy engineering initiative:

  • Who are the stakeholders that interact with this system?
  • What kinds of privacy-related events would surprise a stakeholder?
  • What kind of experience would a stakeholder have with a reliable, zero-surprises system?

Each question adds an important layer of detail to inform your privacy engineering roadmap.

Identifying Stakeholders in a Privacy Engineering Project

The first question is an important exercise in identifying the categories of data subjects and those responsible for managing part of the company’s data system. Data subjects might be grouped under labels like employee, customer, and job applicant. Managers of a data system might include data engineers who maintain the infrastructure storing PII and lawyers who maintain privacy policies. Keep this list of stakeholders on hand and up-to-date. Maybe you’ll store this list as a table—doing so can help with the next two steps.


Stakeholders:

  • Customer
  • Data engineer
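If it helps to keep this registry in version control alongside your code, a minimal Python sketch might look like the following. All names, roles, and fields here are illustrative assumptions, not a prescribed schema:

```python
# Illustrative sketch of a stakeholder registry for a privacy engineering roadmap.
# Names and categories are hypothetical examples drawn from the text above.
from dataclasses import dataclass
from typing import List

@dataclass
class Stakeholder:
    name: str   # e.g. "customer", "data engineer"
    role: str   # "data subject" or "data system manager"
    notes: str = ""

registry: List[Stakeholder] = [
    Stakeholder("customer", "data subject"),
    Stakeholder("employee", "data subject"),
    Stakeholder("job applicant", "data subject"),
    Stakeholder("data engineer", "data system manager",
                "maintains PII-bearing infrastructure"),
    Stakeholder("lawyer", "data system manager",
                "maintains privacy policies"),
]

# Group stakeholders by role, mirroring the two categories in the text.
data_subjects = [s.name for s in registry if s.role == "data subject"]
print(data_subjects)  # ['customer', 'employee', 'job applicant']
```

Keeping the list as structured data makes it easy to extend with the extra columns introduced in the next two steps.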

Describing Incidents to Avoid in the Data System

The second question is an opportunity to game out scenarios to avoid. Iterating over your new list of stakeholders, describe events that would catch each one off-guard. For example, a customer would be surprised to start receiving emails from a business they’ve never heard of. In this hypothetical, your company shared the customer’s data with a third party without giving the customer fair notice. Let’s add a column to our table:


Events that would surprise this stakeholder:

  • Customer: A customer gets an email from an unknown third-party business because your company shared their contact information without the customer’s knowledge.
  • Data engineer: A data engineer is unable to access customer data needed to compile a tax report.

Identifying Ideal Privacy Engineering Outcomes

The third question gets at an opposite scenario: identifying ideal outcomes. For each entry in your list of stakeholders, identify how they interact with PII in your system and for what purpose. Think about what, where, and why. As a basic example: a customer provides contact information and payment information (the what), both of which are processed in the customers database (the where), for the purpose of delivering shipments to the customer (the why).

Repeat this step for all of your data subjects, as well as internal managers of the data system. For instance, a data engineer extracts customer payment information from the customers database for the purpose of compiling it alongside all other recent transactions for tax reporting. We start to develop a well-rounded picture of a predictable system:


Goal-state for how this stakeholder interacts with PII in the data system:

  • Customer: A customer’s contact information and payment information are processed in the customers database to deliver shipments to the customer. No other data is collected.
  • Data engineer: A data engineer can access customers’ contact information and payment information in the customers database as needed for routine business purposes. When additional permissions are needed, there is a standard procedure for reviewing, documenting, and approving them.

This process is an opportunity to confirm that your company’s data infrastructure covers each of the what/where/why statements you created, and that it doesn’t extend into extraneous use cases. It can be a starting point for identifying privacy engineering projects and priorities. For an example of predictability in action, check out how our CEO Cillian describes privacy behaviors in the codebase to root out unlawful use cases directly in the CI pipeline, before code ever touches users’ data. Code reviews against privacy policies help minimize surprises and create reliable systems.
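As a simplified, hypothetical illustration of this kind of CI-time review (not the actual Fides API), a check might compare the data uses a changeset declares against the uses the privacy policy permits:

```python
# Simplified illustration of a CI-time privacy check: compare the data uses a
# code change declares against the uses a privacy policy allows. The category
# and purpose names are illustrative assumptions, not a real policy format.

ALLOWED_USES = {
    ("customer.contact_info", "shipping"),
    ("customer.payment_info", "shipping"),
    ("customer.payment_info", "tax_reporting"),
}

def review(declared_uses):
    """Return the (data category, purpose) pairs the policy does not allow."""
    return sorted(set(declared_uses) - ALLOWED_USES)

# A hypothetical changeset declares a marketing use the policy never approved.
violations = review([
    ("customer.contact_info", "shipping"),
    ("customer.contact_info", "third_party_marketing"),
])
for category, purpose in violations:
    print(f"BLOCKED: {category} may not be processed for {purpose}")
```

In a real pipeline, a non-empty violation list would fail the build, so the unapproved use never reaches production.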

[Figure: a code block showing a privacy review running in the terminal]

Privacy Engineering Objective 2: Manageability

Each stakeholder should have the right kind of controls over PII. Looking back at the list of stakeholders, different controls make sense for different roles. For example, a California-based customer should be able to request a copy of their own PII, but not a copy of somebody else’s. Looking internally, a data engineer should be able to carry out an access request, but not set a user’s consent preferences, which the user has a legal right to express for themselves.

Under the “Predictability” objective, you have already specified how each stakeholder would interact with a zero-surprises system. Using this information, assess whether a technical system gives enough control to each stakeholder to carry out their roles. When receiving an access request, does a data engineer have the tools to comprehensively check all relevant data infrastructure for instances of the user’s personal data? On the flip side, ensure that each stakeholder does not have more control than they need. Are there technical measures in place to make sure that nobody but Jane Doe can receive a fulfilled access request for Jane Doe?
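One way to sketch this kind of role-scoped control is a permission map that grants each stakeholder only the actions their role requires. The roles, action names, and identifiers below are illustrative assumptions:

```python
# Minimal sketch of manageability: map each stakeholder role to the controls it
# may exercise, and refuse anything outside that set.

PERMISSIONS = {
    "data_engineer": {"fulfill_access_request", "fulfill_erasure_request"},
    "data_subject":  {"submit_access_request", "update_consent"},
}

def authorize(role, action, actor_id=None, subject_id=None):
    """Allow an action only if the role permits it, and only on one's own data
    when the actor is a data subject."""
    if action not in PERMISSIONS.get(role, set()):
        return False
    if role == "data_subject" and actor_id != subject_id:
        return False
    return True

assert authorize("data_engineer", "fulfill_access_request")
assert not authorize("data_engineer", "update_consent")      # reserved for the user
assert not authorize("data_subject", "update_consent",
                     actor_id="jane", subject_id="john")     # not their own data
```

The same structure answers both questions above: it enumerates what each stakeholder can do, and makes anything beyond that impossible by default.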

Privacy Engineering Objective 3: Dissociability

PII should remain dissociated from the individuals it describes, except when the system’s operations require a specific stakeholder to access that association.

Consider the different points of exposure for PII throughout your data flows. Are there opportunities to implement privacy-enhancing technologies that enable data processing without disclosing any PII? For instance, secure multiparty computation could allow your customers to submit sensitive information to a predictive model without disclosing this information to your data analysts.
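Full secure multiparty computation is beyond a short example, but a simpler privacy-enhancing measure, keyed pseudonymization, shows the same principle in miniature. The key name and record schema below are illustrative:

```python
# Sketch of dissociability via keyed pseudonymization: downstream analysts see
# a stable pseudonym instead of the customer's identifier. The key stays with
# the small set of stakeholders whose role requires re-association.
import hashlib
import hmac

PSEUDONYM_KEY = b"example-secret-key"  # illustrative; keep in a secret manager

def pseudonymize(customer_id: str) -> str:
    return hmac.new(PSEUDONYM_KEY, customer_id.encode(),
                    hashlib.sha256).hexdigest()[:16]

record = {"customer_id": "jane.doe@example.com", "order_total": 42.50}
analyst_view = {
    "customer": pseudonymize(record["customer_id"]),
    "order_total": record["order_total"],
}
# The same input always maps to the same pseudonym, so analysts can still join
# and aggregate per-customer data without ever seeing who the customer is.
assert pseudonymize("jane.doe@example.com") == analyst_view["customer"]
```

Because the mapping is deterministic under the key, per-customer analytics still work; only the association back to a real identity is confined to keyholders.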

Implementing technical measures that enforce dissociability while supporting innovative data processing is a sign of excellent privacy engineering. A tenet of privacy engineering is treating privacy and innovation as companions, rather than opponents. Embedding privacy in development starts and ends with having a clear picture of privacy risks and goals. Starting with these three objectives can help you begin building a reliable system with granular data controls and minimal PII exposure.

Privacy is embedded into cyclical processes that span design, development, testing, deployment, and review.

Tools to Support Your Privacy Engineering Endeavors

The three objectives for privacy engineering from NIST offer a set of guiding questions to shape a nascent privacy engineering program. They provide a powerful complement to other frameworks like Privacy by Design or the Fair Information Practice Principles. Each of these frameworks gives a different perspective on important aspects of PII processing, and getting familiar with more of these frameworks will help you ask the most relevant questions for your tech stack and your company.

Now that you are developing your privacy engineering roadmap, you might be searching for technical tools to implement predictability, manageability, and dissociability. The Fides devtools power proactive privacy engineering in the CI pipeline and at runtime, translating privacy policies into the codebase.


Ready to get started?

Our team of data privacy devotees would love to show you how Ethyca helps engineers deploy CCPA, GDPR, and LGPD privacy compliance deep into business systems. Let’s chat!