It’s hidden in plain sight: in order to successfully manage data privacy, companies must successfully manage data. Engineering teams must extend the technical workflows and processes they already apply to other realms of data operations, such as ETL and analytics, to privacy management within the development life cycle. In doing so, they practice privacy engineering.
By one common definition, privacy engineering is “…an emerging field of engineering which aims to provide methodologies, tools, and techniques to ensure systems provide acceptable levels of privacy.”
Here at Ethyca, we like to think of privacy engineering as the act of baking privacy into systems rather than bolting it on after the fact.
From an engineering perspective, building capacity in privacy engineering will relieve operational pains and allow teams to focus on product-building with confidence that their work is compliant.
From a regulatory standpoint, privacy engineers bring deep value to an organization. By interfacing with engineers and Governance, Risk, & Compliance (GRC) experts, they ensure that technical operations align with legal requirements through elegant workflows. They help an organization work smarter, not harder, on privacy.
Here, we discuss six skills—three technical skills and three GRC competencies—that, once acquired by dev team members, will harmonize engineering and compliance needs. Throughout, we refer to engineering teams, but the guidance applies equally to data and product teams.
When an engineering team builds a product, such as an app for online payment processing to support financial transactions, privacy engineers design and build systems to protect the data flowing through that product. Privacy is not the systems’ primary purpose, but it is a key component of the design process, now more than ever. For compliant, trustworthy, and efficient systems, privacy engineering is becoming a core focus for businesses worldwide.
One of the hallmarks of modern privacy ops is the DSR: the data subject request, sometimes called a data subject access request (DSAR). From GDPR to CCPA and beyond, more individuals are being granted rights to request that companies access, update, or erase the personal data they hold on them. While GRC teams might oversee the DSR fulfillment process, engineering teams are the ones with boots on the ground, handling tasks like data discovery and data deletion. Skills required to execute the full suite of DSR types are vital, but we focus on erasure here as an illustrative example.
Building an erasure pipeline requires working knowledge of contemporary data architecture: data warehouse platforms like Snowflake, caches such as Redis, and more. Backend engineers tasked with building these pipelines should be equipped with knowledge of SQL and NoSQL databases so that every instance of the requester’s personal information is correctly identified for erasure.
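As a minimal sketch of that identification step (the table names, column names, and `find_subject_rows` helper are hypothetical, and SQLite stands in for a production warehouse like Snowflake), an erasure pipeline might locate every row tied to a data subject’s email before any erasure strategy runs:

```python
import sqlite3

# Hypothetical map of where personal data lives; in practice this would come
# from a maintained data map or an automated data discovery tool.
PERSONAL_DATA_LOCATIONS = {
    "users": ["email", "full_name", "phone"],
    "orders": ["billing_email", "shipping_address"],
}

def find_subject_rows(conn: sqlite3.Connection, subject_email: str) -> dict:
    """Return {table: [row ids]} for every row tied to the data subject."""
    matches = {}
    for table, columns in PERSONAL_DATA_LOCATIONS.items():
        # Match on identifier columns; a real pipeline would also scan
        # NoSQL collections and cache keys (e.g., Redis SCAN) the same way.
        email_columns = [c for c in columns if "email" in c]
        if not email_columns:
            continue
        where_clause = " OR ".join(f"{col} = ?" for col in email_columns)
        rows = conn.execute(
            f"SELECT id FROM {table} WHERE {where_clause}",
            [subject_email] * len(email_columns),
        ).fetchall()
        if rows:
            matches[table] = [row_id for (row_id,) in rows]
    return matches
```

The point is less the specific query than the inventory behind it: the pipeline can only be as complete as the team’s knowledge of where personal data lives.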
After the requester’s personal information is identified, the appropriate erasure strategy should be applied. Simply writing a null value to the database is not always the prudent move: from a technical standpoint, certain entries must retain specific data types or structures, and a naive erasure could disrupt downstream data operations. Erasure strategies should therefore be informed by existing data operations.
For instance, the most appropriate erasure strategy for certain categories of personal data might be to overwrite them with a fixed string such as “MASKED”.
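As an illustration, here is a minimal, hypothetical masking strategy that preserves each column’s type rather than blindly nulling values; the category-to-strategy mapping is an assumption for the sake of the example, not a prescribed scheme.

```python
from datetime import date

# Hypothetical per-column masking rules. Keeping the original data type
# (string, date, number) avoids breaking downstream consumers that expect
# a particular shape.
MASKING_RULES = {
    "full_name": lambda v: "MASKED",             # fixed-string rewrite
    "email": lambda v: "masked@example.com",     # format-preserving rewrite
    "birth_date": lambda v: date(1900, 1, 1),    # placeholder that is still a date
    "loyalty_points": lambda v: 0,               # numeric placeholder, still an int
    "marketing_opt_in": lambda v: None,          # a nullable flag can simply be nulled
}

def mask_record(record: dict) -> dict:
    """Apply the configured masking strategy to each personal-data field."""
    return {
        column: MASKING_RULES.get(column, lambda v: v)(value)
        for column, value in record.items()
    }

# Example: mask a user row before writing it back to the warehouse.
print(mask_record({
    "id": 42,
    "full_name": "Ada Lovelace",
    "email": "ada@example.com",
    "birth_date": date(1815, 12, 10),
    "loyalty_points": 1250,
    "marketing_opt_in": True,
}))  # the id survives; personal fields are masked in a type-safe way
```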
Usability is a bottleneck on impact in privacy engineering. Beyond developing products that respect users’ data, engineers must evaluate the uptake of those innovations by their intended audiences. In collaboration with product designers, engineers should develop quantitative and qualitative metrics and modify the product accordingly. Depending on the nature of the product, this responsibility can involve A/B testing, questionnaire design, or backend analytics on site performance. Because of this, privacy engineers work on both customer-facing and internal challenges.
It’s important to note: some of the very tactics for improving usability can introduce privacy risk! Make sure you weigh those risks while planning usability testing.
Externally, engineers should identify where end-users are experiencing obstacles in implementing the privacy product. They should understand and root out dark patterns in UX: any design patterns that cause users to act against their own best interests. For instance, obscure language or presentation of privacy controls can prevent users from making an informed choice.
Internally, engineers might collaborate with in-house data scientists to develop a privacy-preserving toolkit for analyzing users’ product usage in aggregate. In consulting with the data scientists, the engineers will adapt the toolkit to be as low-friction as possible and compatible with existing system requirements, without compromising its technical privacy protections.
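One common protection of this kind (an illustrative assumption here, since the toolkit’s exact techniques aren’t specified) is differential-privacy-style noise: aggregate counts are perturbed just enough that analysts can see usage trends without learning anything reliable about an individual user.

```python
import random

def noisy_count(true_count: int, epsilon: float = 1.0) -> float:
    """Add Laplace noise calibrated to a sensitivity of 1, the core mechanism
    behind differentially private counting queries."""
    scale = 1.0 / epsilon  # smaller epsilon => more noise => stronger privacy
    # The difference of two exponential draws is a Laplace-distributed sample.
    noise = random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)
    return true_count + noise

# Example: report how many users enabled a feature without exposing the exact count.
daily_feature_usage = 1284  # hypothetical aggregate pulled from the warehouse
print(round(noisy_count(daily_feature_usage, epsilon=0.5)))
```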
Data is moving faster than ever, in greater volumes than ever, and to more places than ever. At the same time, international data flows are complex and shifting. On the technical side, engineers are responsible for developing workflows in cloud-computing platforms like AWS that abide by international requirements while remaining ready to adapt to amended data-transfer agreements. To do so, they validate against mechanisms like access control lists (ACLs) to ensure that governance models are translated into entity-level access controls.
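A minimal sketch of that translation, using a made-up policy model: a governance rule such as “EU user data may only be processed in approved regions” becomes an entity-level check that workloads evaluate before touching the data.

```python
from dataclasses import dataclass

# Hypothetical governance policy: processing regions permitted for data from
# each jurisdiction. In practice this would be derived from the current
# data-transfer agreements and reviewed with the GRC team.
ALLOWED_REGIONS = {
    "EU": {"eu-west-1", "eu-central-1"},
    "US": {"us-east-1", "us-west-2", "eu-west-1"},
}

@dataclass
class AccessRequest:
    service: str             # e.g., "analytics-batch-job"
    data_jurisdiction: str   # where the data subjects reside
    processing_region: str   # where the workload wants to run

def is_transfer_allowed(request: AccessRequest) -> bool:
    """Entity-level access check derived from the governance model above."""
    allowed = ALLOWED_REGIONS.get(request.data_jurisdiction, set())
    return request.processing_region in allowed

# Example: an analytics job in us-east-1 asking for EU-resident data is denied.
print(is_transfer_allowed(AccessRequest("analytics-batch-job", "EU", "us-east-1")))  # False
print(is_transfer_allowed(AccessRequest("analytics-batch-job", "EU", "eu-west-1")))  # True
```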
Depending on the size and strengths of the GRC team, a GRC specialist may already keep close tabs on the upcoming privacy regulations relevant to the company. However, the privacy-informed engineer plays an indispensable role as a liaison between GRC and engineering teams. They should be familiar with the business requirements of landmark regulations like the European Union’s General Data Protection Regulation (GDPR). Even though the regulations that have followed in GDPR’s wake are not identical, understanding GDPR provides a widely applicable framework for parsing new regulations worldwide.
The concept of extraterritoriality is essential for anyone in privacy to understand. By and large, what matters is not where a business is headquartered but where its end-users reside. Because of this, GDPR applies to companies with EU users, not just companies headquartered in the EU. By monitoring regulations and understanding how best to update technical systems, engineers can reduce friction not only within their own team but also across GRC and product teams. Resources from the International Association of Privacy Professionals (IAPP) offer a great starting point for getting up to speed on the latest state-level and global regulatory news.
While data erasure is an involved technical process, it also requires a GRC sensibility about if, when, and for how long to retain specific data. Global privacy regulations carve out exceptions to erasure, where some data should not be erased in order to meet business requirements. For example, customers’ purchasing information is generally considered personal information, so it would seem to be in scope for an erasure request; however, tax rules generally require such data to persist for a set period before deletion is permitted.
A privacy-informed engineer supports a healthy erasure workflow, one in which the data erased is precisely what the law calls for, by interfacing with the GRC team to understand which business requirements limit the extent of data erasure. As with much of privacy engineering, the engineer must be ready to translate across disciplines in these GRC conversations. They must know, for instance, how to map legal counsel’s guidance on tax compliance onto fields in distributed data systems.
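As a sketch of how that guidance might land in code (the retention windows and record categories below are placeholders for illustration, not legal advice), an erasure job can consult a retention policy before deciding whether a record is masked now or deferred:

```python
from datetime import date, timedelta
from typing import Optional

# Hypothetical retention requirements agreed with GRC and legal counsel.
# Key: data category; value: how long the record must persist after creation.
RETENTION_PERIODS = {
    "transaction_record": timedelta(days=7 * 365),  # e.g., tax-driven retention
    "marketing_profile": timedelta(days=0),         # no retention obligation
}

def erasure_action(category: str, created_on: date, today: Optional[date] = None) -> str:
    """Decide whether a record can be erased now or must be deferred."""
    today = today or date.today()
    hold_until = created_on + RETENTION_PERIODS.get(category, timedelta(days=0))
    return "erase_now" if today >= hold_until else f"defer_until_{hold_until.isoformat()}"

# Example: a recent purchase record is still inside its retention window,
# while a marketing profile can be erased immediately.
print(erasure_action("transaction_record", date(2023, 5, 1)))
print(erasure_action("marketing_profile", date(2023, 5, 1)))
```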
Consent is a cornerstone of modern privacy, and violating it has led to some of the industry’s largest fines. For instance, the largest GDPR fine to be finalized, over $56M against Google, arose from consent violations. The largest GDPR fine to be levied, not yet finalized, also related to consent violations: $877M against Amazon. Engineering teams should have personnel who understand the basic principles of consent in data-processing contexts; for instance, they should know the distinction between opt-in and opt-out approaches to consent. Any system that performs operations requiring users’ consent should request consent in line with the definition in Article 4(11) of the GDPR: “a freely given, specific, informed and unambiguous indication of the data subject’s agreement.”
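A minimal, hypothetical sketch of the opt-in principle: processing proceeds only when an explicit, purpose-specific consent record exists, and the absence of a record defaults to denial. (The data model and function names here are illustrative.)

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    user_id: str
    purpose: str        # e.g., "email_marketing"
    granted: bool       # True only after an affirmative opt-in action
    recorded_at: datetime

# Hypothetical consent store keyed by (user_id, purpose).
consent_store: dict = {}

def record_opt_in(user_id: str, purpose: str) -> None:
    """Store an explicit, purpose-specific opt-in."""
    consent_store[(user_id, purpose)] = ConsentRecord(
        user_id, purpose, granted=True, recorded_at=datetime.now(timezone.utc)
    )

def may_process(user_id: str, purpose: str) -> bool:
    """Opt-in model: no record (or a withdrawn one) means no processing."""
    record = consent_store.get((user_id, purpose))
    return record is not None and record.granted

# Example: marketing emails stay blocked until the user actively opts in.
print(may_process("user-123", "email_marketing"))  # False by default
record_opt_in("user-123", "email_marketing")
print(may_process("user-123", "email_marketing"))  # True after explicit opt-in
```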
Privacy engineering is younger than the related field of security engineering, which has become a core competency for teams shipping enterprise-grade software in recent years. Privacy engineering is on track to follow a similar path, as organizations recognize it as the scalable, sustainable way to build respectful and compliant products. In the years to come, engineering teams will likely see a greater number of specialized training programs in privacy engineering.
In the meantime, building capacity in privacy engineering not only reduces friction in workflows; it can also confer a competitive edge. In other words, privacy engineering isn’t just about avoiding fines: by proactively embedding respect for users’ data into systems today, a company can build a genuinely trustworthy brand that will stand the test of time.
Our team of data privacy devotees would love to show you how Ethyca helps engineers deploy CCPA, GDPR, and LGPD privacy compliance deep into business systems. Let’s chat!