Privacy is fundamentally social. It’s about who we trust with the information that makes us who we are. This article is part of the series “Privacy By Design, By All Of Us,” highlighting a few of the people at Ethyca who are engineering the future of data privacy. Meet Dawn Pattison, Senior Software Engineer.
One of the many highlights of working in privacy is learning how each privacy engineer finds their way to the field. Dawn Pattison brings a wealth of engineering experience, within and beyond software, and a steadfast commitment to building respectful systems.
Dawn engineers privacy tools that enable developers across the globe to build more respectful tech stacks. In doing so, she makes it easier for companies to do right by their end-users in handling personal data. Outside of work, she enjoys camping in East Texas, and she’s taking bluegrass singing lessons. (There’s no shortage of musicians among our engineers. Check out our previous interview with Software Engineer Catherine.)
We spoke with Dawn about her work, its connection to broader topics in privacy, and lessons learned along the way.
How does your day-to-day work build respect into the ways end-users’ data is handled?
On my team, we’re always looking at how we can treat user data more carefully, at every stage from design to implementation to code review. We’ve been building a product that lets users request to see the data an organization has collected about them, or request that such data be deleted. To facilitate this process, the privacy tool needs to know certain details, and we’re very careful about how we handle those details. For example, when software engineers connect their databases to the privacy tool, they might need to supply secrets like a host, port, username, and password. We encrypt that information. Another example: when the privacy tool is handed an email or a phone number to locate a user’s data, we store it only temporarily, in a cache that automatically expires. We’re always trying to be mindful of privacy and security at every stage.
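The auto-expiring cache Dawn describes can be sketched in simplified form. This is an illustrative toy, not Ethyca's actual implementation (a production system would more likely use something like Redis with a TTL); the class name, methods, and TTL value are all hypothetical:

```python
import time


class ExpiringCache:
    """Toy in-memory cache: stored identifiers expire after ttl_seconds."""

    def __init__(self, ttl_seconds: float):
        self.ttl = ttl_seconds
        self._store = {}  # key -> (value, expiry timestamp)

    def set(self, key: str, value: str) -> None:
        # Record the value alongside the moment it should stop existing.
        self._store[key] = (value, time.monotonic() + self.ttl)

    def get(self, key: str):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.monotonic() > expires_at:
            # Identifier has aged out: purge it and report a miss.
            del self._store[key]
            return None
        return value
```

The point of the pattern is that personal identifiers (an email used to locate a user's records, say) are never durably stored by the privacy tool itself: once the request is processed and the TTL elapses, the identifier is gone by default rather than by a cleanup job someone has to remember to run.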
What does Privacy by Design mean to you, in your own words?
Engineers are trained to care about building high-quality products. Privacy by Design means that for a product to be considered high-quality, it must treat user data carefully.
Is there a particular principle of the Privacy by Design framework that speaks to you and your role at Ethyca?
Privacy Embedded into Design—that principle really spoke to me. Before I worked in software, I was in medical device manufacturing. We made dependable products that improved patients’ lives. But we didn’t make those high-quality products by accident. That company had developed very detailed quality control processes, all the way from design, to building machines that manufactured the product, to sampling raw materials. We tried to minimize, in every step of the process, the likelihood that a product would fail the patient in the field.
Similarly, when building high-quality products as software engineers, it’s not enough to say “we care about data privacy and we hope it automatically works its way into our product.” It has to be built into our processes so we’re all aware of it, and we must consider it every step of the way.
Since you’ve started working in privacy, what has surprised you or challenged your assumptions about the field?
I was naive about the costs of bringing an organization into compliance with new privacy laws. It’s one thing to say “Consumers are demanding that their data be treated more respectfully, so these companies just need to do better.” But we need to keep the trade-offs in mind. When I was learning about the CCPA, I read a report estimating that compliance would cost California businesses $55 billion to implement, affecting 75 percent of California companies, and disproportionately small businesses. It’s a matter of realizing that there are not unlimited funds. But that makes what we’re doing at Ethyca that much more important. If we’ve decided that privacy is the problem we want to tackle, we also need to acknowledge that it’s an expensive problem. The fact that our company is leading conversations about free and open source tools that help engineers handle user data better from the beginning—that’s really exciting. It’s a solution that recognizes data privacy is not an easy problem, and not a cheap one either.
You’ve spoken about your background in medical devices, where the stakes are clear: patients’ health and lives. Thinking about privacy more generally, why do you care about respectful systems?
First of all, I think about the problems: your data can be stored forever, and a lot can be inferred about you from seemingly benign information. We might not be able to conceive today of the ways this information could be used in the future. It’s a mistake to say things like “I don’t have anything to hide,” or “This company is just collecting information about my shopping preferences, so who cares?” That’s really shortsighted.
When thinking about respectful systems, a lot of the burden falls on consumers right now. It’s very difficult to tell which organizations are being good stewards of your data. Privacy policies are verbose and difficult to wade through. It’s hard to opt out of your data being collected in the first place. As a consumer, this whole landscape is frustrating to deal with. So as a software engineer, it’s exciting to show up every day at work and build real solutions.
Dawn and the entire team of engineers at Ethyca are building cutting-edge privacy tech. Visit ethyca.com to explore their latest innovations to make meaningful privacy a reality for all users.