Across the tech sector, there’s widespread consensus that a trust deficit threatens to undermine the current business model of quality, ad-supported content. This mistrust exists between data subjects, data controllers, and data processors (to use GDPR parlance). Users don’t trust that the sites they visit are behaving responsibly with their data. In turn, those sites can’t be sure that the infrastructure which allows them to monetize are doing the same.
A recent AdWeek interview with Chetna Bindra, Google Senior Product Manager for User Trust, Privacy, and Transparency, offers fresh insight into how one of the world’s biggest data brokers sees the future of privacy. The interview is full of interesting nuggets. For Bindra, data privacy is both the most significant pain point for tech companies in the coming years and their biggest opportunity. She says: “We need to find a way for users to continue to access ad-supported content on the web while also feeling confident their privacy is protected… If transparency is a pain point, it’s also an opportunity.”
The point Bindra makes is a crucial concern for us at Ethyca as well. We believe the previously lax standards around data privacy were a bug of the internet era, not a feature. Now that legislation like the GDPR and CCPA is coming into effect, companies are compelled to operate at a higher standard of transparency in data management. Though that may be a short-term challenge to implement, we believe it’s ultimately a win for everybody.
Bindra lays out a vision of how an online ecosystem should work when she says: “Users need to feel like they’re getting value [in exchange for their data] and advertisers need to be able to reach people interested in what they have to offer.”
This is an argument I make frequently to data regulation skeptics. The fact remains: current ad targeting practices, especially as large corporations and SMEs increasingly rely on programmatic buys, are nowhere near the platonic ideal of “reaching the motivated consumer when they are likely to purchase.” One of the main reasons is that an unregulated data ecosystem, which allows the buying and selling of second- and third-party data sets without users’ affirmative consent, will never yield targeting models as precise as those built on well-curated, owned, responsibly managed consumer data. The old programming adage GIGO – “Garbage In, Garbage Out” – springs to mind.
So Bindra isn’t being utopian when she speaks this way about the future state of online data privacy; she’s talking about the impact on advertising. The world she describes should be the natural consequence of companies moving away from outdated data management processes and operating to the highest globally compliant standard. SMEs need not feel intimidated by this prospect: in the long run, better data practice will be good for business.
Published from our Privacy Magazine – To read more, visit privacy.dev