Back in 2018, California lawmakers hurried to pass the CCPA, a new regulation about privacy and data compliance. This year, Californian voters faced another privacy-related choice on the ballot. So, why is there a new law on the table so soon after implementation of a similar one? The story behind California’s new data privacy law is about human optimism, philosophical battles over the right to privacy, tech companies determined to operate within grey areas, and of course, plenty of confusing acronyms.
On November 3rd, 2020, California voters voted the CPRA into law. It will go into effect in January 2023 (with a lookback provision starting in 2022) and will have major data compliance implications for all companies that do business with California consumers. To better understand how we got to the CPRA and what it could mean, let’s review a brief history of the last few years of data privacy in California.
The Golden State is the spiritual home of software and the internet. It’s no secret that Silicon Valley took a cavalier attitude towards user privacy in the early days of the web. There was simply no governing body producing and enforcing privacy laws suitable for modern digital networks in the 1990s and early 2000s.
As lawmakers and consumers grew wiser, and as concern mounted over the negative consequences of tech titans’ ability to leverage consumer data, new regulations were proposed to help protect user privacy. But legislative language is notoriously difficult to get right. It requires highly specific legalese to ensure that the intended outcomes are actually achievable.
This helps explain the first major privacy law to go into effect in California: the California Consumer Privacy Act. It came into effect on January 1, 2020, and became legally enforceable on July 1 of that year. Although many bright minds worked on the language, it satisfied nearly no one. Behind this initial legislation lies a contentious journey that neatly illustrates the difficulties and opportunities of passing comprehensive data privacy law in the US.
Alastair Mactaggart is a real estate developer in California who became interested in privacy and data management in 2016. Like many others, he first learned about the issues from a consumer perspective. He was concerned about the ways in which his online behavior was being tracked and monetized without his consent. So, he decided to do something about it.
With little policy experience or knowledge of the tech industry, Mactaggart enlisted some neighbors to help craft language for a new policy about data privacy and data management in the state of California. Some of them, like Mary Stone Ross, happened to be seasoned political veterans and privacy observers. So the makeshift team of policy lobbyists was able to craft an extensive privacy law proposal that, given widespread public support for any privacy initiative, stood a good chance of passing into law.
This effort by Mactaggart alarmed California lawmakers – even those who were pro-privacy. One, state Senator Bob Hertzberg, was particularly alarmed at how difficult it would be to amend Mactaggart’s version of the law. Said Hertzberg in Wired: “The reason we thought it was horrible wasn’t because he didn’t do a lot of good things that were consumer-facing; of course he did. But he put a 70 percent threshold. And in my world, a 70 percent threshold basically gives the other party all the power.”
The California Consumer Privacy Act (CCPA), then, was ultimately Hertzberg and California’s attempt to appease privacy activists…without putting Mactaggart’s proposal on the ballot. It was hurried through the legislature and full of expedient compromises. This can be seen in the way it was received by, well, pretty much everyone.
The CCPA didn’t meet the brief for consumer usability, nor did it provide enough clarity to companies. It placed a disproportionate burden on smaller businesses, which didn’t have the legal know-how to work the law’s loopholes the way some of the world’s biggest tech companies could. Neither side could really get behind the legislation. Plus, there were major questions about enforcement: that burden fell to the California Attorney General’s office, which openly admitted that resource constraints would only allow it to prosecute a small number of CCPA violations per year.
In the end, Bob Hertzberg himself began urging Mactaggart to put an initiative to a public vote. This would, in his view, provide a stronger mandate for a robust privacy law and suitable mechanisms for regulatory enforcement. Proposition 24, aka the California Privacy Rights Act (CPRA), is the result.
With the passage of Proposition 24, Californians have expanded the original CCPA into the CPRA. The change brings a stronger focus on enforcement as well as finer-grained classification of, and requirements for, businesses that process personal data. Does this mean that all the CCPA’s issues have been resolved? Far from it! Many observers (experts included) are still confused about the new legislation and the specific ways it differs from the CCPA.
The 52-page document is written in highly technical language that makes it difficult for everyday citizens to understand. One argument in its favor is that it takes a strong stand for consumer privacy that doesn’t exist anywhere else in the country. On the other hand, it’s not clear how the CPRA will play out in real life. Will it expand consumer privacy protections or diminish them? Even the original drafters of the CCPA, Mactaggart and Mary Stone Ross, disagree on this question.
The passage of the CPRA could set a nationwide precedent for similar data compliance laws. In the absence of a federal privacy law, it could become the de facto privacy standard for the entire United States. But it remains to be seen whether the enforcement mechanism will work as envisioned.
The only thing that’s certain is that the story of US digital privacy law is at the start, not the end. This is the beginning of a years-long conversation about the role of data in business and how to balance consumer privacy with the cost of doing business. Compliance can be tricky at the best of times, but it becomes a lot easier when you automate the core tasks with a product like Ethyca.