Connecticut has just enacted its consumer privacy law. We look at the unique provisions of the legislation and how Privacy-as-Code can help teams future-proof their privacy ops more generally.
On May 10, 2022, Connecticut officially became the fifth U.S. state to pass a comprehensive modern consumer privacy law. Known as the Connecticut Data Privacy Act (CTDPA), this bill will take effect on July 1, 2023—the same day as Colorado’s CPA. To keep track of the growing list of privacy acronyms, bookmark our regularly updated Data Privacy Acronyms List.
In this article, we summarize what stands out in the CTDPA and how it compares with other existing privacy laws. But why stop with a legal analysis? As strong proponents of the power of privacy engineering, we’ll show how an organization can employ open-source Privacy-as-Code tools to future-proof its privacy operations amidst an evolving regulatory landscape.
The CTDPA closely resembles Colorado’s law but also contains elements from California’s and Virginia’s privacy laws. Notably, the law also provides for joint enforcement alongside California and Colorado. In the absence of unified federal privacy standards, we’re seeing groups of states work together to try to harmonize enforceable privacy standards.
Privacy lawyer David Stauss said that “Through joint enforcement, ‘The three Cs’ of state privacy law [California, Colorado, and Connecticut] will be in a unique position to dictate the future of U.S. privacy law, assuming the continuing absence of a federal law.”
This new law applies to entities that conduct business within the state or that target consumers in Connecticut. Moreover, the bill applies to entities that, during the preceding calendar year, either controlled or processed the personal data of at least 100,000 consumers (excluding data processed solely to complete a payment transaction), or controlled or processed the personal data of at least 25,000 consumers while deriving more than 25% of gross revenue from the sale of personal data.
The Connecticut Attorney General’s office, responsible for CTDPA enforcement, has built a reputation in the past decade for its privacy focus. It has been at the forefront of dedicating resources to privacy units, and has led the way in health privacy enforcement.
One measure that stands out in the CTDPA is its limit on the cure period. Privacy laws like Virginia’s CDPA give entities 30 days to address alleged privacy violations after notification by the Attorney General (a “cure period”). The CTDPA’s cure period, by contrast, runs only from July 1, 2023 through December 31, 2024. During that window, businesses will have 60 days to fix any infringements of the CTDPA. After that, it is up to the Attorney General’s office to determine how to proceed against non-compliant companies. This hard deadline will push companies to adopt compliance practices before 2025.
Unlike the CPA, the CTDPA includes a definition of what constitutes biometric data: any photographic, audio, or video data that identifies an individual. These guidelines impose strict limitations on what businesses may do with the biometric consumer data they collect.
The CTDPA is seen as codifying some of the most consumer-friendly opt-out options thus far in U.S. state privacy law. Individuals in Connecticut can opt out of automated profiling for sales, targeted ads, and employment. Universal opt-outs also apply to financial, housing, insurance, education, criminal justice, or healthcare services. Universal opt-outs are required to start by January 1, 2025. An exception to this rule is the ability for companies to collect data to complete transactions.
This law also sets measures for consumers’ rights to delete their data. For applicable data supplied by a source other than the consumer, companies must fulfill one of the following options:

- Retain a record of the deletion request and the minimum data necessary to ensure the consumer’s data remains deleted, using that data for no other purpose; or
- Opt the consumer out of the processing of that personal data for any purpose other than those exempted under the law.
The CTDPA also protects consumers from data brokers and third-party entities that collect their data without consent or permission. Under this new law, if a company has obtained data from a source other than the consumer, it must still comply with a data deletion request and erase any data relating to the requester’s identity, regardless of how the data was obtained.
Similar to Colorado’s and Virginia’s laws, the CTDPA imposes strict regulations on how entities collect the data of minors. It requires parental consent to collect the personal data of a child known to be under 13 years old.
Connecticut goes further than Colorado and Virginia with its child data protection measures.
When it comes to teen privacy, the CTDPA states that companies shall not “process the personal data of a consumer for purposes of targeted advertising, or sell the consumer’s personal data without the consumer’s consent, under circumstances where a controller has actual knowledge, and willfully disregards, that the consumer is at least thirteen years of age but younger than sixteen years of age.”
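As a rough sketch, the teen-privacy provision quoted above can be expressed as a simple age-gating check. The function name and parameters below are illustrative assumptions, not part of any statute or library:

```python
from typing import Optional

def may_target_ads(age: Optional[int], has_consent: bool) -> bool:
    """Sketch of the CTDPA teen-advertising provision: no targeted
    advertising (or sale of personal data) for consumers known to be
    at least 13 but younger than 16 without their consent. Data of
    children under 13 is governed separately by the parental-consent
    rules, so it is out of scope for this check."""
    if age is not None and 13 <= age < 16:
        return has_consent
    # This particular provision does not apply; other rules may still.
    return True
```

In practice, "actual knowledge" and "willful disregard" are legal determinations, so a real system would gate on how age was collected, not just the value.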
As previously mentioned, Connecticut is the fifth U.S. state to pass a modern privacy law in recent years; the patchwork quilt of privacy regulation in the United States keeps expanding. So, in addition to parsing the details of Connecticut’s freshly signed privacy law, it’s worth taking a broader view of how to prepare for upcoming privacy regulations at large. In the context of privacy engineering and a diversifying regulatory landscape, the concept of future-proofing is invaluable.
Future-proofing applies to infrastructure, both digital and physical. In a 2016 paper “The Principles of Future-Proofing: A Broader Understanding of Resiliency in the Historic Built Environment,” the architecture and preservation specialist Brian Rich gives a widely applicable definition of future-proofing.
Future proofing is “the process of anticipating the future and developing methods of minimizing the effects of shocks and stresses due to future events.”
Future-proofing is not some impossible game of trying to predict every future event. Instead, future-proofing involves applying our current knowledge to build infrastructure that can withstand unpredictable changes in the environment.
When we talk about Privacy-as-Code, what we mean is this: developing explicit descriptions of privacy attributes in code environments, to enable tailored PII controls. When clear descriptions of data categories and data uses live alongside the code itself, it becomes feasible to enact specific controls on the data according to today’s needs—and to modify the controls when regulatory circumstances inevitably evolve. Proactive privacy is not a fixed achievement but an ongoing commitment.
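To make this concrete, here is a minimal sketch of what such a descriptive layer might look like, using plain Python rather than any particular tool; the field names and category labels are illustrative assumptions, not an official taxonomy:

```python
from dataclasses import dataclass

# Hypothetical privacy annotations that could live alongside a table
# definition. The category and use labels are illustrative only.
@dataclass(frozen=True)
class FieldAnnotation:
    name: str           # column name
    data_category: str  # what kind of PII, e.g. "user.contact.email"
    data_use: str       # why it is processed, e.g. "advertising"

USERS_TABLE = [
    FieldAnnotation("email", "user.contact.email", "essential.service"),
    FieldAnnotation("inferred_interests", "user.derived", "advertising"),
]

def fields_used_for(use: str) -> list:
    """Return the names of annotated fields processed for a given use."""
    return [f.name for f in USERS_TABLE if f.data_use == use]
```

Because the annotations live with the schema, a new regulatory requirement becomes a query over metadata rather than an archaeology project through application code.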
An example from Connecticut’s new privacy law can help illustrate the importance of future-proofing. We will focus on the law’s requirements for personal data deletion, described in the earlier section. To recap, a company holding personal data gathered from a source other than the consumer must honor a deletion request in one of the following ways:

- Retain a record of the deletion request and the minimum data necessary to ensure the data remains deleted, using that data for no other purpose; or
- Opt the consumer out of the processing of that personal data for any purpose other than those exempted under the law.
Suppose that your company had not previously been held to such a requirement because all of your users live in the eastern United States, and the only other consumer privacy law in the region, Virginia’s CDPA, does not impose these conditions on personal data deletion. However, as part of your preparation for Virginia’s law, you had developed a descriptive layer for your data infrastructure, capturing the what and why of the PII it contains: the state of residence for each individual in your databases, whether each record was user-provided or derived, and the business use for that data. With that description in place, you have the information needed to understand, and therefore control, your company’s processing activities.
With the data context and control established to meet this data deletion requirement, you can then inform your team to take the appropriate action. For instance, you might advise the company to follow the latter option: opt the consumer out of processing that involves this derived personal data. From there, your data engineers could implement a procedure to efficiently yet comprehensively comply with the requirement.
In particular, the procedure might apply to any records associated with a user whose address includes the state “CT,” where that can be determined. Then, if and only if the records are derived, the records should be designated as prohibited from any processing for any use case.
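The procedure just described can be sketched in a few lines. This is a plain-Python illustration under assumed record fields (`state`, `source`, `processing_allowed`), not a real schema or tool:

```python
def restrict_derived_ct_records(records):
    """Mark derived records belonging to Connecticut residents as
    prohibited from processing for any use case.

    Each record is assumed to be a dict with a "state" field (two-letter
    code, where determinable), a "source" field ("derived" or
    "user_provided"), and a "processing_allowed" flag the data pipeline
    checks before any use of the record.
    """
    for rec in records:
        if rec.get("state") == "CT" and rec.get("source") == "derived":
            rec["processing_allowed"] = False
    return records
```

User-provided records and records for residents of other states pass through untouched, which is exactly the scoping the deletion requirement calls for.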
The procedure described in this section takes basic descriptors of personal data—whether it is derived or user-provided, the coarse geographic location of the corresponding individual, and categories of the data’s intended usage—to craft a rule to fit an upcoming privacy law. Put another way, descriptors that are not in themselves new can be used to create efficient procedures to comply with new policy needs. By creating a layer of governable privacy metadata within the data stack, engineering and legal teams can effectively future-proof their privacy ops in the face of ever-evolving regulatory requirements.
Note that a full implementation of the procedure in this section would likely be a hybrid of the CTDPA’s deletion options, and it would also need to abide by the law’s nuanced exemptions for personal data. For instance, protected health information covered by HIPAA is exempt. Privacy-as-Code calls for legal and engineering knowledge working in tandem to build processes that are technically efficient and legally comprehensive.
Regarding the example discussed above, an engineering team using Fides could move with agility in response to the passage of a new state privacy law, updating their subject request orchestration policies (written in the Fides language) to accommodate the bespoke requirements of Connecticut residents, while retaining pre-existing erasure strategies for individuals who reside elsewhere.
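The shape of that change can be illustrated with a residency-keyed strategy table. This is a plain-Python stand-in for policy configuration, and the strategy names are hypothetical, not actual Fides syntax:

```python
# Illustrative mapping from state of residence to erasure strategy.
# Strategy names are made up for this sketch.
ERASURE_STRATEGIES = {
    "CT": "opt_out_of_processing",   # bespoke handling added for the CTDPA
    "VA": "delete_user_provided",    # pre-existing CDPA strategy, unchanged
}
DEFAULT_STRATEGY = "delete_user_provided"

def strategy_for(state: str) -> str:
    """Pick the erasure strategy for a deletion request by residency."""
    return ERASURE_STRATEGIES.get(state, DEFAULT_STRATEGY)
```

Adding a sixth state later would mean one new entry in the table, not a rewrite of the erasure pipeline.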
Through an extensible and open-source taxonomy, Fides enables companies to describe aspects like data categories and data uses directly alongside the code and databases themselves. In doing so, companies can future-proof their tech stacks. Clone the open-source Fides repo and get started with future-proofing.
Connecticut residents can welcome enhanced privacy protections for their digital lives, and in the absence of a federal privacy law, more U.S. residents can expect that state-level privacy rights are coming their way very soon. On the business side, laws like the CTDPA emphasize the need for business privacy solutions that are agile and extensible instead of a “black box.” The Fides privacy engineering platform is a great, free way for engineering and legal teams to start making comprehensive data control a “by design” feature of their infrastructure.