What’s the Difference Between Data Security & Data Privacy?

“Data Privacy” and “Data Security” are two terms that are sometimes used interchangeably, especially by those outside the field of data protection. Within the industry, however, they mean two very different things, and understanding the relationship between them is essential for grasping the complexity of regulatory compliance. This article is a quick primer that illustrates how privacy and security differ and how they work together as building blocks of any compliant data operation.

Data Security vs Data Privacy

In simple terms, security means protecting data against unauthorized access, while privacy is about managing and defining authorized access. Data security is a technical matter that involves building robust defense mechanisms into your digital infrastructure. Data privacy, by contrast, lives largely in the legal and legislative sphere.

One of the most important relationships to note is that data privacy presupposes security. The GDPR doesn’t contain prescriptive instructions for how organizations should fortify their networks, because its privacy provisions can only be honored if the underlying data is secure. If a cybercriminal steals someone’s PII, that person’s privacy rights have evidently been violated.

So, data privacy assumes data security. Does the reverse hold? Does data security include data privacy? No, but organizations often fall into the trap of making this assumption, and in doing so they may skip necessary regulatory compliance steps.

Conclusion

It’s not enough to protect data from outside attacks. Managing and enforcing internal permissions – i.e., managing privacy – is a vital piece of the puzzle for any business to be compliant with the latest data regulation. Internal privacy controls can be complicated and time-consuming in a large company. Something as simple as employees copying files onto personal flash drives can sink a carefully constructed operation. However, the effort to keep data processes watertight is an essential cost of doing business in 2019. Moreover, the cost of failing to invest in both security and privacy can prove disastrous.

An Overview of States Passing Privacy Laws

Any Intro to Civics course teaches that lawmakers exist to enact the will of the people. And since “the people” have recently become very concerned with the security of their data and the privacy of their online activity, it’s perhaps reassuring to see the recent nationwide bloom of state-based digital privacy legislation.

California’s CCPA got the headlines because of the size of the market and the easy comparison to Europe’s GDPR. However, legislators in other states across the country have quietly passed, or are in the late stages of passing, bills that parallel California’s privacy law. In some cases, the measures are even more far-reaching. This article examines recent legislative updates in Nevada, New York, Vermont, South Carolina, and Colorado. It demonstrates that privacy regulation is not confined to the West Coast and is very much a US-wide concern.

DISCLAIMER: It’s important to note that the privacy regulation landscape is rapidly evolving. It’s a dynamic, exciting area, so even though what follows is an accurate synopsis of the state of play in late September 2019, don’t be surprised if this list becomes dated quickly. As always, this article shouldn’t be interpreted as actual legal advice!

Nevada: Senate Bill 220

Nevada has already passed a new piece of privacy legislation, Senate Bill 220. It will go into effect on October 1, 2019, three months before its better-known neighbor enacts the CCPA. Many observers believe Nevada’s law is more onerous: it requires a broader range of businesses to offer consumers an opt-out regarding the sale of their personal information. Because it goes into effect before the CCPA, Senate Bill 220 will be the first law in the U.S. to grant opt-out rights to consumers.

In some respects, Nevada’s bill is a little more lenient than the CCPA; for instance, it doesn’t add new notice requirements for website owners. However, the per-violation fine is $5,000 – twice as high as California’s. So getting privacy wrong within Nevada state lines could prove even more costly to a business.

New York: Stop Hacks and Improve Electronic Security (SHIELD) Act

New York signed the SHIELD Act into law on July 25, 2019, and the bulk of its provisions go into effect on October 23, 2019. The SHIELD Act is more incremental in scope than the other laws discussed here. It doesn’t contain any language around opt-out rights, and it’s less concerned with day-to-day online activities. Instead, it focuses on defining and setting processes around actual data breach events.

To this end, the SHIELD Act expands the scope of covered information (to include biometric information) and the scope of possible breach scenarios. It also updates the procedures that companies must follow in the event of a data breach. Lastly, the SHIELD Act creates data security requirements that scale according to the size of the business. This part of the Act goes into effect on March 21, 2020.

It’s also worth noting the broader New York Privacy Act (NYPA), whose main provisions were considered even more aggressive than the CCPA’s. That bill is currently dead or on hold, having met heavy industry lobbying.

Vermont

Vermont became the first state in the union to regulate “data brokers,” with legislation that came into effect on January 1, 2019. Vermont’s law has a comparatively narrow application. In its case, “data broker” denotes “a business or unit/s of a business, separately or together, that knowingly collects and sells or licenses to third parties the brokered personal information of a consumer with whom the business does not have a direct relationship.” This direct-relationship provision means that if a business is, for example, selling directly to consumers online, it isn’t bound by the constraints of this law.

That said, once an entity is considered a data broker, there are quite rigorous processes that must be followed. Data brokers must register annually with the Vermont Secretary of State for a fee of $100 and provide a substantial amount of information to the state regarding the robustness and security of their data operations. Failure to do so can result in fines of up to $10,000 per year.

South Carolina

South Carolina has also joined the cohort of states taking data protection into their own hands, with a law that came into effect on January 1, 2019. The South Carolina Insurance Data Security Act is focused on the insurance sector and seeks to establish standards and processes that insurers – deemed licensees – must follow in the event of a cybersecurity breach.

Licensees are now legally required to formally document a data security program. Built upon a thorough audit and risk assessment of their operation, the program must cover risk management, cybersecurity event investigation and reporting, notification, and ongoing compliance certification.

Colorado

Lastly, we come to Colorado, which was the very first state to put a signature modern digital privacy law into effect. HB 18-1128 requires organizations to put controls in place for managing PII (including biometric data). The required controls fall under these broad areas:

  • The storage of PII
  • The destruction of physical and electronic materials that contain PII
  • Investigation and notification in the event of data breaches
  • Liaising with the Colorado attorney general in the investigation and reporting in certain data breach circumstances

Conclusion

This brief overview shows that data privacy isn’t just a concern for businesses operating in California, despite what the news headlines would lead one to believe. Data privacy should be treated as a United States-wide concern for any business: the trend is very visibly towards state-by-state regulation, broadly consistent in theme but with essential variations in focus and scope. The complexity will only increase as more states get up to speed on the topic. It’s worth noting that while state-by-state legislation is the short-term trend, conversations about a federal law have already started, partly to avoid an ever more complex patchwork of state regulations.

Published from our Privacy Magazine – To read more, visit privacy.dev

Governments & Privacy

When the words “government” and “privacy” are put side by side, the knee-jerk reaction is usually negative. Since the days of Orwell, governments have been poking their noses into citizens’ business, and history suggests the association is not without merit.

Protectors of Privacy Rights

In the last decade, whistleblowers like Edward Snowden have shown that the communication boom of the internet era was accompanied by an increase in government monitoring and privacy abuses by the likes of the NSA, the Department of Homeland Security, and other bureaus. A charitable explanation of these practices is that, like many during the era, these actors didn’t fully grasp the cost and legal implications of the shiny new toys they could access. The less charitable explanation is that they did grasp them, but didn’t care enough to stop.

Nevertheless, the fact remains that government institutions are the most important protectors of the digital privacy rights of individual citizens. Businesses must play by the rules that governments make, and digital privacy has become a critical governmental concern in recent years, directly reflecting the concerns of the general populace.

A high-profile case in point was Mark Zuckerberg’s congressional testimony in April 2018. Zuckerberg was called in front of Congress to speak on his company’s questionable data practices, particularly relating to the 2016 presidential election. The hearings made two things clear: first, there was a newfound abundance of concern and regulatory intention from the elected officials questioning Mr. Zuckerberg. Second, there was a striking lack of understanding or technical know-how from those same officials. The majority of these legislators are not digital natives, and even if they were, understanding the finer points of digital privacy in this day and age requires time and attention to detail that no legislator could realistically afford to spend.

Future-proof Data

At Ethyca, we accept that, warts and all, governments are the chief protectors of digital privacy. However, in the fast-moving technology sector, they will always be playing catch-up. For SMEs, particularly those that aren’t digital-first, this creates a nightmare scenario of repeated, costly infrastructure overhauls. A one-time, future-proof data infrastructure upgrade is an investment that, over the long term, can prove very shrewd indeed.

How Online Experience Varies by Purchasing Power

When people discuss issues with data privacy, economic class is rarely part of the conversation. Even though the internet has been a markedly business-driven project for some years now, the old perception endures that URL life isn’t marked by the same dividing lines that mark IRL society. However, this is false, and the realization that data privacy is inextricably tied to economic status is becoming more widely accepted.

Predatory Advertising

As the old technology adage goes: when the product is free, you are the product. Nowhere is this truer than online. Those with less disposable income are prone to having their data leveraged in a more aggressive and potentially predatory fashion than those who are affluent. Under previously lax data regulation, the robust flow of third-party data meant that advertisers could know with near-certainty the sort of online users that might be vulnerable to risky purchase propositions. In other words, they could target and exploit weak consumers with impunity.

A recent New Republic article highlighted some of the industries engaged in predatory online advertising practices. Among the culprits are bookmakers, payday loan companies, and for-profit colleges. It cites author Cathy O’Neil’s claim in her book Weapons of Math Destruction that “a potential student’s first click on a for-profit college website only comes after a vast industrial process has laid the groundwork.”

Advertisers can use anything from Google search history to educational questionnaire data, and this data is used to target individuals at their moment of peak susceptibility. It’s not that advertisers couldn’t use these techniques to target more affluent consumers; it’s that more affluent consumers are less driven to make such risky purchases, which are often born of economic desperation. Furthermore, poorer consumers are more likely to have their information washing around ad-targeting databases, because they’re more likely to fork over data for free access. The net result is, in the words of Michael Fertik, “the rich see a different internet than the poor.”

Higher Standards of Privacy

Through this lens, one begins to understand the impact of recent and forthcoming data regulation. Its effect is not a flat line across classes: it should disproportionately decrease the vulnerability of poorer online consumers, precisely because they are the most vulnerable to exploitation in the first place. Governments will continue to increase control over the use of data, and companies will be less and less able to license third-party data without consumers’ knowledge. Combine both of those with increased penalties for data processors that violate their rights, and consumers will be less susceptible to predatory advertising and more in control of the data they hand to companies.

Of course, no one assumes that new data regulation will magically turn profit-seeking enterprises into virtuous pursuers of the highest common good. However, we at Ethyca believe that organizations showing commitment to a higher standard of privacy protection will be rewarded in the long run by increasingly data-savvy consumers. With this in mind, beyond legally-required data practices, we recommend that companies make an effort to spell out all the data processing activities that they undertake on owned properties – to actively educate, in other words. Here at Ethyca, we settled on a “Nutrition Table”-style visualization that we think is crisp and instructive. Got a better idea to keep users informed? Feel free to describe in the comments! 

A Framework for Privacy Risk Self-assessment

With the recent raft of worldwide privacy legislation and much more to come, organizations of all shapes and sizes are being forced to evolve the way they do business. SMEs that can’t bring their operations into compliance with the GDPR, CCPA, and other data privacy laws worldwide will be at a significant competitive disadvantage, and may even find that continued non-compliant operation is simply unsustainable.

In this “adapt or die” scenario, the essential first step to getting compliant is for SMEs to perform a rigorous self-assessment of their present-state data operation.

There are three basic formats to self-assessment:

  1. Business units can analyze their practices.
  2. Different groups within the organization can review and analyze each other’s practices.
  3. A single appointed party can assess each unit in the business.

At Ethyca, we believe in empowering a Data Protection Officer to be a real focal point for all data-related business operations. So if scale permits, we recommend delegating full responsibility for the exercise to a DPO. Of course, each organization’s privacy self-assessment will be inherently different. However, the following aims to provide a framework that will serve as an excellent starting point for any business looking to evaluate its path to data privacy compliance: 

First: Plan the Objective of the Assessment

Is your organization trying to determine whether existing policies ensure regulatory compliance? Deciding the specifics of what to assess is a critical first step. 

Second: Conduct a Personal Information Inventory Check Across All Business Units 

This involves answering the following questions:

  • What personal information does the business unit collect?
  • How do you collect personal information and in which situations?
  • Why do you collect personal information?
  • Who in the company uses personal information?
  • Who has access to it?
  • Where and how do you store personal information?
  • What methods are used to ensure it is secure?
  • Is it disclosed outside the company? If so, to whom and why is it disclosed?
  • How long is the personal information kept, and when and how is it disposed of?
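As an illustration, the answers to the questions above could be captured in a simple structured record, one per business unit and data type. The sketch below uses a Python dataclass with purely illustrative field names (this is not any standard inventory schema):

```python
from dataclasses import dataclass, field

# Hypothetical schema for one inventory entry; field names mirror the
# questions above and are illustrative only.
@dataclass
class PIInventoryEntry:
    business_unit: str
    data_collected: list        # what personal information is collected
    collection_methods: list    # how, and in which situations, it is collected
    purpose: str                # why it is collected
    internal_users: list        # who in the company uses it / has access
    storage_location: str       # where and how it is stored
    security_measures: list     # methods used to keep it secure
    external_disclosures: list = field(default_factory=list)  # to whom and why
    retention_period_days: int = 0  # how long it is kept before disposal

# Example entry for a marketing unit:
entry = PIInventoryEntry(
    business_unit="Marketing",
    data_collected=["email address", "purchase history"],
    collection_methods=["newsletter signup form"],
    purpose="email campaigns",
    internal_users=["marketing team"],
    storage_location="CRM database (encrypted at rest)",
    security_measures=["role-based access control", "TLS in transit"],
    retention_period_days=730,
)
```

A spreadsheet works just as well; the point is that every question above becomes a required field, so gaps in the inventory are immediately visible.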

Only by answering these questions can businesses understand the work needed to bring themselves into a state of regulatory compliance. The DPO should cross-check these answers against provisions in the GDPR, CCPA, and other relevant pieces of regulation, cooperating actively with internal or retained legal counsel proficient in privacy law. The exercise should result in a set of tasks or processes to accomplish in order to reach the desired level of privacy compliance.

Last: Review Past Privacy Complaints 

Finally, we recommend reviewing privacy complaints as part of a privacy self-assessment, especially those that have arisen in the recent past; three years is a sufficient window. This will give you insight into where potential privacy pain points exist between your business and the consumer, so you can pay extra attention to those areas as you revamp them to be regulation-compliant. And if your organization doesn’t keep logs of such complaints, congratulations! You’ve uncovered another process that needs revamping to survive in the new competitive landscape!

Ethics & Trust in Tech: Thought Leadership

Across the tech sector, there’s widespread consensus that a trust deficit threatens to undermine the current business model of quality, ad-supported content. This mistrust exists between data subjects, data controllers, and data processors (to use GDPR parlance). Users don’t trust that the sites they visit are behaving responsibly with their data. In turn, those sites can’t be sure that the infrastructure which allows them to monetize is doing the same.

A Pain Point / An Opportunity

A recent AdWeek interview with Chetna Bindra, Google Senior Product Manager for User Trust, Privacy, and Transparency, gives fresh insight into how one of the world’s biggest data brokers sees the future of privacy. Bindra’s interview is chock-full of interesting nuggets. To her, data privacy is both the most significant pain point and the biggest opportunity for tech companies in the coming years. She says: “We need to find a way for users to continue to access ad-supported content on the web while also feeling confident their privacy is protected… If transparency is a pain point, it’s also an opportunity.”

The point Bindra makes is a crucial concern for us here at Ethyca as well. We believe that previously lax standards around data privacy were a bug, not a feature, of the internet era. Now that legislation like the GDPR and CCPA is coming into effect, companies are compelled to operate at a higher standard of transparency around data management. Though it may be a short-term challenge to implement, we believe that’s ultimately a win for everybody.

Bindra lays out a vision of how an online ecosystem should work when she says: “Users need to feel like they’re getting value [in exchange for their data] and advertisers need to be able to reach people interested in what they have to offer.”

From Outdated Process to High Standards

This is an argument I frequently make to data regulation skeptics. The fact remains that current ad targeting practice, particularly as large corporations and SMEs increasingly rely on programmatic buys, isn’t anywhere near the platonic ideal of “reaching the motivated consumer when they are likely to purchase.” One of the main reasons is that a non-regulated data ecosystem, which allows the buying and selling of second- and third-party data sets without users’ affirmative consent, will never yield targeting models as precise as those built from well-curated, owned, responsibly managed consumer data. The old programming adage GIGO – “Garbage In, Garbage Out” – springs to mind.

So Bindra isn’t being utopian when she speaks this way about the future state of online data privacy; she’s talking about the impact on advertising. The world she describes should be a natural consequence of companies moving from outdated processes of data management to the highest globally compliant standard. There’s no need for SMEs to feel intimidated by this prospect. It should be clear that, in the long run, better data practice will be good for business.

How To Assess Vendors For Data Privacy Compliance

When small-to-medium enterprise (SME) team members consider how the business landscape is changing in response to increased data privacy regulation, the procurement process is not usually high on their list of answers. However, SMEs focusing too narrowly on in-house practices miss a key point. Both the GDPR and CCPA place new responsibilities on data controllers – that is, the company or other body that determines the purpose and means of personal data processing. Controllers need to ensure that all third-party vendors who touch their data are behaving in a compliant manner.

In short, the controller continues to hold responsibility for compliance, even when outsourcing processing duties. In-house compliance alone will not suffice. It’s now incumbent on SMEs to ensure that the vendors they work with also adhere to worldwide privacy standards.

Furthermore, the auditing process should optimally take place upfront, at the procurement stage. Contracts signed without the requisite due diligence can be difficult to back out of later, especially if a third-party vendor is revealed to be operating in a non-compliant fashion. Businesses with deep existing ties to third-party vendors may not be able to start this audit process at the procurement stage, but experts highly recommend that existing relationships be revisited and assessed from a compliance perspective.

With all that said, here are some of the questions that all SMEs should be asking their partners, whether it be during procurement due diligence or in the revisiting of an existing relationship:

First: Does the vendor have a Data Protection Officer?  

Under the GDPR, DPOs are now legally required for companies processing large amounts of data. It’s almost a certainty that vendors who specialize in data processing infrastructure are operating at a scale that necessitates a DPO. Failure to cover this basic compliance measure should be a disqualifying red flag in any SME’s procurement process.

Second: How often are the vendor’s policies for storing and processing data on behalf of partners reviewed and updated? 

Data compliance is rapidly changing and continually evolving. A telltale sign that a vendor lacks data privacy rigor is a lack of process for regular policy updates. This field is the opposite of “set it and forget it.” SMEs should be on the lookout for this when auditing vendors for suitability.  

Third: Does the vendor use sub-processors for the work they do on your behalf? 

If so, what measures have they taken to ensure those entities operate in a compliant fashion? The data privacy chain extends to every processor that operates under the data controller umbrella, including “partners of partners.” If a vendor relies on others to help do its work, it should be able to demonstrate those partners’ compliance.

Fourth: Does the vendor have tools in place to rapidly identify and communicate a data breach? 

Under the auspices of GDPR and CCPA, data controllers now have a strict obligation to respond to data breaches concerning their data subjects, but if third-party vendors are slow to recognize and report a violation, controllers may have no chance of handling data breaches in a compliant fashion. Thus, reaction and response time is a crucial concern when evaluating a partner for suitability. 

Last: What happens to data subjects’ information at the end of the partnership? 

Without a clear-cut process for erasing subject data in a compliant fashion, a data controller may be stung by vendor negligence even after the business relationship has ceased to exist. For this reason, it’s essential to have data sunsetting processes built into third-party agreements upfront. Otherwise, controllers have no legal recourse if vendors mistreat their data after completion of the contract.

How Does Data Privacy Affect My Job?

In technology, change is constant. Professionals working in tech are called on to integrate new processes and ways of thinking to stay abreast of their field. A case in point is data privacy.

If you entered the workforce a decade ago in any number of tech-related tracks, privacy and processes to protect users were topics of passing interest. Today, the emergence of the GDPR, CCPA, and other landmark pieces of legislation has heightened data privacy concerns, making privacy a pivotal part of the development space and beyond.

This article provides a quick-hit synopsis of how the renewed focus on user data privacy impacts different roles in technology organizations in jurisdictions around the world.

DevOps

Teams that stay compliant incorporate privacy considerations into the development process while simultaneously balancing ongoing pressures for speed and agility. 

The SANS Institute suggests several best practices that DevOps teams can adopt to continue working efficiently while staying compliant. Here are the most crucial:

  • Streamline access control. Ensure that only authorized users can access sensitive information. Session management tools like tokens and timeouts should be used to protect against unwanted access.
  • Error handling & logging. Store data logs securely and track all administrative activity, as well as all inbound and outbound data processing activities.
  • Practice continuous integration. Build authentication, password management, and other security features into code. In addition, incorporate automated security scanning into the delivery process.

UX

The fundamental principle that has emerged in the UX space is “privacy by design.” 

In the 1990s, Dr. Ann Cavoukian developed seven principles for embedding data privacy features in the very fabric of a software product. The GDPR’s framers regarded Dr. Cavoukian’s work so highly that they made “privacy by design” a foundational tenet of their legislation.

Listed below are the seven principles of privacy by design, which UX professionals must now incorporate into their work.

  • Proactive, not reactive: preventative, not remedial
  • Privacy as the default setting
  • Privacy embedded in the design
  • Full functionality: positive-sum, not zero-sum
  • End-to-end security: full lifecycle protection
  • Visibility and transparency: keep it open
  • Respect for user privacy: keep it user-centric

Product

Product managers are more responsible than most for ensuring their organization heeds new privacy regulation. Above all, they are responsible for product quality, and if a product is running in a non-compliant way, it’s undoubtedly a defective product.

Fortunately, product managers have resources across the organization to ensure they are staying up-to-date with privacy reform. In her guide to GDPR Mastery for Product Managers, Karen Cohen runs through a set of clearly defined organizational processes that should be employed to protect from privacy violations:

  • Work closely with legal teams. It’s their responsibility to understand the regulations and how they might impact your product. It’s the product manager’s job to translate their opinions into actionable steps for different stakeholders in the business.
  • Research the domain and the competition. Building domain knowledge is essential, as is competitor research. How are other businesses in your sector handling information access requests? What do their opt-ins and opt-outs look like on-site? These are breadcrumbs your company can follow on a path to privacy success.
  • Establish clear ownership. If you have a big complicated product, a single person can’t have granular privacy oversight throughout the system. That’s why product managers need to establish clear roles and areas of ownership, as well as a structure of command to help support the activities of the GDPR-mandated Data Protection Officer. Building this company infrastructure from scratch is undoubtedly a challenge, but in the long-term, this purposeful delegation will beat the ad-hoc process every time. Above all, it makes your business less vulnerable to breaches and violations.

What is the CCPA? A Guide to California Privacy Law

Introduction: What is the CCPA?

The California Consumer Privacy Act will come into effect on January 1, 2020, and it may have a significant impact on your business. 

California is the crown jewel of the United States economy. If it were a standalone country, its $2.7 trillion GDP would be the fifth-largest in the world, sitting ahead of the United Kingdom. Combined with the state’s status as an incubator for tech innovation and consumer culture, this gives California outsized importance for all kinds of businesses operating at local, national, and multinational levels.

The CCPA forces enterprises above a particular scale to contend with a new set of privacy obligations. Other states will soon follow suit with similar legislative pieces of their own; California has long been a bellwether for US-wide tech legislation. 

This guide examines the CCPA piece by piece, analyzing business impact, with particular attention given to the consequences for Small-to-Medium Enterprises’ (SMEs’) data management, systems, and practices. 

The conclusion should make clear that the CCPA is nothing to fear for management and development teams. Teams that are proactive and thoughtful in adapting to CCPA prescriptions will succeed in achieving compliance. For those that don’t take the appropriate amount of care, the consequences can be severe.

Getting Started: What is the Scope of CCPA?

Reading through the CCPA is quite a different exercise from reading through the GDPR, the other major piece of consumer data protection legislation to emerge in recent times. The language of the GDPR is clear and its structure is logical. By contrast, the language of the CCPA is dense “legalese,” and the structure of the Act skips from one area to another without a consistent thread. 

This is because the CCPA is a series of builds on, and amendments to, previously existing pieces of legislation, whereas the GDPR was an attempt to craft a comprehensive data protection policy from scratch. The upshot is that it’s most sensible to analyze the CCPA under topic groupings rather than from top to bottom. The first essential topic to consider is scope: to whom does the CCPA apply? There’s a host of ways businesses can become subject to CCPA requirements. 

How to Determine If your Business Qualifies

Regardless of the amount of data you collect, if you have gross revenue over $25 million, the CCPA applies to you. If you’re not operating at that scale but still collect, buy, or sell the personal information of over 50,000 people, households, or devices per year, the CCPA also applies to you. And if a business doesn’t process that amount of personal information but still earns more than half of its yearly revenue (no matter the number) from selling consumers’ data, the CCPA is applicable. Of course, your company must also have a business presence in the state of California, because that’s as far as the legislation’s power extends.
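The three thresholds above can be expressed as a simple decision function. This is an illustrative sketch, not a legal determination: the function and parameter names are invented, and only the dollar and record figures come from the Act.

```python
def ccpa_applies(gross_revenue_usd: float,
                 records_per_year: int,
                 revenue_share_from_selling_data: float,
                 does_business_in_california: bool) -> bool:
    """Return True if any of the CCPA's three scope thresholds is met."""
    if not does_business_in_california:
        return False  # the Act's reach ends at a California business presence
    return (gross_revenue_usd > 25_000_000               # revenue threshold
            or records_per_year >= 50_000                # people/households/devices
            or revenue_share_from_selling_data > 0.5)    # majority of revenue from data sales
```

Note the thresholds are joined by `or`: meeting any single one of them, alongside a California presence, is enough to bring a business into scope.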

Personal Data

“Personal Data” is the second scope-related question in the CCPA. Whereas other pieces of data legislation take an umbrella-view of defining what constitutes personal data, the CCPA attempts to spell out in more explicit detail the types of information that count. The list here is extensive and worth comprehensive review, but a pivotal point to realize is that the CCPA covers information that links to households as well as individuals.

In effect, this means the CCPA protects certain information that other pieces of legislation would not, because it can’t be associated with a single individual. For example, TV viewing records or purchase behavior data not linked to an individual are considered personal data under the CCPA because they are linked to a household.

Digging Deeper: What are the Intentions of the CCPA?

Once you address the question of scope, it’s possible to begin examining the intentions of the California Consumer Privacy Act and, at macro-level, the measures it takes to achieve those intentions. 

Section 2 of the Act explicitly outlines the aim of this piece of legislation – empowering citizens of California to:

  • Know what personal data organizations are collecting about them
  • Know if/when personal information is sold or disclosed, and to whom
  • Say “no” to the sale of their data
  • Access the personal data that an organization has collected about them
  • Obtain equal service and price from companies collecting personal data

Right away, a development team or project manager tasked with architecting their SME’s data infrastructure should see that these aims, if adequately supported by the legislation, carry far-reaching consequences for how businesses build their data management systems.

The old notion of siloing company data is dead. Data is no longer a static mass; it is information continuously added and subtracted through interaction with company employees and product consumers. Businesses will face real challenges in becoming and staying CCPA-compliant. They need flexibility and agility built into the architecture, from collection and storage to retrieval and analysis.

Business Obligations: CCPA’s Impact on the Data Landscape

Given the objectives stated, what are the concrete steps businesses must take to avoid running afoul of the CCPA? Here’s a summary of the most important:

Companies must be able to disclose to a requesting consumer the categories and specific pieces of personal information that the business has collected.

Businesses must have both a clearly-signposted method for consumers to lodge a request for information and a streamlined system for disaggregating an individual’s data from their database. They need to deliver it in a timely fashion too. It’s worth noting that a business is obligated to provide this information up to two times in twelve months. Though it may seem self-evident, this means a system is needed to track information requests so that one individual doesn’t overly burden the system.

Consider that even some businesses operating at scale don’t possess a system for request intake nor keep a single-location record of information requests. In this scenario, it’s entirely feasible that a single individual could take up far more valuable staff time than legally necessary through repeated information requests.

These are easy solves when considered upfront but can be challenging if retrofitted only when the problem becomes evident. An additional requirement is that the delivery of this data must be free and in reasonably consumable form. Businesses can’t charge a consumer to receive a record of their data, and they also can’t present that data in some arcane file format that the consumer will have difficulty decoding.
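A minimal sketch of such an intake system might look like the following: it enforces the two-requests-per-twelve-months rule and returns data in a widely readable format (JSON). The class and method names are illustrative; a production system would persist the request log rather than hold it in memory.

```python
import json
from datetime import datetime, timedelta

class RequestTracker:
    """Track consumer information requests and cap them at two per 12 months."""

    def __init__(self):
        self._log = {}  # consumer_id -> list of request timestamps

    def handle_request(self, consumer_id, data, now=None):
        now = now or datetime.utcnow()
        window_start = now - timedelta(days=365)
        # keep only requests that fall inside the rolling 12-month window
        recent = [t for t in self._log.get(consumer_id, []) if t >= window_start]
        if len(recent) >= 2:
            return None  # already served twice in the last twelve months
        self._log[consumer_id] = recent + [now]
        # deliver free of charge, in a consumable format
        return json.dumps(data)
```

A third request inside the window returns `None`, so staff time isn’t consumed beyond what the Act requires.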

All in all, this requirement could lead to significant business impact for companies that are not already up to speed on current best practices for data management.

At or before the point of data collection, businesses are required to inform consumers of the categories of personal information they intend to collect and the purpose for using specific types of personal information.

For any SME operating on “highest common denominator” principles, this will be no surprise. After all, this is already a requirement under GDPR law. It’s reasonable to expect that as the world follows in the footsteps of the CCPA and GDPR, upfront disclosure of data collection will become a standard legal procedure. 

In practice, this can have a range of implications for a company’s customer experience on- and offline, such as:

  • A pop-up box for consent to cookies
  • An opt-in screen before a user enters the purchase funnel 
  • Changes to the purchase experience in physical store locations that are passively collecting data on in-store customer behavior

Remember that under CCPA law, some forms of personal information that can’t be tied directly to an individual are still protected. To fully understand how this CCPA requirement could change the way a company does business, an in-depth audit will often be necessary.

Lastly, businesses and marketers collecting information on consumers need to be able to wipe out that information entirely upon request.

Not only that but in many cases, the business must be able to direct related service providers who utilize this info to wipe it also. It does not matter whether the data has been sold as part of a second-party set or shared as part of a service-delivery process; the requirement stands. 

This obligation demonstrates that businesses operating under CCPA jurisdiction have no choice but to end antiquated “data-silo” operations, especially those that made ongoing alterations to a data store difficult and time-consuming. Businesses will also have to ensure their partners and data clients have this same capability, as they can be held liable for a partner’s failure to remove records from a database.
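A deletion routine that honors this requirement has to cascade beyond the business’s own store. Here’s a sketch under assumed interfaces: the `ServiceProvider` protocol and `delete_consumer` method are hypothetical stand-ins for whatever deletion API each partner actually exposes.

```python
from typing import Protocol

class ServiceProvider(Protocol):
    """Hypothetical interface a data partner exposes for deletion requests."""
    def delete_consumer(self, consumer_id: str) -> bool: ...

def delete_everywhere(consumer_id: str, local_store: dict, providers: list) -> bool:
    """Remove a consumer's records locally and direct providers to do the same."""
    local_store.pop(consumer_id, None)
    # the business can be held liable if a partner fails to delete,
    # so collect and verify each provider's confirmation
    return all(p.delete_consumer(consumer_id) for p in providers)
```

The return value matters: a `False` from any partner should trigger follow-up, since the deletion obligation isn’t satisfied until every copy is gone.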

What are the Costs for Violating CCPA?

Of course, the CCPA couldn’t hope to be a compelling piece of privacy legislation without effective enforcement mechanisms to keep companies honest. What are the consequences for organizations that run afoul of its prescriptions? To put it straight, they can add up quickly.

Penalties

A person, business, or service provider found in violation of the CCPA is subject to a court injunction. They are also liable for a civil penalty of up to $2,500 per unintentional violation and $7,500 per intentional violation. 

The critical thing to remember is that for companies dealing with large amounts of personal data, violations likely won’t stop at tens, hundreds, or even thousands of affected customers. A systemic violation of CCPA provisions can quickly put a six-digit multiplier on the $2,500 or $7,500 fine. 
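The multiplier effect is easy to illustrate with arithmetic. The per-violation figures come from the Act; the record count is a hypothetical scenario.

```python
# Hypothetical exposure: a systemic violation affecting 50,000 consumer records.
records = 50_000
fine_unintentional = 2_500 * records   # unintentional: $125,000,000
fine_intentional = 7_500 * records     # intentional:   $375,000,000
```

Even at the lower, unintentional tier, a violation touching 50,000 records runs to nine figures.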

Civil Suit

For many SMEs, this could prove a high enough number to sink them entirely. And that’s not all: apart from civil penalties, consumers can bring private action for up to $750 per incident, plus the value of personal damages. A business that simply fails to notify consumers it’s collecting web data can quickly find itself looking down the barrel of a damaging class-action civil suit.

In essence, the CCPA is a piece of legislation that takes data protection seriously. It has the enforcement clout to make businesses take it seriously too. The bill becomes the law of the land on January 1, 2020. Companies with a footprint in California have approximately six months as of the time of writing to ensure they’re not at risk for severe financial penalties. 

First Steps: How Should Teams Prepare for the New Data Landscape?

Time to take action! What are the steps that teams should take now? Let’s examine some of the critical steps any business can take to prepare.

Conduct a Review of Existing Data Architecture

If you’re a typical SME preparing for what lies ahead, your first step is to comprehensively review data operations. Prepare data maps, inventories, and other records that catalog all points of collection, storage, retrieval, and exploitation of personal information relating to California-based consumers. Only through this exercise can a business accurately plan for the changes needed to be CCPA-compliant.

Consider more than California-only web/mobile/business models

For companies operating at a global scale, we recommend a highest-common-denominator approach: a full data architecture redesign. It future-proofs operations, saving time and money through a decreased need for bespoke solutions based on territory. 

For companies with a smaller footprint, however, it may be worthwhile to examine building California-specific consumer experiences. Your SME can decide on the best business option by following the previously mentioned systematic audit of current data operations.

Ensure there are available online and offline methods for submitting Data Access Requests

The CCPA requires companies to consider their relationship with consumers offline as well as online. The Act mandates a toll-free number dedicated to submitting data access requests, so businesses must ensure their intake system isn’t online-only.

Provide a Clear “Do Not Sell My Personal Information” Option on web properties

This is another non-negotiable requirement of the CCPA. California citizens, or those authorized to represent them, must be able to designate that their data is not for sale, and a user who selects this option can’t suffer a diminished experience as a result. In contrast, the GDPR does allow companies to alter their experience if customers don’t want their data monetized.

Plan New Systems That Can Perform The Following Functions

  • Verify the identity of individuals who request data access or data deletion
  • Respond to requests for data access or deletion within 45 days
  • Determine the age of a California resident. Companies must obtain parental consent for data collection for users under 13. If they don’t have a way to determine the user’s age, they can be held liable for disregarding this obligation.
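Two of the functions above reduce to simple, testable logic: computing the 45-day response deadline and gating collection on parental consent. This is a sketch; identity verification is omitted because it requires real-world checks no code snippet can stand in for. The conservative treatment of an unknown age is an assumption motivated by the liability noted above.

```python
from datetime import date, timedelta

def response_deadline(received: date) -> date:
    """Requests for access or deletion must be answered within 45 days."""
    return received + timedelta(days=45)

def needs_parental_consent(age) -> bool:
    """Parental consent is required for users under 13.

    If age is unknown (None), treat consent as required rather than
    risk liability for disregarding the obligation.
    """
    return age is None or age < 13
```

A request intake system would stamp each request with `response_deadline` on arrival and surface overdue ones to staff.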

Conclusion

If this seems like a significant amount of work, it’s because it is.

Since its inception, the Internet has been a relatively lawless environment regarding consumer protection. Now the days of the Internet as a Wild West are genuinely drawing to a close. Just like in the physical world, businesses that wish to profit must follow the rules or face the consequences. Luckily with the proper foresight and attention, CCPA compliance can be a straightforward exercise that doesn’t break the balance sheet.

Published from our Privacy Magazine – To read more, visit Privacy.dev

GDPR Fully Explained


With the European Union’s passage of the General Data Protection Regulation (GDPR), the practice of data regulation moved out of its infancy. The GDPR is the first wide-reaching piece of unified data and privacy policy in the world, and it sets the template for a plethora of regulations set to follow in its wake. 

Apart from the occasional headline about FAANG companies tussling with the new legislation, the practical impact of GDPR remains obscure. If you’re a stakeholder in a small-to-medium enterprise (SME), this is a big problem. Unlike Google and Facebook, SMEs are unlikely to have a bottomless legal budget to contest being found in violation of the GDPR. As a result, data compliance over the next five to ten years can quickly become a question of business survival. 

This guide is a starting point for understanding the implications GDPR has for these businesses. Let’s examine the document, chapter by chapter, to summarize its content and analyze the practical consequences for companies seeking compliance. 

1. Understanding the Key Terms

The GDPR begins by outlining the scope and subjects of its regulation. Chapter 1 covers Articles 1-4 of the document. 

The two most important points to note from this section are where the GDPR applies and to whom. Territorially, the GDPR applies to data processing by organizations operating within the EU, even if the actual processing occurs outside the EU. It also applies to organizations based outside the EU that offer goods and services to individuals inside the EU.

Controllers & Processors

So to whom does it apply? The GDPR applies to two parties: Data Controllers and Data Processors.

A Controller is a party that determines the purposes and means of personal data processing. For example, a beer company that doesn’t build commercial software but has a website that gathers users’ birth dates is a Data Controller. A Processor is a party that processes or operates on personal data on behalf of the Controller. 

Continuing our previous example, the entity our hypothetical beer company subcontracts to build the brand website is a Processor. Note that the GDPR still binds Data Controllers even if they are using an independent Processor for data collection, storage, or processing.

Finally, the GDPR seeks to regulate information that constitutes personal data: information that relates to an identifiable individual. Determining whether information “relates” to an individual is an exercise in judgment; one must consider both the content of the information and the purpose of processing such data. For most SMEs, it is advisable to err on the side of caution and treat any piece of user information, even if pseudonymized, as personal data unless explicitly advised otherwise by appropriate legal counsel.

2. Learning the Core Principles and Business Implications

Next come the GDPR’s foundational principles, covered in Articles 5-11. At the core of the GDPR is the provision that data collection must be lawful, fair, and transparent. Lawful, in this case, has two implications.

First, a business must proactively identify a lawful basis for collecting and processing user data; you cannot “shoot first and ask questions later.” Second, it must determine that the consequences of that processing are lawful. If a company has a legal basis for processing user data but uses it to do something illegal, it violates the GDPR.

Informed Consent

The lawfulness principle expands in Article 6, which lists the conditions under which data processing can be considered lawful. “Informed consent” is the essential condition to be aware of: it’s the principle under which many companies derive a legal basis for collecting data on their users.

Informed consent must be specific and unambiguous. As a practical example, an online form with a consent option pre-selected by default violates the GDPR because the consent given is not unambiguous. The implications of informed consent are significant. 

Development and UX teams must work to structure their online data collection forms in a way that balances clean experience with legal compliance. Organizations must also build natural ways for consent to be withdrawn at any time; if users can’t remove consent as easily as they give it, the mechanism doesn’t meet GDPR requirements. A typical example of these requirements is the pop-up box asking users to consent to the use of cookies on a company’s website. Now ubiquitous, these boxes are a direct result of the GDPR.
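On the back end, these requirements suggest a consent record where consent defaults to absent (no pre-ticked boxes) and withdrawal is a single call, mirroring the “as easy to withdraw as to give” rule. A minimal sketch, with illustrative names:

```python
from datetime import datetime

class ConsentRegistry:
    """Record per-purpose consent; absence of a record means no consent."""

    def __init__(self):
        self._consents = {}  # (user_id, purpose) -> timestamp of consent

    def give(self, user_id: str, purpose: str) -> None:
        # only an explicit, affirmative action creates a consent record
        self._consents[(user_id, purpose)] = datetime.utcnow()

    def withdraw(self, user_id: str, purpose: str) -> None:
        # withdrawal is one call, as easy as giving consent
        self._consents.pop((user_id, purpose), None)

    def has_consent(self, user_id: str, purpose: str) -> bool:
        return (user_id, purpose) in self._consents
```

Storing a timestamp per consent also helps demonstrate, on request, when and for what purpose consent was obtained.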

Fairness and Transparency

Fairness and transparency are the value-driven counterparts to “lawful.” Under the tenets of the GDPR, an organization must go beyond pure legal compliance, showing it has considered the impact of user data processing and found it justifiable. Organizations need open and honest approaches to data processing, and they must comply with requests from data subjects regarding their data, honoring the “right to be informed.”

What does this mean for an SME? It means the lawful, fair, and transparent collection of data doesn’t happen on an ad hoc basis. Organizations collecting user data must proactively examine each category of data they want to collect and evaluate whether it is consistent with the fundamental principles of the GDPR. 

Organizations must ensure systems are in place to signpost when and how data is being collected, meeting the transparency requirement. They must also receive and respond to requests from their users regarding personal data processing. 

More Core Principles

Compliant development teams are mindful of the following core principles: 

  • Purpose Limitation. You must limit your data collection to data that serves your intended purpose and explain it to the user in plain English.
  • Data minimization. You must keep the data collected to a minimum for serving your intended purpose. You can’t collect data on the “off chance” that it serves your purpose. It must be explicit and necessary for your use.
  • Storage Limitation. There’s a time component to purpose limitation: organizations must not store personal data beyond the time needed to complete an intended purpose. This seemingly small requirement has significant implications for how business is done. Data can’t just be stored in perpetuity once collected; teams must build systems for the periodic purging of data and the re-obtaining of affirmative consent at regular intervals.
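The storage-limitation principle can be sketched as a scheduled purge job that drops records once their purpose’s retention window expires. The retention periods and record shape here are invented for illustration; real windows depend on the documented purpose and legal advice.

```python
from datetime import datetime, timedelta

# Illustrative retention windows per documented purpose.
RETENTION = {
    "marketing": timedelta(days=365),
    "billing": timedelta(days=2555),  # e.g. tax-record obligations
}

def purge(records: list, now: datetime) -> list:
    """Keep only records still within their purpose's retention window."""
    return [r for r in records
            if now - r["collected_at"] <= RETENTION[r["purpose"]]]
```

Run on a schedule (a nightly job, say), this keeps the data store from accumulating personal data past its justified lifetime.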

3. Understanding the Rights of the Data Subject

Having outlined the core principles, the GDPR turns in Articles 12-23 to the rights of the data subject. Many of these rights stem directly from the need for lawful, fair, and transparent data collection, but as we see in Chapter 3, these considerations cover new and significant territory. 

It is fair to say that the rights conferred on the data subject in this section have the most substantial impact on how SMEs build data infrastructure. In short, businesses must be prepared to liaise with data subjects regarding their data and to take certain kinds of corrective action on the data residing in their systems.

Right to Access

Chapter 3 stipulates that citizens have a right to access their personal data and to see how Controllers are processing that data. Practically, this means organizations must have mechanisms in place to quickly and comprehensively share an individual’s data with them on request. A business with a massive “data lake” of consumer information therefore violates the GDPR if it can’t efficiently pull and distribute individual records. 

Right to Erasure and Rectification

Chapter 3 confers additional rights on the data subject, including the all-important Right to Erasure and Right to Rectification. These are safeguards that protect citizens even if their data has been captured lawfully, fairly, and transparently. Rectification means that organizations must be able to correct inaccurate information about a data subject at the data subject’s request. The Right to Erasure means that a business must be able to provably delete all data related to a given individual if required to do so, by request or otherwise. These conditions point to the need for reliable infrastructure supporting the necessary capture and processing capabilities. 

Data Portability

Data Portability is less discussed in the media but equally impactful for individual businesses and the way they manage data. Article 20 of the GDPR stipulates that Controllers must make data available to subjects in a “structured, commonly used, machine-readable format.” For a small business, this means that if a Subject Access Request (SAR) comes in, the company needs to be able to turn around a response quickly in a directly transferable format. The artifact can’t be a printout or even a PDF; it’s more likely to be a file in CSV or JSON format that’s easily portable and can be opened and interpreted on the average citizen’s computer.
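Rendering a subject’s record in both formats is straightforward with standard library tools. A sketch, assuming a flat record; nested data would need a flattening step before the CSV export.

```python
import csv
import io
import json

def export_subject(record: dict) -> tuple:
    """Render one subject's record as (JSON string, CSV string)."""
    as_json = json.dumps(record)
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=record.keys())
    writer.writeheader()   # CSV needs a header row to be self-describing
    writer.writerow(record)
    return as_json, buf.getvalue()
```

Either output satisfies “structured, commonly used, machine-readable”: both open in ubiquitous tools and can be re-imported by another Controller.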

A further business consideration stemming from these fluid data requirements is building systems agile enough to respond to constant updating and extraction of data-sets. Development teams have to think carefully about data schemas, and about the versioning and specification of those schemas in the case of frequent changes.

4. Exploring the Obligations of Controllers and Processors

This chapter of the GDPR is chock-full of information with significant business implications, and it spans 19 articles, making it the lengthiest section of the GDPR. 

Here are the key points to take out if you’re dipping your toes into the data protection waters:

Data Protection by Design and Default

Article 25 addresses a core data management principle under the GDPR: Data Protection by Design and Default. In principle, it means organizations are obligated to take “appropriate” measures when collecting, storing, and processing data. In practice, it means privacy-by-design engineering is now a vital consideration for any dev team. Depending on the size of your team, a dedicated privacy engineer may or may not be feasible, but in any case, responsibility for privacy considerations must be delegated and prioritized among team members. 

Other measures that may be considered appropriate, taking circumstances into account, include pseudonymization of data, encryption of data, and routine system security checks. With these safeguards in place, the ability to notify relevant parties of a data breach should be straightforward. Note that the GDPR goes as far as codifying the obligatory response time for each party. 
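Pseudonymization, one of the measures just listed, can be sketched as replacing a direct identifier with a keyed hash: records stay linkable internally without exposing the identity itself. This is an illustration, not a complete scheme; in a real system the key would live in a secrets store, never in source code, and key compromise would undo the protection.

```python
import hashlib
import hmac

def pseudonymize(identifier: str, key: bytes) -> str:
    """Replace a direct identifier with a keyed, deterministic pseudonym."""
    # HMAC-SHA256: same identifier + same key -> same pseudonym,
    # so records remain joinable without storing the raw identifier
    return hmac.new(key, identifier.encode(), hashlib.sha256).hexdigest()
```

Note that under the GDPR pseudonymized data is still personal data; the technique reduces risk, it does not remove the data from scope.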

Organizations must notify the data subject immediately if there is a breach of their data, and they must inform the relevant supervisory authority within 72 hours. Has your business run a fire-drill to train for data breach response? If not, it should have! Under the GDPR’s requirements, no time can be lost aligning on the process.

Data Protection Officer

Lastly, Chapter 4 describes the role of the Data Protection Officer (DPO), a position becoming increasingly common among data-dependent businesses. If your business relies on processing large amounts of data (e.g., online behavior tracking), you’re required to appoint someone to this position. While the exact threshold for an obligatory DPO is still being hashed out via GDPR-related rulings, we recommend that businesses serious about data management proactively recruit for this position.

5. Understanding the Transfer of Data to Third Countries and International Organizations

Chapter 5 of the GDPR provides additional detail on data transfers involving parties outside or above EU jurisdiction. If a business seeks to transfer data to one of these parties, specific steps must be taken for the transfer to be sanctioned under the GDPR: namely, “appropriate safeguards” and vetting of the third-party organization with the relevant EU supervisory authorities. Even in the absence of a positive green light from those authorities, transfers are permissible if it can be proven that the appropriate safeguards are in place. 

Chapter 5 states that companies need to follow data protection best practices inside and outside of EU jurisdiction. GDPR ensures all data emanating outward from European-supervised entities gets transferred with due caution and security of data subject rights.

6-11. Understanding the Additional Detail Contained in the Remaining Chapters

The GDPR outlines most of its key terms, concepts, and prescriptions in the first five chapters. The back half of the regulation is less concerned with introducing new ideas and more with firming up the processes of compliance, enforcement, and sanctions. Nevertheless, this part of the document contains essential points to note due to their tangible business impact. 

Establish a Supervisory Authority

Chapter 6 calls for the establishment of at least one supervisory authority in each EU Member State. These authorities monitor and enforce GDPR compliance in a given country, and businesses in that country submit annual reports proving GDPR compliance. SMEs, therefore, should look to incorporate streamlined reporting capabilities as part of their data operation. Chapter 7 describes in further detail how these supervisory authorities are to cooperate to promote EU-wide GDPR compliance.

Penalties

Chapter 8 of the GDPR breaks down compliance processes and the penalties imposed for failing to comply with GDPR rules. We recommend that all critical stakeholders in SME data operations read through these articles in detail. If your business needs more convincing of the unique and financially significant consequences of taking the GDPR lightly, remember: GDPR violations can result in fines of up to 4% of the business’s annual global turnover. This can amount to billions of dollars, as recent GDPR cases involving the FAANG companies have demonstrated. Forewarned is forearmed!
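For a sense of scale, Article 83’s upper tier is the greater of EUR 20 million or 4% of worldwide annual turnover. A quick illustration with hypothetical turnover figures:

```python
def max_fine_eur(annual_turnover_eur: float) -> float:
    """Upper-tier GDPR fine cap: the greater of EUR 20M or 4% of turnover."""
    return max(20_000_000, 0.04 * annual_turnover_eur)
```

So even a business with EUR 100 million in turnover faces exposure capped at EUR 20 million, while a multinational with EUR 2 billion in turnover faces up to EUR 80 million.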

Outstanding Business Items

Finally, Chapters 9-11 provide a final tidy-up of outstanding items of business, including some discussion of exceptional data cases and the adoption of different member state data measures. Development teams and other SME stakeholders need not focus on this part of the document, especially when they’ll need to work so hard to process and incorporate all of the detailed instruction that has come before.

Conclusion

In conclusion, the GDPR is a significant and wide-ranging piece of legislation that will have a big impact on the business and technology landscape. The many implications of the document may seem daunting, but if you’ve made it to the end of this guide: congratulations. You’re now significantly better informed on the steps you need to take to get data compliant. Now it’s time to round up the key players in your business – developers, management, marketing teams, and more – and start to game-plan for the changes that lie ahead.

Published from our Privacy Magazine – To read more, visit Privacy.dev