The Deep Privacy Challenge of Doing DPIAs Well

Data Protection Impact Assessments are the sleeping giants that lie deep in the GDPR. Doing DPIAs well requires organizations to commit to responsible data management at a genuinely deep level. That’s one of the reasons they are so challenging.

DPIAs: Why Do They Get Overlooked?

If one were to poll a sample of business, technical, and marketing professionals on “GDPR provisions that keep you up at night,” it’s likely DPIAs wouldn’t make the top three. There are flashier aspects of the GDPR: consent management, the right to object, data subject requests. Since these are the elements most frequently in the headlines, they tend to take up the most space on a business’s priority list. 

But DPIAs represent the biggest challenge to most businesses in their present state. And for that reason, establishing a DPIA process that adheres to the GDPR guidelines is a key indicator that a business is making a deep, meaningful commitment to data privacy. 

How Does A DPIA Work, Exactly?

For the uninitiated, here are the basics of a DPIA. It’s intended to let a business analyze and minimize the privacy risks of a processing activity. Under the GDPR, businesses must conduct a DPIA when undertaking a range of data processing activities, from monitoring public places to using innovative technologies to processing biometric data. You can read more about the circumstances in which a DPIA is legally required here. 

The Assessment itself is a multi-step process that involves coordination across a number of teams. The ICO describes the following nine steps as essential:

  • Identify the need for a DPIA
  • Describe the processing
  • Consider consultation (with your Data Protection Officer or relevant authorities)
  • Assess necessity and proportionality
  • Identify and assess risks
  • Identify measures to minimize risk
  • Sign off and record outcomes
  • Integrate outcomes into plan
  • Keep under review
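
As a rough illustration (this is our own sketch, not part of the ICO guidance), a privacy team might track these nine steps as a per-activity checklist. The step names below are our shorthand:

```python
from dataclasses import dataclass, field
from typing import Optional, Set

# The ICO's nine DPIA steps, in order (names are our own shorthand).
DPIA_STEPS = [
    "identify_need",
    "describe_processing",
    "consider_consultation",
    "assess_necessity_and_proportionality",
    "identify_and_assess_risks",
    "identify_risk_mitigations",
    "sign_off_and_record",
    "integrate_outcomes",
    "keep_under_review",
]

@dataclass
class DPIARecord:
    """Tracks DPIA progress for a single processing activity."""
    activity: str
    completed: Set[str] = field(default_factory=set)

    def complete(self, step: str) -> None:
        if step not in DPIA_STEPS:
            raise ValueError(f"unknown DPIA step: {step}")
        self.completed.add(step)

    def next_step(self) -> Optional[str]:
        """Return the first outstanding step, or None when all nine are done."""
        for step in DPIA_STEPS:
            if step not in self.completed:
                return step
        return None

record = DPIARecord("new retargeting tool")
record.complete("identify_need")
print(record.next_step())  # prints: describe_processing
```

The point of the sketch is simply that a DPIA is sequential and stateful: an activity isn’t done until every step, including ongoing review, is accounted for.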

 

Why DPIAs Are Such A Deep Challenge

The purpose of this article isn’t to walk through how a DPIA exercise should be conducted step by step; the ICO has already published an excellent guide here. Rather, it’s to point out what a challenge DPIAs pose for most businesses in their present state. Put simply, if most businesses did DPIAs the way they’re supposed to, the result would be a productivity nightmare. 

Within a modern large business, there could be hundreds of processing activities every year, and under the GDPR many will require a DPIA. But the vast majority of businesses lack the processes or technology to perform them quickly; they are handled entirely manually. The result is not pretty. Members of the dev team email the legal department to set up a meeting, present a proposed activity, and together fill in half of a DPIA template form. Then a question comes up. The legal team consults with enforcement authorities for clarity, and the response takes a week to arrive. Meanwhile, developers are bottlenecked, unsure whether they can proceed until getting clearance from legal. And the marketing team awaiting delivery of their snazzy new retargeting tool grows frustrated. Multiply this scenario by a hundred cases a year, and the efficiency costs that DPIAs represent to many organizations become clear.

Conclusion: Is “Managed Risk” Actually Manageable?

Given this, it’s not surprising that many businesses opt to take a “managed-risk” view of DPIAs. Perhaps that represents the best of a bad bunch of options. With a fully manual process, the efficiency cost of compliance can look disastrously high. 

But enforcement around GDPR is picking up. What’s more, consumers are beginning to expect higher standards of privacy practice. As time passes, the cost of DPIA non-compliance will rise steeply. And businesses that decide they can’t afford deep privacy measures today may find the long-run cost of their inaction significantly higher.

 

The Divided States of America(n Data)

This Is Why We Can’t Have Nice Things – Like Our Own Version of GDPR.

The American Data Divide

Across the ocean, a much-publicized piece of holistic privacy legislation called the GDPR has transformed the relationship between citizens, businesses, and personal data. In 2019 it’s time to ask: why can’t the USA produce its own unified piece of federal data privacy regulation?

Data regulation in the United States is still a work in progress. At present it’s a patchwork quilt split along state and industry lines, and for most consumers, it’s impossible to penetrate. Businesses are similarly hamstrung by the lack of harmonized regulation. Those that decide to play by the rules burn copious resources and frustrating man-hours just to understand what those rules are. And even after expending all that effort, many (if not most) businesses still struggle to be compliant.

The Roadblocks to Reform

Why can’t Congress do something about it? The short answer is that there just hasn’t been enough momentum to get something passed federally. The FTC has long recommended that Congress enact a comprehensive set of privacy laws. The Obama administration, in its early days, even put forward a set of proposals for a Consumer Privacy Bill of Rights. Privacy practitioners lauded the document, but it quietly died as Silicon Valley ingratiated itself into the D.C. political machine over the first half of the decade. And although the new president is an avid social media user, the Trump administration has shown little appetite for data regulation.

It’s also possible to make a deeper cultural reading of the different data trajectories of the US and EU. The European Union has been, since its inception, a body with the power to legislate dynamically in reaction to the world around it. US legal and political culture, on the other hand, remains staunchly Constitutionalist. Legislating for an issue like data privacy, nonexistent at the time the Constitution was written, can be slowed by the challenge of remaining faithful to the spirit of a document that’s over 200 years old.

The Prospects for Change

However, in 2020 there will be a presidential election and possibly a new administration in the White House. Have the dynamics changed sufficiently to inspire another tilt at federal regulation? The voting population seems more concerned than ever about the way companies use personal data. However, a vocal watchdog organization (à la MADD or the NAACP) has yet to emerge. We’ll return to this later.

The real change that’s taken place lies in the business community. Among business leaders, regulatory certainty is emerging as a key concern – even beyond getting favorable laws. Businesses just want the rules of the game to be consistent. And there’s a deeper acceptance that federal laws represent a huge efficiency improvement over the uncertainty and instability of state-by-state regulation. 

One unified piece of legislation would provide a single target on which to concentrate lobbying efforts, debate, and discussion. Consequently, many business leaders are already urging Washington to take action. Earlier this year 51 CEOs from some of the biggest tech and industrial companies in the world signed an open letter to Congress urging them to act on a “comprehensive consumer data privacy law.” 

Will Citizens Step Up?

Were it up to these business leaders, a federal data law would be a done deal. But legislators appear wary of acting while there’s an empty seat at the table. If anything is slowing federal data regulation down in 2019, it’s the lack of a high-profile citizens’ rights group that could sit down with political and business leaders and get the ball rolling.

To conclude, the landscape looks to be more conducive to a federal data privacy law in 2019. But wondering “why doesn’t it exist yet?” may be the wrong question for individual citizens to be asking. In the absence of a highly-invested consumer protection lobby in Washington DC, the correct question to ask may be: “how can we get a seat at the table?”

What’s the Difference Between Data Security & Data Privacy?

“Data Privacy” and “Data Security” are two terms that are sometimes used interchangeably, especially by those who aren’t in the field of data protection. Within the field, however, they mean two very different things. Understanding the relationship between them is essential for grasping the complexity of regulatory compliance. This article is a quick primer that illustrates how privacy and security differ and how they work together as building blocks of everyday data operations.

Data Security vs Data Privacy

In simple terms, security means protecting data against unauthorized access, while privacy is about managing and defining authorized access. Data security is a technical matter that involves building robust defense mechanisms into your digital infrastructure. Data privacy is a legal and policy matter: it governs who may access data, for what purposes, and under what rules.

One of the most important relationships to note is that data privacy presupposes security. The GDPR doesn’t contain prescriptive instructions for how organizations should fortify their networks, because its privacy provisions can only be honored if data security is already in place. If a cybercriminal steals someone’s PII, that person’s privacy rights have self-evidently been violated.

So, data privacy assumes data security. Does the reverse hold? Does data security include data privacy? No, yet organizations often fall into the trap of making this assumption, and in doing so they skip necessary regulatory compliance steps.

Conclusion

It’s not enough to protect data from outside attacks. Managing and enforcing internal permissions – i.e., managing privacy – is a vital piece of the puzzle for any business to be compliant with the latest data regulation. Internal privacy controls can be complicated and time-consuming in a large company. Something as simple as employees copying files onto personal flash drives can sink a carefully constructed operation. However, the effort to keep data processes watertight is an essential cost of doing business in 2019. Moreover, the cost of failing to invest in both security and privacy can prove disastrous.

An Overview of States Passing Privacy Laws

Any Intro to Civics course teaches that lawmakers exist to enact the will of the people. And since “the people” have recently become very concerned with the security of their data and the privacy of their online activity, it’s reassuring to see the recent nationwide bloom of state-based digital privacy legislation.

California’s CCPA got the headlines because of the size of the market and the easy comparison to Europe’s GDPR. However, in other states across the country, legislators have quietly passed, or are in the late stages of passing, bills that parallel California’s privacy law. In some cases, the measures are even more far-reaching. This article examines recent legislative updates in Nevada, New York, Vermont, South Carolina, and Colorado. It demonstrates that privacy regulation is not confined to the West Coast and is very much a US-wide concern.

DISCLAIMER: It’s important to note that the landscape is rapidly evolving in the area of privacy regulation. It’s a dynamic, exciting area. So even though what follows is an accurate synopsis of the state of play in late September 2019, don’t be surprised if this list gets dated quickly. As always, this article shouldn’t be interpreted as actual legal advice!

Nevada: Senate Bill 220

Nevada has already passed a new piece of privacy legislation, Senate Bill 220. It will go into effect on October 1, 2019, three months before its better-known neighbor enacts the CCPA. Many observers believe Nevada’s law is more onerous. It requires a broader range of businesses to offer consumers an opt-out regarding the sale of their personal information. And since it goes into effect before the CCPA, Senate Bill 220 will be the first law in the U.S. to grant opt-out rights to consumers.

In some respects, Nevada’s bill is a little more lenient than the CCPA; for instance, it doesn’t add new notice requirements for website owners. However, the per-violation fine is $5,000 – twice as high as California’s. So getting privacy wrong within Nevada’s state lines could prove even more costly to a business.

New York: Stop Hacks and Improve Electronic Security (SHIELD) Act

New York signed the SHIELD Act into law on July 25, 2019, and the bulk of its provisions go into effect on October 23, 2019. The SHIELD Act is more incremental in scope than the other pieces discussed previously. It doesn’t carry any language around opt-out rights, and it’s less concerned with day-to-day online activities. Instead, it focuses on defining and setting processes around actual data breach events.

To this end, the SHIELD Act expands the scope of covered information (to include biometric information) and the scope of possible breach scenarios. It also updates the procedures that companies must follow in the event of a data breach. Lastly, the SHIELD Act creates data security requirements that scale according to the size of the business. This part of the Act goes into effect on March 21, 2020.

It’s also worth noting the New York Privacy Act (NYPA), a bill whose provisions were considered even more aggressive than the CCPA. That bill is currently dead, or at least on hold, having stalled in the face of heavy industry lobbying.

Vermont

Vermont became the first state in the union to regulate “data brokers” with a piece of legislation, which came into effect on January 1, 2019. Vermont’s law has a comparatively narrow application. In its case, “data broker” denotes “a business or unit/s of a business, separately or together, that knowingly collects and sells or licenses to third parties the brokered personal information of a consumer with whom the business does not have a direct relationship.” This direct-relationship provision means that a business selling directly to consumers online, for example, is not bound by the constraints of this law.

That said, once an entity is considered a data broker, there are quite rigorous processes that must be followed. Data brokers must register annually with the Vermont Secretary of State for a fee of $100 and provide a substantial amount of information to the state regarding the robustness and security of their data operations. Failure to do so can result in a fine of up to $10,000 per year. 

South Carolina

South Carolina has also joined the cohort of states taking data protection into their own hands, with a law that came into effect on January 1, 2019. The South Carolina Insurance Data Security Act is focused on the insurance sector and seeks to establish standards and processes that insurers – deemed licensees – must follow in the event of a cybersecurity breach. 

Licensees are now legally required to formally document a data security program. After conducting a thorough audit and risk assessment of their operations, licensees must produce a plan that covers risk management, cybersecurity event investigation and reporting, notification, and ongoing compliance certification. 

Colorado

Lastly, we come to Colorado, which was the very first state to put a signature modern digital privacy law into effect. HB 18-1128 requires organizations to put controls in place for managing PII (including biometric data). The required controls fall under these broad areas:

  • The storage of PII
  • The destruction of physical and electronic materials that contain PII
  • Investigation and notification in the event of data breaches
  • Liaising with the Colorado attorney general in the investigation and reporting in certain data breach circumstances

Conclusion

This brief overview shows that data privacy isn’t just a concern for businesses operating in California, despite what the news headlines would lead one to believe. Data privacy should be treated as a United States-wide concern for any business, as the trend is very visibly towards state-by-state regulation, with each law broadly consistent in theme but varying in focus and scope. The complexity will only increase as more states get up to speed on the topic. It’s worth noting that while state-by-state regulation is the short-term trend, the conversation about a federal law has already started, precisely to avoid an ever more complex patchwork of state rules. 

Published from our Privacy Magazine – To read more, visit privacy.dev

Governments & Privacy

When the words “government” and “privacy” are put side by side, the knee-jerk reaction is usually negative. Since the days of Orwell, governments have been poking their noses into citizens’ business, and history suggests the association is not without merit.

Protectors of Privacy Rights

In the last decade, whistleblowers like Edward Snowden have shown that the communication boom of the internet era was accompanied by an increase in government monitoring and privacy abuses, by the likes of the NSA, the Department of Homeland Security, and other bureaus. A charitable explanation of these practices is that, like many during the era, these actors didn’t fully grasp the cost and legal implications of the shiny new toys they could access. The less charitable explanation is that they did grasp them, but didn’t care enough to stop.

Nevertheless, the fact remains that government institutions are the most important protectors of the digital privacy rights of individual citizens. Businesses must play by the rules that governments make. And digital privacy has become a critical governmental concern in recent years, directly reflecting the concerns of the general populace.

A high-profile case in point was Mark Zuckerberg’s congressional testimony in April 2018. Zuckerberg was called in front of Congress to speak on his company’s questionable data practices, particularly relating to the 2016 presidential election. The hearings made two things clear: first, there was a newfound abundance of concern and regulatory intent among the elected officials questioning Mr. Zuckerberg. Second, there was a striking lack of understanding or technical know-how from the same officials. The majority of these legislators are not digital natives, and even if they were, understanding the finer points of digital privacy in this day and age requires time and attention to detail that no legislator could realistically afford.

Future-proof Data

At Ethyca, we accept that, warts and all, governments are the chief protectors of digital privacy. However, in the fast-moving technology sector, they will always be playing catch-up. For SMEs, particularly those that aren’t digital-first, this creates a nightmare scenario of repeated, costly infrastructure overhauls. A one-time, future-proof data infrastructure upgrade is an investment that, over longer periods, can prove very shrewd indeed.

How Online Experience Varies by Purchasing Power

When people discuss issues with data privacy, class is rarely part of the conversation. Even though the internet has been a markedly business-driven project for some years now, the old perception endures that URL life isn’t marked by the same dividing lines as IRL society. This is false, and the realization that data privacy is inextricably tied to economic status is becoming more widely accepted.

Predatory Advertising

As the old technology adage goes: when the product is free, you are the product. Nowhere is this truer than online. Those with less disposable income are prone to having their data leveraged in a more aggressive and potentially predatory fashion than the affluent. Under previously lax data regulation, the robust flow of third-party data meant that advertisers could know with near-certainty which online users might be vulnerable to risky purchase propositions. In other words, they could target and exploit weak consumers with impunity.

A recent New Republic article highlighted some of the industries engaged in predatory online advertising practices. Among the culprits are bookmakers, payday loan companies, and for-profit colleges. It cites author Cathy O’Neil’s claim in the book Weapons of Math Destruction: “A potential student’s first click on a for-profit college website only comes after a vast industrial process has laid the groundwork.”

Advertisers can use anything from Google search history to educational questionnaire data, targeting individuals at their moment of peak susceptibility. It’s not that advertisers couldn’t use these techniques to target more affluent consumers. It’s that more affluent consumers are less driven to make such risky purchases, which are often born of economic desperation. Furthermore, poorer consumers are more likely to have their information washing around ad-targeting databases, because they’re more likely to fork over data in exchange for free access. The net result is, in the words of Michael Fertik, “the rich see a different internet than the poor.”

Higher Standards of Privacy

Through this lens, one begins to understand the impact of recent and forthcoming data regulation. It isn’t a flat line across classes: it should work to disproportionately decrease the vulnerability of poorer online consumers, precisely because they are the most exposed to exploitation in the first place. Governments will continue to increase control over the use of data, and companies will be less able to license third-party data without consumers’ knowledge. Combine both of those with increased penalties for data processors that violate their rights, and consumers will be less susceptible to predatory advertising and more in control of the data that they hand to companies.

Of course, no one assumes that new data regulation will magically turn profit-seeking enterprises into virtuous pursuers of the highest common good. However, we at Ethyca believe that organizations showing commitment to a higher standard of privacy protection will be rewarded in the long run by increasingly data-savvy consumers. With this in mind, beyond legally required data practices, we recommend that companies make an effort to spell out all the data processing activities that they undertake on owned properties – to actively educate, in other words. Here at Ethyca, we settled on a “Nutrition Table”-style visualization that we think is crisp and instructive. Got a better idea for keeping users informed? Feel free to describe it in the comments! 

A Framework for Privacy Risk Self-assessment

With the recent raft of worldwide privacy legislation, and much more to come, organizations of all shapes and sizes are being forced to evolve the way they do business. Those SMEs that can’t bring their operations into compliance with the GDPR, CCPA, and other data privacy laws worldwide will be at a significant competitive disadvantage, and may even find that continued non-compliant operation is simply unsustainable. 

In this “adapt or die” scenario, the essential first step to getting compliant is for SMEs to perform a rigorous self-assessment of their present-state data operation.

There are three basic formats to self-assessment:

  1. Business units can analyze their practices.
  2. Different groups within the agency can review and analyze each other.
  3. A single appointed party can assess each unit in the business.

At Ethyca, we believe in empowering a Data Protection Officer to be a real focal point for all data-related business operations. So if scale permits, we recommend delegating full responsibility for the exercise to a DPO. Of course, each organization’s privacy self-assessment will be inherently different. However, the following aims to provide a framework that will serve as an excellent starting point for any business looking to evaluate its path to data privacy compliance: 

First: Plan the Objective of the Assessment

Is your organization trying to determine whether existing policies ensure regulatory compliance? Deciding the specifics of what to assess is a critical first step. 

Second: Conduct a Personal Information Inventory Check Across All Business Units 

This involves answering the following questions: 

  • What personal information does the business unit collect?
  • How do you collect personal information and in which situations?
  • Why do you collect personal information?
  • Who in the company uses personal information?
  • Who has access to it?
  • Where and how do you store personal information?
  • What methods are used to ensure it is secure?
  • Is it disclosed outside the company? If so, to whom and why is it disclosed?
  • How long is the personal information kept, and when and how is it disposed of?
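
One lightweight way to operationalize this questionnaire is to capture each business unit’s answers as a structured record that the DPO can aggregate and audit. The sketch below is a hypothetical illustration; the field names are our assumptions, not a prescribed schema:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class PIInventoryEntry:
    """One business unit's answers to the personal information inventory questions."""
    business_unit: str
    data_collected: List[str]       # what personal information is collected
    collection_methods: List[str]   # how and in which situations
    purpose: str                    # why it is collected
    internal_users: List[str]       # who in the company uses it
    access_roles: List[str]         # who has access
    storage_description: str        # where and how it is stored
    security_measures: List[str]    # methods used to keep it secure
    external_disclosures: List[str] = field(default_factory=list)  # to whom and why
    retention_days: int = 0         # how long it is kept (0 = undocumented)
    disposal_method: str = "unspecified"  # when and how it is disposed of

def undocumented_units(entries: List[PIInventoryEntry]) -> List[str]:
    """Flag business units that haven't documented retention or disposal."""
    return [
        e.business_unit
        for e in entries
        if e.retention_days == 0 or e.disposal_method == "unspecified"
    ]

marketing = PIInventoryEntry(
    business_unit="Marketing",
    data_collected=["email", "browsing history"],
    collection_methods=["newsletter signup"],
    purpose="campaign targeting",
    internal_users=["growth team"],
    access_roles=["marketing_admin"],
    storage_description="CRM, hosted in-region",
    security_measures=["encryption at rest"],
)
print(undocumented_units([marketing]))  # prints: ['Marketing']
```

A record like this makes gaps visible: any unit that can’t fill in a field has found work to do before the business can claim compliance.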

Only by answering these questions can businesses understand the work needed to bring themselves into a state of regulatory compliance. It’s vital that the DPO cross-check these answers against provisions in the GDPR, CCPA, and other relevant pieces of regulation, in active cooperation with internal or retained legal counsel proficient in privacy law. The exercise should result in a set of tasks or processes to accomplish to reach the desired level of privacy compliance. 

Last: Review Past Privacy Complaints 

Finally, we recommend reviewing privacy complaints as part of a privacy self-assessment, especially those that have arisen in the recent past; three years is a sufficient window. This will give you insight into where potential privacy pain points exist between your business and the consumer, so you can pay extra attention to those areas as you’re revamping them to be regulation-compliant. And if your organization doesn’t keep logs of such complaints, congratulations! You’ve uncovered another process that needs revamping to survive in the new competitive landscape! 

Ethics & Trust in Tech: Thought Leadership

Across the tech sector, there’s widespread consensus that a trust deficit threatens to undermine the current business model of quality, ad-supported content. This mistrust exists between data subjects, data controllers, and data processors (to use GDPR parlance). Users don’t trust that the sites they visit are behaving responsibly with their data. In turn, those sites can’t be sure that the infrastructure which allows them to monetize is doing the same.

A Pain Point / An Opportunity

A recent AdWeek interview with Chetna Bindra, Google Senior Product Manager for User Trust, Privacy, and Transparency, gives fresh insight into how one of the world’s biggest data brokers sees the future of privacy. Bindra’s interview is chock-full of interesting nuggets. To her, data privacy is both the most significant pain point and the biggest opportunity for tech companies in the coming years. She says: “We need to find a way for users to continue to access ad-supported content on the web while also feeling confident their privacy is protected… If transparency is a pain point, it’s also an opportunity.”

The point Bindra makes is a crucial concern for us here at Ethyca as well. We believe that previously lax standards around data privacy were a bug, not a feature, of the internet era. Now that legislation like the GDPR and CCPA is coming into effect, companies are compelled to operate at a higher standard of transparency around data management. Though it may be a short-term challenge to implement, we believe that’s ultimately a win for everybody.

Bindra lays out a vision of how an online ecosystem should work when she says: “Users need to feel like they’re getting value [in exchange for their data] and advertisers need to be able to reach people interested in what they have to offer.”

From Outdated Process to High Standards

This is an argument I make to data regulation skeptics frequently. The fact remains: current ad targeting practices, particularly as large corporations and SMEs increasingly rely on programmatic buys, aren’t anywhere near the platonic ideal of “reaching the motivated consumer when they are likely to purchase.” One of the main reasons is that a non-regulated data ecosystem, which allows the buying and selling of second- and third-party data sets without users’ affirmative consent, is never going to yield targeting models as precise as those built on well-curated, owned, responsibly managed consumer data. The old programming adage GIGO – “Garbage In, Garbage Out” – springs to mind.

So, Bindra isn’t being utopian when she speaks this way about the future state of online data privacy; she’s talking about the impact on advertising. The world she describes should be a natural consequence of companies moving from outdated processes of data management to running at the highest globally compliant standard. There’s no need for SMEs to feel intimidated by this prospect. It should be clear that in the long run, better data practice will be good for business.

How To Assess Vendors For Data Privacy Compliance

When small-to-medium enterprise (SME) team members consider how the business landscape is changing in response to increased data privacy regulation, the procurement process is not usually high on their list of answers. However, SMEs focusing too narrowly on in-house practices miss a key point. Both the GDPR and CCPA place new responsibilities on data controllers – in other words, the company or other body that determines the purpose and means of personal data processing. Controllers need to ensure all third-party vendors who touch their data are behaving in a compliant manner. 

In short, the controller continues to hold responsibility for compliance, even when outsourcing processing duties. In-house compliance alone will not suffice. It’s now incumbent on SMEs to ensure that the vendors they work with also adhere to worldwide privacy standards. 

Furthermore, the auditing process should optimally take place upfront, at the procurement stage. Contracts signed without the requisite due diligence can be difficult to back out of later, especially if it’s revealed that a third-party vendor is operating in a non-compliant fashion. Businesses with deep existing ties to third-party vendors may not be able to start this audit process at the procurement stage, but experts highly recommend that existing relationships be revisited and assessed from a compliance perspective. 

With all that said, here are some of the questions that all SMEs should be asking their partners, whether it be during procurement due diligence or in the revisiting of an existing relationship:

First: Does the vendor have a Data Protection Officer?  

Under the GDPR, DPOs are now legally required for companies processing large amounts of data. It’s almost a certainty that vendors who specialize in data processing infrastructure are operating at a scale that necessitates a DPO. Failing to cover this necessary compliance measure should be a disqualifying red flag in any SME’s procurement process.

Second: How often are the vendor’s policies for storing and processing data on behalf of partners reviewed and updated? 

Data compliance is a rapidly evolving field. A telltale sign that a vendor lacks data privacy rigor is the absence of a process for regular policy updates. This field is the opposite of “set it and forget it.” SMEs should be on the lookout for this when auditing vendors for suitability.  

Third: Does the vendor use sub-processors for the work they do on your behalf? 

If so, what measures have they taken to ensure those entities operate in a compliant fashion? The data privacy chain extends to every processor operating under the data controller's umbrella, including "partners of partners." If a vendor relies on others to help do its work, it should be able to demonstrate those partners' compliance. 

Fourth: Does the vendor have tools in place to rapidly identify and communicate a data breach? 

Under the GDPR and CCPA, data controllers now have a strict obligation to respond to data breaches concerning their data subjects; the GDPR, for instance, requires notifying the supervisory authority within 72 hours of becoming aware of a breach. If third-party vendors are slow to recognize and report a violation, controllers may have no chance of handling the breach in a compliant fashion. Reaction and response time is therefore a crucial concern when evaluating a partner for suitability. 

Last: What happens to data subjects' information at the end of the partnership? 

Without a clear-cut process for erasing subject data in a compliant fashion, a data controller may be stung by vendor negligence even after the business relationship has ended. For this reason, it is essential to have data sunsetting processes built into third-party agreements upfront. Otherwise, controllers have no legal recourse if vendors mistreat their data after the contract is complete. 
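A data sunsetting clause of this kind can be sketched as a simple retention rule: once the partnership's end date plus an agreed grace period has passed, any subject records the vendor still holds must be purged. This is a hedged illustration only; the record layout, function name, and the 30-day grace period are assumptions for the sketch, not terms prescribed by GDPR or CCPA.

```python
from datetime import date

# Illustrative assumption: contracts allow a 30-day wind-down period
# after the partnership ends before erasure is mandatory.
GRACE_PERIOD_DAYS = 30

def purge_expired(records, partnership_end, today):
    """Return only the records a vendor may still hold.

    Once the partnership's end date is more than GRACE_PERIOD_DAYS in
    the past, every subject record tied to it is dropped (erased).
    """
    days_since_end = (today - partnership_end).days
    if days_since_end > GRACE_PERIOD_DAYS:
        return []  # contract ended and grace period elapsed: erase all
    return records
```

In practice the erasure itself would be a hard delete against the vendor's stores (plus backups), with the completed purge logged so the controller can evidence compliance.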

Published from our Privacy Magazine – To read more, visit privacy.dev

How Does Data Privacy Affect My Job?

In technology, change is constant. Professionals working in tech are called on to integrate new processes and ways of thinking to stay abreast of their field. A case in point is data privacy.

If you entered the workforce a decade ago in any number of tech-related tracks, privacy and processes to protect users were topics of passing interest. Today, the emergence of the GDPR, the CCPA, and other landmark pieces of legislation has elevated data privacy to a pivotal concern in the development space and beyond. 

This article provides a quick-hit synopsis of how the renewed focus on user data privacy impacts different roles in technology organizations in jurisdictions around the world.

DevOps

Teams that stay compliant incorporate privacy considerations into the development process while simultaneously balancing ongoing pressures for speed and agility. 

The SANS Institute suggests several best practices that DevOps teams can adopt to stay compliant while continuing to work efficiently. Here are the most crucial:

  • Streamline access control. Ensure that only authorized users can access sensitive information. Session management tools like tokens and timeouts should be used to protect against unwanted access.
  • Error handling & logging. Store data logs securely and track all administrative activity, as well as all inbound and outbound data processing activities.
  • Practice continuous integration. Build authentication, password management, and other security features into code. In addition, incorporate automated security scanning into the delivery process.

UX

The fundamental principle that has emerged in the UX space is “privacy by design.” 

In the 1990s, Dr. Ann Cavoukian developed these seven principles to embed data privacy features into the very fabric of a software product. The GDPR's framers regarded Dr. Cavoukian's work so highly that they made "privacy by design" a foundational tenet of their legislation. 

Listed below are the seven principles of privacy by design, which UX professionals must now incorporate into their work:

  • Proactive, not reactive: preventative, not remedial
  • Privacy as the default setting
  • Privacy embedded in the design
  • Full functionality: positive-sum, not zero-sum
  • End-to-end security: full lifecycle protection
  • Visibility and transparency: keep it open
  • Respect for user privacy: keep it user-centric
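The second principle, privacy as the default setting, translates directly into code: a new account's preferences should start at the most protective values, with data sharing happening only through an explicit opt-in. The sketch below illustrates that idea; the class and field names are illustrative assumptions, not part of Cavoukian's framework.

```python
from dataclasses import dataclass

@dataclass
class PrivacyPreferences:
    """Privacy by default: every sharing option starts disabled."""
    analytics_tracking: bool = False  # off until the user opts in
    marketing_emails: bool = False
    profile_public: bool = False

    def opt_in(self, setting: str) -> None:
        """Enable a single setting via an explicit, named user action."""
        if not hasattr(self, setting):
            raise ValueError(f"unknown setting: {setting}")
        setattr(self, setting, True)
```

Because the defaults live in the type itself, no code path can accidentally create a user with sharing enabled, which is exactly the "no action required by the individual" guarantee the principle asks for.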

Product

Product managers are more responsible than most for ensuring their organization heeds new privacy regulation. Above all, they are responsible for product quality, and a product running in a non-compliant way is, by definition, a defective product. 

Fortunately, product managers have resources across the organization to ensure they are staying up-to-date with privacy reform. In her guide to GDPR Mastery for Product Managers, Karen Cohen runs through a set of clearly defined organizational processes that should be employed to protect from privacy violations:

  • Work closely with legal teams. It’s their responsibility to understand the regulations and how they might impact your product. It’s the product manager’s job to translate their opinions into actionable steps for different stakeholders in the business.
  • Research and compare domain knowledge. Domain expertise is essential, and so is competitor research. How are other businesses in your sector handling information access requests? What do their opt-ins and opt-outs look like on-site? These are breadcrumbs your company can follow on the path to privacy success.
  • Establish clear ownership. If you have a big complicated product, a single person can’t have granular privacy oversight throughout the system. That’s why product managers need to establish clear roles and areas of ownership, as well as a structure of command to help support the activities of the GDPR-mandated Data Protection Officer. Building this company infrastructure from scratch is undoubtedly a challenge, but in the long-term, this purposeful delegation will beat the ad-hoc process every time. Above all, it makes your business less vulnerable to breaches and violations.

Published from our Privacy Magazine – To read more, visit privacy.dev