Archive for the ‘95 directive’ Category

The EU-US Privacy Shield – A “New Deal” for Safe Harbor?

Posted on February 2nd, 2016 by

At the Democratic National Convention in Chicago in 1932, as America seemed endlessly trapped within the depths of its Great Depression, Governor Franklin D. Roosevelt accepted his party’s nomination to run for President and promised the American people this:

I pledge you, I pledge myself, to a new deal for the American people.

This pledge – to lift the American people out of the economic troughs they had endured for years – helped Governor Roosevelt achieve office and become the next President of the United States.  Over the coming years, measures taken by Roosevelt under his “New Deal” program helped take the United States out of the Great Depression and restore it to economic glory.

This piece of history has obvious parallels with the news announced by the European Commission today that it has agreed a “new framework” (admittedly, not quite as catchy as a “New Deal”) with the United States for transatlantic data flows: US data exports have been in crisis since the Snowden revelations, the new framework promises to significantly benefit ‘the man on the street’, and this agreement is widely perceived as critical to US businesses and the US economy.

The effort taken to achieve this new framework has been simply monumental and, taken at face value, it’s cause for celebration.  But, as any lawyer will tell you, the devil is in the detail and today is only really part of the story…

What does the new framework provide?

To begin with, the EU and US have agreed a rebrand – Safe Harbor 2.0 will instead be called the “EU-US Privacy Shield.”  Critics will undoubtedly say that a “rose by any other name…” (or perhaps, less poetically, that “if it walks like a duck and talks like a duck…”), but the Commission has been eager to emphasize that the new framework has significant differences from the existing Safe Harbor.

In fact, the Commission takes great care in its press release not to even mention Safe Harbor, except to reference it very briefly for historical context purposes.  Announcing the EU-US Privacy Shield, Justice Commissioner Jourová said:

“The new EU-US Privacy Shield will protect the fundamental rights of Europeans when their personal data is transferred to U.S. companies. For the first time ever, the United States has given the EU binding assurances that the access of public authorities for national security purposes will be subject to clear limitations, safeguards and oversight mechanisms. Also for the first time, EU citizens will benefit from redress mechanisms in this area. In the context of the negotiations for this agreement, the US has assured that it does not conduct mass or indiscriminate surveillance of Europeans. We have established an annual joint review in order to closely monitor the implementation of these commitments.”

Just as Roosevelt’s New Deal was built upon the “three Rs” (relief, recovery, and reform), so too is the EU-US Privacy Shield built upon three core goals:

1.  Strong obligations on companies handling Europeans’ personal data and robust enforcement: U.S. companies wishing to import personal data from Europe will need to commit to robust obligations on how personal data is processed and individual rights are guaranteed. The Department of Commerce will monitor that companies publish their commitments, which makes them enforceable under U.S. law by the U.S. Federal Trade Commission. In addition, any company handling human resources data from Europe has to commit to comply with decisions by European DPAs.

2.  Clear safeguards and transparency obligations on U.S. government access: For the first time, the US has given the EU written assurances that the access of public authorities for law enforcement and national security will be subject to clear limitations, safeguards and oversight mechanisms. These exceptions must be used only to the extent necessary and proportionate. The U.S. has ruled out indiscriminate mass surveillance on the personal data transferred to the US under the new arrangement. To regularly monitor the functioning of the arrangement there will be an annual joint review, which will also include the issue of national security access. The European Commission and the U.S. Department of Commerce will conduct the review and invite national intelligence experts from the U.S. and European Data Protection Authorities to it. 

3.  Effective protection of EU citizens’ rights with several redress possibilities: Any citizen who considers that their data has been misused under the new arrangement will have several redress possibilities. Companies have deadlines to reply to complaints. European DPAs can refer complaints to the Department of Commerce and the Federal Trade Commission. In addition, Alternative Dispute Resolution will be free of charge. For complaints on possible access by national intelligence authorities, a new Ombudsperson will be created.

But what does this mean in practice?

Well, don’t break out the champagne just yet!

Reacting to today’s news, Jan Albrecht, the German MEP who fronted the European Parliament’s negotiations on the new General Data Protection Regulation, was quick to criticize on Twitter:

“Listen carefully: […] and US side will need ‘some’ weeks to get this into concrete legal wording. This is no ‘deal’!”

Even Edward Snowden weighed in:

“It’s not a ‘Privacy Shield,’ it’s an accountability shield. Never seen a policy agreement so universally criticized.”

Grumbling from critics aside, the bigger point to note is this: before any data transfers can take place under the new EU-US Privacy Shield, the European Commission first has to adopt a formal ‘adequacy’ decision (as it has done in the past for the old Safe Harbor and for model clauses).  It’s working on that now but, even before that can happen, it has to take advice from the Article 29 Working Party – and it’s probably a fair assumption that some members of the Working Party are less than charitably disposed towards any kind of US data transfers.

What also remains unclear is the status of current Safe Harbor certified companies – will they automatically be transitioned into the new EU-US Privacy Shield?  The commercial will to see this happen will be strong but, if the scheme is to succeed in achieving any kind of credibility, it’s difficult to see how this can really happen in practice – grandfathering in businesses under a discredited data transfer framework won’t do wonders to win over critics of the new framework.

Put simply: you’re not going to be able to rely on the EU-US Privacy Shield for data transfers for some time yet.  So don’t plan to do so.

Should we build our future data export strategy on the new Privacy Shield?

This really is a tough question to answer.  While a political and legal solution may have been found, at the end of the day that matters little if no one uses it.

And that’s the single biggest problem the EU-US Privacy Shield has to overcome.  Given that detail about the new Privacy Shield is scarce (limited pretty much to what’s been explained above); given that civil liberties groups are almost certain to challenge the EU-US Privacy Shield straightaway; and given that the CJEU Schrems ruling handed national DPAs the ability to investigate the ‘adequacy’ of data transfers made under any new Commission adequacy findings – including this new kid on the block – you have to ask: why would anyone want to use it?

Over the past 4 months, US companies have invested huge amounts of time and effort to transition their data exports over to model clauses from Safe Harbor.  Typically, the pressure to do so has been customer-led, with EU customers insisting that their US suppliers use model clauses if those suppliers want their business.  The reality is that the way businesses use data hasn’t changed, whether EU or US based; only the paperwork under which they use it.  The concern hasn’t been about surveillance, or better protection for data, or anything like that – it’s been about keeping the wheels of commerce turning.

With that in mind, and having invested all this effort to transition over to a new data export model (often necessitating securing significant budget from senior managers), why would US businesses wish to transition over again to the new EU-US Privacy Shield?  Especially if, after doing so, EU customers still refuse to accept it due to concerns that it may be challenged by data subjects or DPAs?  Ask yourself this: as an EU customer, would you accept a US supplier using the EU-US Privacy Shield without also providing some kind of ‘backup’ solution in the form of model clauses?

So no matter how much effort has been put into agreeing this framework for the EU-US Privacy Shield, the biggest challenge is yet to come: market acceptance.  There’s a real ‘hearts and minds’ campaign to be staged here to win over the doubters.  Without it, the EU-US Privacy Shield may find itself consigned to become nothing more than an interesting footnote in data export history.

But, to end on an optimistic note, there is one positive development: the Privacy Shield at least has the same spelling in the EU and the US.  EU privacy lawyers who have spent years cursing at their computers as their word processing software automatically corrects their spelling of “harbor” to “harbour” can at last breathe a sigh of relief…

Why you shouldn’t rely on consent for (most) data transfers.

Posted on January 13th, 2016 by

Some time back, I did a short post on LinkedIn warning my contacts not to rely on consent as a basis for legalizing their data transfers from the EU.  The post generated such an overwhelming response, both from those who agreed and from a few who disagreed, that I felt it merited a slightly longer explanation here.  So here goes:

The EU data export prohibition

As all readers of this blog will know, the EU Data Protection Directive prohibits transfers of personal data out of the European Economic Area (Art 25(1)) unless the transferring organization either:

(a) transfers the data to a territory deemed to provide ‘adequate’ protection by the European Commission;

(b) can show that a data export exemption applies; or

(c) puts in place a lawful data export solution with the recipient of the data (i.e. model clauses or BCR).

To date, only a handful of non-EEA countries have been declared adequate by the European Commission (i.e. point (a) above), and a list of those countries is available here.

This means that most data-exporting organizations are looking either to show that a data export exemption applies (point (b) above) or to show that they have a data export solution in place (point (c) above).
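For illustration only, the order of this analysis can be sketched in code. Everything below is hypothetical (the adequacy list is deliberately partial and may be out of date, and real transfer assessments are far more nuanced); it simply shows the sequence in which the three routes are considered, with the Art 26(2) solution checked before any narrowly construed exemption:

```python
# Illustrative sketch of the Directive's data export logic (Arts 25-26).
# All names are hypothetical; this is not legal advice or a real API.

# A partial, illustrative set of territories the Commission has declared
# 'adequate' - always check the Commission's official list.
ADEQUATE_TERRITORIES = {"Switzerland", "Canada", "New Zealand", "Israel"}

def export_permitted(destination, exemption_applies=False, solution_in_place=False):
    """Return the route, if any, under which a transfer may proceed."""
    if destination in ADEQUATE_TERRITORIES:
        return "(a) adequacy decision"        # no further solution needed
    if solution_in_place:
        return "(c) model clauses / BCR"      # Art 26(2) - the preferred route
    if exemption_applies:
        return "(b) Art 26(1) exemption"      # construe narrowly!
    return None  # transfer prohibited under Art 25(1)
```

Note that a transfer to the US with both a solution and an exemption available resolves to the model clauses/BCR route, reflecting the regulators' preference discussed below.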

Data export ‘exemptions’

Exemptions from the data export prohibition are set out in Article 26(1) of the Directive and – you can probably guess what’s coming – these exemptions include the data subject’s ‘unambiguous’ consent.  Put another way, if your data subject understands that you are transferring his or her data internationally and the (potential) consequences of this international transfer, and unambiguously indicates his or her agreement, then the transfer is exempt from the data transfer prohibition.

Sounds good, doesn’t it – but hold your horses.  The lawyers among you will know that legislative exemptions are, by their nature, meant to be construed very narrowly.  That means they should not be your ‘first port of call’, but rather applied only when other, more protective, options have been exhausted.  (More on this below.)

Not only that, but the requirement for ‘unambiguous’ consent is a very high threshold to satisfy.  Simply ‘burying’ consent language within a privacy policy isn’t good enough – the consent needs to be presented much more prominently to data subjects (and who ever saw an international data transfers tick box on a website?)  Remember too that, in certain types of controller/data subject relationships (e.g. an employer/employee relationship), EU regulators generally consider it more-or-less impossible to obtain a valid consent.

Data export ‘solutions’

By contrast, model clauses and BCR are data export solutions, enabled by Art 26(2) of the Directive.  These solutions are designed to enable data transfers to non-EEA countries by virtue of ensuring “adequate safeguards with respect to the protection of the privacy and fundamental rights and freedoms of individuals”.

What this means is that, unlike consent, they are not simply legal exemptions that effectively disapply the ‘adequacy’ protection the law otherwise would provide individuals.  Rather, they work with this adequacy requirement by introducing controls into the relationship between the transferor and recipient to ensure that the data remains protected even when it is outside the EEA.

For this reason, data export solutions, like model clauses and BCR, should always be preferred over Art 26(1) legal exemptions, like consent.  Art 26(1) exemptions are meant to be applied only when model clauses and BCR are genuinely inappropriate or unavailable for the particular transfer at hand.

When you stop and think about this, it all makes sense.  For very obvious reasons, data protection regulators will always favor solutions that provide an ongoing protection for individuals’ data outside the EEA, as opposed to reliance on consent where the individual is effectively waiving ‘adequate’ protection.  That makes model clauses and BCR a much more robust basis for conducting data exports.

What the regulators say

You don’t have to take my word for it though.  Here’s what the Article 29 Working Party have to say on the issue in their “Working document on a common interpretation of Article 26(1) of Directive 95/46/EC of 24 October 1995” (available here):

Although the use of the derogations per se does not imply in all cases that the country of destination does not ensure an adequate level of protection, it does not ensure that it does either. As a consequence, for an individual whose data have been transferred, even if he has consented to the transfer, this might imply a total lack of protection in the recipient country, at least in the sense of the provisions of Article 25 or 26(2) of directive 95/46…

Furthermore, in the light of experience, the Working Party suggests that consent is unlikely to provide an adequate long-term framework for data controllers in cases of repeated or even structural transfers for the processing in question. Relying on consent may therefore prove to be a “false good solution”, simple at first glance but in reality complex and cumbersome…

If the level of protection in the third country is not adequate in the light of all the circumstances surrounding a data transfer, the data controller should consider Article 26(2), i.e., providing adequate safeguards through, for example, the standard contractual clauses or binding corporate rules. Only if this is truly not practical and/or feasible, then the data controller should consider using the derogations of Article 26(1). 

Further in line with this logic, the Working Party would recommend that the derogations of Article 26(1) of the Directive should preferably be applied to cases in which it would be genuinely inappropriate, maybe even impossible for the transfer to take place on the basis of Article 26(2). 

The Working Party would find it regrettable that a multinational company or a public authority would plan to make significant transfers of data to a third country without providing an appropriate framework for the transfer, when it has the practical means of providing such protection (e.g. a contract, BCR, a convention). 

It is in particular for this reason that the Working Party would recommend that transfers of personal data which might be qualified as repeated, mass or structural should, where possible, and precisely because of these characteristics of importance, be carried out within a specific legal framework (i.e. contracts or binding corporate rules).

And here’s the European Commission echoing that sentiment in their ‘Frequently Asked Questions Relating to Transfers of Personal Data from the EU/EEA to Third Countries’ (available here):

“…the derogations in Article 26(1) of the Directive should be interpreted restrictively and preferably be applied to cases in which it would be genuinely inappropriate, or even impossible, for the transfer to take place on the basis of Article 26(2), i.e., providing adequate safeguards through, for example, (the standard) contractual clauses or binding corporate rules. Only if this is truly not practical and/or feasible should the data controller consider using the derogations in Article 26(1).

This is the case in particular for transfers of personal data that might be described as repeated, mass or structural. These transfers should, where possible, and precisely because of their importance, be carried out within a specific legal framework (Article 25 or 26(2)). Only for instance when recourse to such a legal framework is impossible in practice can these mass or repeated transfers be legitimately carried out on the basis of Article 26(1).” (emphasis added)

Why this matters

Since the fall of Safe Harbor, many organizations have been scrambling to adopt new data export models to legitimize data transfers from the EU to the US, in particular.  Often consent is one of the first things they consider, under an understandable but ultimately mistaken belief that building some consent language into the privacy policy will fix the problem.  It doesn’t.

Ultimately, what this means for your organization is that, if your post-Safe Harbor data export strategy is built around reliance on consent, then you need to take a long, hard look at whether this really is an appropriate mechanism.  Often, it won’t be, in which case now is the time to take stock and consider moving to a strategy built around model clauses or BCR instead.

Getting to know the General Data Protection Regulation, Part 6 – Designing for compliance

Posted on January 5th, 2016 by


One of the changes due to be implemented under the new General Data Protection Regulation (“GDPR”) is the explicit recognition of the concepts of ‘privacy by design’ and ‘privacy by default’.  Businesses will now find themselves subject to a specific obligation to consider data privacy at the initial design stages of a project as well as throughout the lifecycle of the relevant data processing.

What does the law require today? 

The current EU Data Protection Directive (the “Directive”) has no concept of ‘privacy by design’ or ‘privacy by default’, nor is there an explicit obligation that states that privacy should be a paramount consideration at the design stage of any project. However, the Directive imposes an obligation on the data controller to implement appropriate technical and organisational measures to protect personal data against unlawful processing. By imposing a specific ‘privacy by design’ requirement, the GDPR expands the requirement to implement appropriate technical and organisational measures to ensure that privacy and the protection of data is no longer an after-thought.

Since we first saw the draft of the GDPR in 2012, ‘privacy by design’ has been the subject of discussions by many regulators in order to ensure that the concept achieves the desired effectiveness. For example, the UK’s ICO has already issued guidance on ‘privacy by design’ and encourages organisations to ensure that privacy and data protection is a key consideration in the early stages of any project, and then throughout its lifecycle.

What will the General Data Protection Regulation require?

While the concept of ‘privacy by design’ already exists, it has now been given specific recognition, and is linked to enforcement. Under the proposed ‘privacy by design’ requirement, companies will need to design compliant policies, procedures and systems at the outset of any product or process development.

When implementing appropriate technical and organisational measures in this context, regard should be given to the state of the art and the cost of implementation. In the nearly agreed, unofficial final drafts of the GDPR (referenced by our colleagues Phil Lee and Hazel Grant here), it looks as if the risk-based approach favoured by the Council has won the day. In deciding what measures are appropriate, businesses may also take account of the nature, scope, context and purposes of the processing, as well as the risks of varying likelihood and severity for the rights and freedoms of individuals. This approach will mean that businesses have greater flexibility to determine what compliance looks like in practice.

In making such a determination, businesses may need to consider matters such as whether a system which processes the personal data of customers/employees would, for example:

  • allow personal data to be collated with ease in order to comply with subject access requests;
  • allow suppression of data of customers who have objected to receiving direct marketing; or
  • allow the data controller to satisfy the data portability requirements of the GDPR.

Data controllers should also consider whether the relevant personal data can be pseudonymised – the latest unofficial draft of the GDPR makes specific reference to pseudonymisation as one example of a measure that is designed to integrate the necessary safeguards into the processing of personal data.

The GDPR also introduces a specific ‘privacy by default’ obligation. ‘Privacy by default’ requires that controllers implement appropriate technical and organisational measures to ensure that, by default, only personal data which are necessary for each specific purpose of the processing are processed. According to the latest unofficial draft of the GDPR, the effect of this requirement is that data controllers should minimise the amount of the data collected, the extent of their processing, the period of their storage and their accessibility.   The bottom line is that, by default, businesses should only process personal data to the extent necessary for their intended purposes and should not store it for longer than is necessary for these purposes. In particular, the data controller should ensure that, by default, personal data are not made available without the individual’s intervention to an indefinite number of people.
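To make the idea concrete, here is a purely illustrative sketch of what ‘privacy by default’ might look like in a product’s settings model. All names here are hypothetical; the point is simply that the most privacy-protective values are the defaults, and anything more permissive requires the individual’s intervention:

```python
from dataclasses import dataclass

@dataclass
class AccountSettings:
    """Hypothetical settings object with privacy-protective defaults."""
    # Only data necessary for the service is collected by default.
    collect_analytics: bool = False
    share_with_partners: bool = False
    # Data is not made available to others without the individual's intervention.
    profile_visibility: str = "private"

# A new account starts with the minimal, private configuration;
# the user must actively opt in to anything broader.
settings = AccountSettings()
```

The design choice worth noting is that the defaults are baked into the type itself, so no code path can create an account in a more permissive state by accident.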

While the current Directive contains requirements in relation to ensuring that excessive personal data is not processed and that personal data is retained only for as long as is necessary, the GDPR contains an explicit obligation to implement appropriate technical and organisational measures designed to meet these requirements.

What are the practical implications?

The explicit mention in the GDPR of the requirements of ‘privacy by design’ and ‘privacy by default’ will mean that businesses must implement internal processes and procedures to address these requirements.   Some practical steps that may be advisable include:

  • implementing a privacy impact assessment template that the business can populate each time it designs, procures or implements a new system;
  • revising standard contracts with data processors to set out how risk/liability will be apportioned between the parties in relation to the implementation of ‘privacy by design’ and ‘privacy by default’ requirements;
  • revisiting data collection forms/web-pages to ensure that excessive data is not collected;
  • having automated deletion processes for particular personal data, implementing technical measures to ensure that personal data is flagged for deletion after a particular period etc.
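The last of these practical steps, flagging personal data for deletion after a particular period, might be sketched along the following lines. The categories, retention periods and field names are all hypothetical; appropriate periods depend on the data in question and the applicable law:

```python
from datetime import datetime, timedelta

# Illustrative retention sweep: flag records whose (hypothetical)
# retention period has expired, so they can be deleted.
RETENTION_PERIODS = {
    "marketing_consent": timedelta(days=2 * 365),  # hypothetical period
    "job_applications": timedelta(days=180),       # hypothetical period
}

def flag_expired(records, now=None):
    """Yield records whose retention period has elapsed."""
    now = now or datetime.utcnow()
    for record in records:
        period = RETENTION_PERIODS.get(record["category"])
        if period and now - record["collected_at"] > period:
            yield record
```

In practice a job like this would run on a schedule and feed a deletion (or anonymisation) process, with the periods themselves documented in the organisation’s retention policy.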

The Directive is Dead (Almost)! Long live the GDPR!

Posted on December 15th, 2015 by

It’s here: after years and years of debate, the negotiating parties to the trilogue are reported finally to have agreed the text of the European Union’s successor privacy legislation: the General Data Protection Regulation.  Jan Albrecht, the German MEP leading the European Parliament’s negotiations on the GDPR, even tweeted this picture of the negotiators who struck today’s deal – somehow a fitting use of social media technology, given that the key driver behind this legislative change is to bring Europe’s aging data privacy rules up to date for the modern technological era.

This isn’t the formal end of the legislative process though – while the text of the GDPR has been agreed by the trilogue negotiation parties (and if you’re wondering what a trilogue is, see my colleague Olivier’s post here), it still has yet to be formally adopted by the European Parliament and Council.  This is very likely to be a rubber-stamping process taking place early in 2016 – only then will the GDPR actually become law.  When it does, the countdown clock will begin ticking down to the date when the GDPR comes fully into effect – two years after its adoption (so 2018).

The agreed text has not yet been made publicly available, even though near-final drafts of it have been leaked.  Rest assured, Fieldfisher’s Privacy, Security and Information team will be reporting as and when it is published, and in the meantime, you can find excellent analyses of the changes being brought in by the GDPR in our “Getting to know the GDPR” blog series posted on this blog – in particular:

1.  Getting to know the GDPR, Part 1 – You may be processing more personal information than you think

2.  Getting to know the GDPR, Part 2 – Out-of-scope today, in scope in the future. What is caught?

3.  Getting to know the General Data Protection Regulation, Part 3 – If you receive personal data from a third party, you may need to “re-think” your legal justification for processing it

4.  Getting to know the GDPR, Part 4 – “Souped-up” individual rights.

5.  Getting to know the GDPR, Part 5: Your big data analytics and profiling activities may be seriously curtailed

In a nutshell, what can you expect?  Well, the GDPR will usher in an era of greater accountability, with significantly increased transparency and controls for individuals to manage their data.  It will have a global effect, so that any business that collects and uses data from European citizens – whether established in the EU or not – will potentially find itself subject to EU data protection rules.

It will apply both to “controllers” and to “processors”, meaning service provider businesses (think the B2B cloud) that previously had not been directly subject to EU data protection compliance requirements will find themselves caught by the new rules.  And, of course, there is the headline grabbing news that non-compliant businesses risk fines of up to 4% of global turnover.

Finally, there is the good news that the patchwork quilt of 28 different EU Member States’ laws, all with their own quirks and kinks, will be replaced by a single, unifying data protection law, leading (hopefully) to significantly greater data protection harmonization throughout the EU – a “win-win” for consumers and businesses alike.  Data protection authorities must live up to this challenge of harmonization through the mechanics of the GDPR’s ‘one stop shop’ and consistency mechanism.

What a journey!  While there have been skeptics along the way (and I count myself among them), there’s no denying that this is an achievement of simply epic proportions and one that will define the future of Europe’s Digital Single Market, of data protection, and of our identities and rights as individuals, for decades to come.

For more information, see this European Commission press release here.

EU proposes new consumer rights for the return of data exchanged for digital content

Posted on December 10th, 2015 by

We’ve previously commented in some depth on the EU’s Digital Single Market proposals, most of which are currently out to consultation. The European Commission today set out new plans for two proposals under this DSM strategy to better protect consumers who shop online across the EU and help businesses expand their online sales. There’s more detail on the ecommerce issues at our sister Tech Blog.

The online context

In a nutshell, the EU is concerned that EU-based online consumers enjoy a variety of different online rights from country to country, and this significantly complicates compliance for eVendors. This creates real difficulties for any eVendor looking to address all EU markets with their services. In particular, there are no consistent consumer rights around the supply of “digital content” (a term not even recognized in the laws of some Member States).

The EU proposal for a Digital Content Directive

One of today’s proposals from the European Commission included a draft for a new Directive on the supply of digital content (e.g. streaming music, online games, apps or e-books (see text here)) (the “draft Directive“). We’re told the “proposals will tackle the main obstacles to cross-border e-commerce in the EU: legal fragmentation in the area of consumer contract law and resulting high costs for businesses – especially SMEs – and low consumer trust when buying online from another country.”

But what’s this got to do with “data”?

“That’s all ecommerce – this is a data privacy blog,” you say. Well, in today’s digital economy, information about individuals is often as valuable as money. Digital content is often supplied in exchange for the consumer giving access to personal or other data. In the draft Directive this is somewhat clumsily termed “use of the counter-performance other than money“. With this in mind, and with the desire to treat the exchange of data in the same way as the exchange of money, Articles 12 to 16 of the draft Directive address consumer rights in digital content contracts established in exchange for data.

An eVendor must cease data use upon contract termination

Importantly, under the draft Directive proposals, if an EU consumer has obtained digital content or a digital service in exchange for data or personal data, the new rules clarify that the eVendor should stop using that data if the contract is ended. What’s more, the eVendor should return it!

In cases of a lack of conformity with the contract, the consumer shall be entitled to have digital content they’ve “purchased” (or participated in the “use of the counter-performance other than money”!) brought into conformity with the contract free of charge. If this can’t be done (and subject to some other provisions I’ll spare you from here), the consumer may be entitled either to a proportionate reduction in price or to terminate the contract – Article 12.

There are similar proposals in the event termination rights are exercised in respect of digital content provided and then modified by the eVendor over a period of time. If a subsequent modification adversely impacts the access to, or use of the content, then the consumer has a termination right in certain prescribed circumstances – Article 15.

There are also similar termination rights proposed in respect of long-term contracts (lasting more than 12 months) – Article 16.

When a contract for digital content terminates

What’s more, in any of the above circumstances, where the consumer terminates the contract for digital content that has been entered into in exchange for data instead of money:

  • The eVendor “shall take all measures which could be expected” and cease use of (1) any data which the consumer has provided in exchange for the digital content; and (2) any other data collected by the eVendor in relation to the supply of the digital content (including any content provided by the consumer, but with the exception of content which has been generated jointly by the consumer and others who continue to make use of the content); and
  • The eVendor shall provide the consumer with technical means to “retrieve all content provided by the consumer and any other data produced or generated through the consumer’s use of the digital content to the extent that data has been retained by the eVendor“. What’s more, the consumer “shall be entitled to retrieve the content free of charge, without significant inconvenience, in reasonable time and in a commonly used data format unless this is impossible, disproportionate or unlawful“.
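As a purely illustrative sketch, with a hypothetical in-memory store standing in for a real system, the two obligations above amount to something like this: export the consumer’s data in a commonly used format, then cease using it. None of these names comes from the draft Directive itself:

```python
import json

def terminate_contract(store, consumer_id):
    """Export the consumer's data in a commonly used format, then cease use."""
    consumer_records = {key: record for key, record in store.items()
                        if record["consumer_id"] == consumer_id}
    # Hand the data back in a commonly used format (JSON, for illustration).
    export = json.dumps(consumer_records, indent=2)
    # Cease use of the data by removing it from the store.
    for key in consumer_records:
        del store[key]
    return export
```

Even this toy version hints at the engineering problem discussed below: it assumes every record is already tagged with the consumer it relates to, which in real systems (logs, analytics, derived datasets) is rarely the case.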

There is no distinction between personal data and other data, so the proposed rules are quite pervasive. The Recitals to the draft Directive state “[f]ulfilling the obligation to refrain from using data should mean in the case when the counter-performance consists of personal data, that the supplier should take all measures in order to comply with data protection rules by deleting it or rendering it anonymous in such a way that the consumer cannot be identified by any means likely reasonably to be used either by the supplier or by any other person.” This reads as a positive obligation to delete, and not purely a reactive step should the consumer request it.

This is HUGE! For any eVendor, isolating and stopping the use of discrete data sets relating to an individual consumer is hard enough. Designing and perfecting a mechanism to trace and then return any and all data sets specific to a customer is something else. This is a data identification and portability conundrum of extreme proportions. As above, the draft expressly applies to any data (and not just personal data).

In context, say I download a free eBook in return for my personal details and perhaps the completion of an online survey. The book reads well but, at chapter 7, I can no longer advance the pages and the eVendor cannot cure this despite my demands. As a consumer, I’ll have a right to terminate. At that point, the eVendor of the book must stop using my details and cease using the data from my survey. Additionally, all that data must be identified and returned! Thankfully, the eVendor would not have to identify and cease to use certain meta-data relating to how fast, when and on which devices I read the eBook (see below, as it seems that’s out of the draft Directive’s scope). If I’m honest, for a free eBook, I’m not sure I care about the return of my data (but an Austrian student with a good legal background and time on his or her hands will!).
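For the technically minded, the cease-use and retrieval obligations on termination can be sketched roughly in code. This is purely an illustration of the operational problem described above, not anything from the draft Directive: every name here (the record structure, the `active` flag, the export format) is a hypothetical assumption.

```python
from dataclasses import dataclass, field

@dataclass
class ConsumerRecord:
    consumer_id: str
    provided_data: dict = field(default_factory=dict)   # data the consumer actively provided (details, survey answers)
    generated_data: dict = field(default_factory=dict)  # data produced through the consumer's use of the content
    active: bool = True  # whether the eVendor may still use these data sets

def terminate_contract(store: dict, consumer_id: str) -> dict:
    """On termination: flag the consumer's data as no longer usable, and
    assemble an export in a commonly used format (here, a JSON-serialisable dict)."""
    record = store[consumer_id]
    record.active = False  # cease use of all data tied to this consumer
    return {
        "consumer_id": consumer_id,
        "provided": dict(record.provided_data),
        "generated": dict(record.generated_data),
    }

store = {"c1": ConsumerRecord("c1", {"email": "reader@example.com"}, {"survey": "done"})}
export = terminate_contract(store, "c1")
```

The hard part in practice is not this toy bookkeeping but the tracing: real systems rarely keep every consumer-linked data set behind a single key, which is exactly the “data identification and portability conundrum” noted above.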

When would the rules apply?

This is a first draft proposal and will undoubtedly be subject to intense lobbying and debate in the coming months. Even once passed, as a directive, it would take up to 24 months to incorporate the rules into the local law of Member States.

The accompanying impact assessment stressed that the draft Directive should, in particular, cover services which allow the creation, processing or storage of data. “While there are numerous ways for digital content to be supplied, such as transmission on a durable medium, downloading by consumers on their devices, web-streaming, allowing access to storage capabilities of digital content or access to the use of social media, this Directive should apply to all digital content independently of the medium used for its transmission“. The Directive does not cover services performed with a significant element of human intervention or contracts governing specific sectoral services such as healthcare, gambling or financial services.

For now, the draft Directive should apply only to contracts where the eVendor “requests and the consumer actively provides data, such as name and e-mail address or photos, directly or indirectly to the supplier for example through individual registration or on the basis of a contract which allows access to consumers’ photos“.

This Directive should not apply to situations where:

  • the eVendor “collects data necessary for the digital content to function in conformity with the contract, for example geographical location where necessary for a mobile application to function properly, or for the sole purpose of meeting legal requirements, for instance where the registration of the consumer is required for security and identification purposes by applicable laws”;
  • data collected is “strictly necessary for the performance of the contract or for meeting legal requirements and the supplier does not further process them in a way incompatible with this purpose“;
  • the eVendor collects information, “including personal data, such as the IP address, or other automatically generated information such as information collected and transmitted by a cookie, without the consumer actively supplying it, even if the consumer accepts the cookie“; and
  • the consumer is “exposed to advertisements exclusively in order to gain access to digital content“.
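The in-scope/out-of-scope rules above amount to a simple test. A very loose sketch follows, purely as an illustration; none of these parameter names come from the draft Directive:

```python
def within_draft_directive_scope(
    actively_provided: bool,
    necessary_for_contract_or_law: bool = False,
    passively_collected: bool = False,
    ad_exposure_only: bool = False,
) -> bool:
    """Data counts as 'counter-performance other than money' only if the
    consumer actively provides it, and none of the exclusions apply:
    strictly necessary for the contract or a legal requirement, collected
    passively (e.g. via cookies or IP addresses), or mere exposure to ads."""
    if not actively_provided:
        return False
    if necessary_for_contract_or_law or passively_collected or ad_exposure_only:
        return False
    return True
```

So a name and email address handed over at registration would be caught, while a geolocation reading needed for the app to function, or a cookie the consumer merely accepts, would not.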

What about other privacy rules (and presumably the GDPR)?

Article 3 of the draft Directive clarifies that in case of conflict between the Directive and another EU act, the other EU act takes precedence. In particular, it clarifies that the Directive is without prejudice to the rules on data protection.

In terms of general proposed scope, the draft Directive “covers the supply of all types of digital content“. It also covers “digital content supplied not only for a monetary payment but also in exchange for (personal and other) data provided by consumers, except where the data have been collected for the sole purpose of meeting legal requirements“.

You thought you had enough new law to deal with.

Mark Webber – Partner, Silicon Valley, California

Getting to know the GDPR, Part 5: Your big data analytics and profiling activities may be seriously curtailed

Posted on December 4th, 2015 by

What does the law require today? 

Currently, there is no legal definition of ‘profiling’ under European data protection law. Directive 95/46/EC refers to ‘automated individual decisions’ without explicitly mentioning the word ‘profiling’.

Article 15 of the Directive grants “the right to every person not to be subject to a decision which produces legal effects concerning him or significantly affects him and which is based solely on automated processing of data intended to evaluate certain personal aspects relating to him, such as his performance at work, creditworthiness, reliability, conduct, etc.”, unless such decision is:

  • taken in the course of entering into or performance of a contract; or
  • authorized by a law.

What will the General Data Protection Regulation (GDPR) require?

Geographical scope of the GDPR:

The scope of the GDPR is broader than the current Directive 95/46/EC because it will apply not only to controllers who are established in the EU, but also to controllers who are not established in the EU “where the processing activities are related to (…) the monitoring of their behaviour as far as their behaviour takes place within the European Union.” [Emphasis added] (Article 3 of the GDPR).

Recital 21 of the GDPR explains that “in order to determine whether a processing activity can be considered to ‘monitor the behaviour’ of data subjects, it should be ascertained whether individuals are tracked on the internet with data processing techniques which consist of profiling an individual, particularly in order to take decisions concerning her or him or for analysing or predicting her or his personal preferences, behaviours and attitudes.”

As a result, companies that are based outside the EU but nonetheless process personal data about EU residents in the context of profiling activities will be subject to the GDPR and, consequently, will have to comply with the rules on automated decision-making. This will have the effect of levelling the playing field for companies that carry out marketing activities in Europe, regardless of whether they are established within or outside Europe.

Material scope of the GDPR:

Under the GDPR, ‘profiling’ is defined as “… any form of automated processing of personal data consisting of using those data to evaluate personal aspects relating to a natural person, in particular to analyse and predict aspects concerning performance at work, economic situation, health, personal preferences, or interests, reliability or behaviour, location or movements” (Section 12(a) of Article 4 of the Council’s version).

‘Profiling’ is therefore composed of three elements:

  • it has to be an automated form of processing;
  • it has to be carried out on personal data; and
  • the purpose of the profiling must be to evaluate personal aspects about a natural person.

There is no general prohibition of ‘profiling’ activities under the GDPR. As currently drafted, Article 20 of the GDPR states, in similar wording to Article 15 of Directive 95/46/EC:

“The data subject shall have the right not to be subject to a decision (…) based solely on automated processing, including profiling, which produces legal effects concerning him or her or significantly affects him or her.”

The GDPR therefore sets out three criteria which may trigger the restrictions on automated processing of personal data (of which profiling forms part), namely:

  • a decision has to be made about an individual;
  • which has a legal effect for that individual or significantly affects him or her; and
  • this decision must be based solely on automated processing.

If those three criteria are met, then such automated processing would normally be prohibited, unless one of the following conditions applies:

  • A law or regulation within a Member State to which the controller is subject authorizes the profiling activity; or
  • The profiling activity is necessary for the purpose of entering into (or performing) a contract with the individual concerned; or
  • The individual concerned has given his/her explicit consent to use his/her personal data for profiling purposes.
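Taken together, the trigger criteria and exceptions above form a two-stage test. The following sketch is only an illustration of that logic, with hypothetical parameter names of my own choosing, not anything drawn from the GDPR text itself:

```python
def automated_decision_prohibited(
    decision_made: bool,
    legal_or_significant_effect: bool,
    solely_automated: bool,
    authorised_by_law: bool = False,
    necessary_for_contract: bool = False,
    explicit_consent: bool = False,
) -> bool:
    """Return True if an automated decision (including profiling) would be
    prohibited under the Article 20 restrictions as summarised above."""
    # Stage 1: all three trigger criteria must be met
    triggered = decision_made and legal_or_significant_effect and solely_automated
    if not triggered:
        return False
    # Stage 2: any one of the three exceptions lifts the prohibition
    exempt = authorised_by_law or necessary_for_contract or explicit_consent
    return not exempt
```

So purely automated credit scoring with no consent, contract or legal basis would fall foul of the restriction, while the same scoring done with the individual’s explicit consent would not (subject to the “suitable measures” discussed below).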

Where the profiling is based on a contractual relationship with the data subject or the data subject’s explicit consent, the controller must implement “suitable measures” to safeguard the rights of the individuals. In particular, the controller must allow for human intervention, the right for individuals to express their point of view, the right to obtain further information about the decision reached on the basis of the automated processing, and the right to contest that decision. Data controllers must also inform individuals specifically about “the existence of automated decision making including profiling and information concerning the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject” (Article 14(a) of the GDPR).

Finally, the GDPR explicitly prohibits the use of an individual’s sensitive personal data for profiling purposes, unless:

  • that individual has given his/her explicit consent (except where a law provides that such prohibition cannot be lifted by the individual’s consent); or
  • such profiling is necessary for reasons of public interest.

What are the practical implications?

Article 20 of the GDPR lays out restrictions on automated decision-making similar to those that currently exist under Article 15 of Directive 95/46/EC. However, the GDPR does make several important changes, including:

  • a specific definition for the term ‘profiling’;
  • explicit consent as a new legal basis for profiling activities;
  • a prohibition to profile individuals based on their sensitive data (unless explicit consent is obtained); and
  • an obligation to inform the data subjects specifically about any profiling activities.

Companies will therefore have to assess whether their intended profiling activities produce any legal effects or significantly affect the individuals concerned, and on that basis determine the legal ground on which they may carry out those activities (i.e., a law, an existing contract with the data subject or the data subject’s prior explicit consent). Regrettably, the GDPR does not explain what constitutes a “legal effect” or “significantly affects” the individual, and therefore, we are likely to see variations in interpretation of these legal concepts by the national data protection authorities and the courts.

Once companies have assessed that their profiling activities are lawful, they must ensure that they have implemented appropriate measures to guarantee that individuals can exercise their rights (for example, the right not to be subject to profiling activities) and to ensure that their profiling activities remain within the remit of the law. This can be done by using certain techniques, such as data minimisation and pseudonymisation, to minimize the risk of affecting the privacy of individuals, and by carrying out privacy impact assessments prior to conducting their profiling activities, particularly if there is a risk of discrimination, identity theft or fraud, financial loss, damage to reputation, or other adverse effects for individuals.


By Olivier Proust, Of Counsel

Getting to know the GDPR, Part 4 – “Souped-up” individual rights.

Posted on November 27th, 2015 by

A key area of proposed change under the General Data Protection Regulation (“GDPR“) relates to individual rights. The proposal is both to refresh individuals’ existing rights, by clarifying and extending them, and to introduce new rights. Most notably, the GDPR creates two new rights: the (in)famous “right to be forgotten” and the right to data portability.

What does the law require today? 

Currently, individuals have the following rights under the Data Protection Directive:

  • The right to be provided with fair processing information – this right requires the data controller to provide individuals with certain minimum information regarding the processing of their personal data;
  • The right of access – this right permits individuals to query the data controller as to whether personal data related to them are being processed. Upon request, the data controller must also provide a copy of any such personal data. This copy must be provided without excessive delay and may be subject to payment of a small fee;
  • The right to object to the processing of their data – this right applies in certain limited circumstances prescribed by the Data Protection Directive;
  • The right to rectification, erasure or blocking of data – this right can only be exercised when the processing is not in compliance with the Data Protection Directive;
  • The right not to be subjected to solely automated processes – this right applies where such processes evaluate the individual’s personal attributes, resulting in a decision that significantly affects him or has legal consequences for him.

What will the General Data Protection Regulation require?

Proposed extension of existing rights

Most proposed modifications to the existing rights bring clarity without substantially extending them.

  • The right to be provided with fair processing information will be expanded. The bottom line is that the data controller will need to provide more detailed information, such as the source of the data and the retention period. In addition, the GDPR requires this information to be provided in an intelligible form, using clear and plain language that is adapted for the individual. The practical effect of this requirement is that policies will need to be drafted differently depending on whether they are aimed at children or adults.
  • Regarding the right of access, under the GDPR proposals, data controllers will be required to provide additional information to individuals (e.g. storage period of the data). The proposed new requirements are somewhat more burdensome for businesses – in particular, businesses will need to set up a specific process in order to deal with access requests. Moreover, unless the request is “manifestly excessive“, data controllers will in principle be obliged to provide the information free of charge.
  • The rectification right is mostly the same and the changes will have very limited practical impact.
  • More significantly, the right to object is now broader as, when the processing is based on the legitimate interests of the controller or is undertaken for direct marketing purposes, the individual can object without having to provide specific justifications.

Proposed new rights

A controversial right from the start, the proposed right to be forgotten was influenced by the CJEU’s decision in the Google Spain v. Costeja case. Following this ruling, the Parliament proposed renaming it the “right to erasure” and the Council has proposed dropping the obligation on the data controller to ensure third parties also erase the data. It therefore remains to be seen what form this right will take in the finalised GDPR.

It is likely though that this right would apply in one of the following scenarios:

  • The data are no longer needed for the original purpose;
  • The data subject has withdrawn his/her consent and there are no other grounds for the processing of the data;
  • The data subject has objected to the processing;
  • A court order requiring the erasure of the data has been issued; or
  • The processing is unlawful.
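The likely erasure scenarios reduce to an any-of check. A minimal sketch, with ground names invented purely for illustration (they do not appear in any draft text):

```python
# Hypothetical labels for the scenarios in which the 'right to be
# forgotten' / 'right to erasure' would likely apply.
ERASURE_GROUNDS = {
    "purpose_expired",       # data no longer needed for the original purpose
    "consent_withdrawn",     # and no other ground for the processing exists
    "objection",             # data subject has objected to the processing
    "court_order",           # a court has ordered erasure
    "unlawful_processing",   # the processing is unlawful
}

def erasure_right_applies(circumstances: set) -> bool:
    """True if any of the likely erasure scenarios is present."""
    return bool(circumstances & ERASURE_GROUNDS)
```

A single matching ground suffices; a data subject who has withdrawn consent (with no other processing ground) need not also show, say, unlawful processing.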

Another proposed new right is the right of data portability. This right was created in order to improve the interoperability of data processing. The proposal of the Commission puts a heavy burden on the data controller as it imposes a requirement to provide personal data to the data subject in a commonly used format. The rationale behind the proposal is to facilitate the ease of transfer of personal data from one data controller to another.

The Parliament suggests that data portability should not be a right but rather that data controllers should be encouraged to promote interoperability, whereas the Council is of the view that this right should apply only to cases where the data subject has transmitted the relevant personal data to the data controller.

As things stand, we will have to wait to see what form this right will take or whether it will be scrapped in favour of some form of encouragement for data controllers to provide data in a commonly used format.

As regards other new rights, both the European Parliament and Council have proposed a definition of profiling as a form of automated processing. One key departure from the Data Protection Directive vis-à-vis such automated processing is that explicit consent is likely to be required for profiling which produces a legal effect or significantly affects an individual. This topic will be discussed in further detail in our next blog on the GDPR.

What are the practical implications?

  • All businesses will have to update and revamp their privacy policies and data protection notices to make sure that the extended rights are properly addressed. Businesses should check that the data protection notices that they provide to individuals contain all the required information.
  • Businesses will need to assess whether they should put in place new or updated processes and procedures to deal with the practical implications of the extended rights, e.g. a specific data procedure for dealing with access requests.
  • Finally, the right to be forgotten (and the right of portability) may require changes to companies’ operational processes and IT systems, depending on what these rights will look like in their final form.


Can you amend EU model clauses?

Posted on November 17th, 2015 by

Since the fall of Safe Harbor, there’s been a wave of data export conservatism that’s spread across Europe – ranging from EU data protection authorities casting doubt on the longevity of other data export solutions, through to EU customers delaying (or, in some cases, even cancelling) deals with US counter-parties over data export concerns.

Reports that Safe Harbor 2.0 may be on its way have done little to allay these woes because, whatever the optimism of the political parties involved in these discussions, the fact remains that any new framework adopted will face significant adoption challenges.  For a start, existing Safe Harbor companies will almost certainly need to re-certify under the new framework (possibly with greater checks and balances by way of third-party audit).  Certain DPAs around the EU will remain highly skeptical of – and so likely inclined to investigate – any transfers made under revised US-EU Safe Harbor arrangements.  And many EU customers, ‘once bitten, twice shy’ after the current Safe Harbor’s collapse, will be reluctant to move away from solutions they see as more ‘tried and trusted’, i.e. model clauses.

So, rightly or wrongly, that means for the short- to mid-term model clauses are likely to remain the solution of choice for many companies engaging in global data exports, whether intra-group or to US (or wider international) suppliers.  Certainly, this has been my personal experience to date – virtually every EU-US deal I’ve been engaged on in recent weeks has been dominated by discussions concerning the need for model clauses.

The problem with model clauses

While they are probably the only immediately viable legal solution for data exports right now, it’s no secret that model clauses – especially the 2010 controller-to-processor model clauses – suffer from significant problems: namely, the potential for on-premise audits, consent and contractual flow-down requirements when appointing sub-processors, and an absence of liability limitation provisions.  In a one-off arrangement between just two parties, these obstacles might be surmountable and a commercially-acceptable risk; in a cloud-based environment where the supplier hosts its solution on third party infrastructure with vendors who won’t negotiate their terms and where it provides a multi-tenanted, uniform offering across all customers, it’s a very significant problem.  The accrued risk is potentially huge.

Simple example: imagine a US supplier has 5,000 EU customers, and at any one time 1% of those decide to exercise on-premise audit rights under the 2010 model clauses (e.g. in the wake of a data incident).  Suddenly, the supplier finds itself managing 50 simultaneous on-premise audits, a significant business disruption and threat to the security of the data it hosts.  Or, imagine instead, that 10% of its EU customers insist on case-by-case consents every time the supplier wishes to appoint a new sub-processor (which may be something as simple as another group company providing technical support to EU customers) – this means approaching 500 customers for consent.  What if one (or more) refuse?

So can you amend the model clauses?

Bearing the above in mind, it should be no surprise that suppliers, when asked to sign model clauses, will often seek to amend their more onerous provisions, either by way of a side agreement or directly within the model clauses themselves.  But, when they do, they’re often met with a very blunt response: “You can’t amend the model clauses!”

Having encountered this argument many times when negotiating on behalf of internationally-based suppliers, I want to set the record straight on this point.  You absolutely can amend the model clauses, provided your terms are purely commercial in nature and do not impact the protection of the data, nor the rights of data subjects or supervisory authorities.

If you’re not convinced you can amend the model clauses, then see Clause 10 of the 2010 Controller-to-Processor Model Clauses: “The parties undertake not to vary or modify the Clauses. This does not preclude the parties from adding clauses on business related issues where required as long as they do not contradict the Clause.” (emphasis added).  In fact, as if to emphasize the point, the 2010 Model Clauses even include an “illustrative” and “optional” indemnification clause.

Similar language exists in the 2004 Controller-to-Controller Model Clauses too at Clause VII: “The parties may not modify these clauses except to update any information in Annex B, in which case they will inform the authority where required. This does not preclude the parties from adding additional commercial clauses where required.” (emphasis added).  (In the interests of completeness, the original 2001 Controller-To-Controller Model Clauses do not expressly permit the addition of commercial clauses, which is as good a reason as any to avoid using them.)

And, if that weren’t enough, even the Article 29 Working Party has weighed in on this issue with its FAQs on the 2010 Model Clauses: “7) Is it possible to add commercial clauses to the Model Clauses?  As clearly stated in clause 10, parties must not vary or modify the Model Clauses, but this shall not prevent the parties from adding clauses on business-related issues where required, as long as they do not contradict the Model Clauses.”

Should you amend the model clauses?

First things first, if you want to amend the model clauses, it’s very important you do so in a considered way that is respectful of the rights the model clauses aim to protect.  Don’t go doing things like removing third party beneficiary rights owed to data subjects or flat out refusing audit rights – that cuts right to the heart of the protections that the model clauses are intended to provide and will never ever be acceptable, either to counter-parties or to supervisory authorities.

Any amendments you make should be purely commercial in nature, or intended to explain how some of the model clause rights should work in practice.  For example, you might choose to limit the liability between the two parties to the model clauses (but not the data subjects!) by reference to liability caps agreed within a master services agreement between the parties.  Alternatively, you might seek a general, upfront consent from the EU data exporter to the data importer’s appointment of sub-processors, provided the appointed sub-processors fulfill the requirements of the model clauses.  Or you might seek to explain how the EU data exporter can exercise its model clause audit rights against the data importer in practice – for example, through reliance on the data importer’s independent third party audit certifications or written responses to audit questionnaires etc.

As a final consideration, if you do amend the model clauses, be aware that this may trigger regulatory notification or authorization requirements in some Member States.  This doesn’t mean you can’t amend them, but it is a factor to investigate and bear in mind before doing so.

When doing so, ask yourself this question: Is it better to sign model clauses that you know you (or your supplier) will be unable to comply with for legitimate practical reasons, simply to ease any regulatory notification requirements?  Or is contractual honesty between two parties, knowing that they will comply in full with the terms they agree, the better approach, even if this may carry some additional regulatory requirements?

Time for US businesses to consider an anti-surveillance pledge?

Posted on October 23rd, 2015 by

Breakdown of trust is a terrible thing that often has negative and unpredictable consequences, not just for those directly involved but also for those inadvertently caught up in the ensuing fall-out: for the friends who are forced to choose sides when a relationship breaks up, for the children affected when a marriage breaks down and, yes, for the businesses harmed when transatlantic trust between two great economic regions falls apart.

Because, when all is said and done, the recent collapse of Safe Harbor is ultimately attributable to a breakdown in trust.  Whatever legal arguments there are about data export “adequacy”, Europe has fundamentally lost trust in the safe handling of European citizens’ data Stateside.  The resulting panic was inevitable – international conglomerates worry about their regulatory compliance, US supply-side businesses realize that there is now no effective legal solution for their lawful handling of data, and regulators seek to calm nerves by announcing, in somewhat threatening tones, that they will not take enforcement action – for the time being.

Which leaves us all in a quandary.  Businesses must by necessity start putting in place a patchwork of legal solutions designed, if not to achieve compliance, then at least to manage risk, but many of these solutions will not be officially recognized either by law or the regulatory community (how exactly should US processors lawfully onward transfer data to sub-processors?).  Consequently, these solutions – while necessary in an environment where no alternatives exist – will likely fuel further legislative and regulatory speculation that companies are working around data protection rules, rather than with them.

But when compliance becomes impossible, so everyone becomes a criminal.  Think of it this way:  if you tax me at 40%, I will pay.  But tax me at 90% and I simply can’t afford to, so won’t – no matter how much I may believe in the principle of taxation or want to be a law-abiding member of society.

An anti-surveillance pledge to restore trust

So where does that leave us?  The real dialogue to have here is one around restoring trust.  This is absolutely critical.  And that is why all businesses – especially US businesses right now – must consider taking an anti-surveillance pledge.

What does an anti-surveillance pledge look like?  It takes the form of a short statement, perhaps no more than two or three paragraphs in length, under which the business would pledge never knowingly to disclose individuals’ data to government or law enforcement authorities unless either (1) legally compelled to do so (for example, by way of a warrant or court order), or (2) there is a risk of serious and imminent harm were disclosure to be withheld (for example, an imminent terrorist threat).  The pledge would be signed by senior management of the business, and made publicly available as an externally-facing commitment to resist unlawful government-led surveillance activities – for example, by posting it on a website or incorporating it within an accessible privacy policy.

Will taking a pledge like this solve the EU-US data export crisis?  No.  Will it prevent government surveillance activities occurring upstream on Internet and telecoms pipes over which the business has no control?  No.  But will it demonstrate a commitment to the world that the business takes its data subjects’ privacy concerns seriously and that it will do what is within its power to do to prevent unlawful surveillance – absolutely: it’s a big step towards accountably showing “adequate” handling of data.

The more businesses that sign a pledge of this nature, the greater the collective strength of these commitments across industries and sectors; and the greater this collective strength, the more this will assist the long, slow process of restoring trust.  Only through the restoration of trust will we see a European legislative and regulatory environment once more willing to embrace the adequacy of data exports to the US.  So, if you haven’t considered it before, consider it now: it’s time for an anti-surveillance pledge.


Getting to know the GDPR, Part 2 – Out-of-scope today, in scope in the future. What is caught?

Posted on October 20th, 2015 by

The GDPR expands the scope of application of EU data protection law requirements in two main respects:

  1. in addition to data “controllers” (i.e. persons who determine why and how personal data are processed), certain requirements will apply for the first time directly to data “processors” (i.e. persons who process personal data on behalf of a data controller); and
  2. by expanding the territorial scope of application of EU data protection law to capture not only the processing of personal data by a controller or a processor established in the EU, but also any processing of personal data of data subjects residing in the EU, where the processing relates to the offering of goods or services to them, or the monitoring of their behaviour.


The practical effect is that many organisations that were to date outside the scope of application of EU data protection law will now be directly subject to its requirements, for instance because they are EU-based processors or non-EU-based controllers who target services to EU residents (e.g. through a website) or monitor their behaviour (e.g. through cookies). For such organisations, the GDPR will introduce a cultural change and there will be more distance to cover to get to a compliance-ready status.

What does the law require today?

The Directive

At present, the Data Protection Directive 95/46/EC (“Directive”) generally sets out direct statutory obligations for controllers, but not for processors. Processors are generally only subject to the obligations that the controller imposes on them by contract. By way of example, in a service provision scenario, say a cloud hosting service, the customer will typically be a controller and the service provider will be a processor.

Furthermore, at present the national data protection law of one or more EU Member States applies if:

  1. the processing is carried out in the context of the activities of an establishment of the controller on the territory of the Member State. When the same controller is established on the territory of several Member States, each of these establishments should comply with the obligations laid down by the applicable national law (Article 4(1)(a)); or
  2. the controller is not established on EU territory and, for the purposes of processing personal data, makes use of equipment situated on the territory of a Member State (unless such equipment is used only for purposes of transit through the EU) (Article 4(1)(c)); or
  3. the controller is not established on the Member State’s territory, but in a place where its national law applies by virtue of international public law (Article 4(1)(b)). Article 4(1)(b) has little practical significance in the commercial and business contexts and is therefore not further examined here. The GDPR sets out a similar rule.

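The three limbs above can be reduced to a simple decision structure. The sketch below is purely illustrative and not legal advice: the function and parameter names are my own, and a real applicability analysis is far more fact-sensitive than any boolean test.

```python
# Illustrative sketch only: a highly simplified rendering of the
# applicability limbs in Article 4(1) of Directive 95/46/EC.
# Names and inputs are the author's own simplification.

def directive_applies(established_in_member_state: bool,
                      uses_equipment_in_member_state: bool,
                      equipment_only_for_transit: bool) -> bool:
    # Article 4(1)(a): processing in the context of the activities of an
    # establishment of the controller in a Member State
    if established_in_member_state:
        return True
    # Article 4(1)(c): non-EU controller making use of equipment situated
    # in a Member State, unless used only for transit through the EU
    if uses_equipment_in_member_state and not equipment_only_for_transit:
        return True
    # Article 4(1)(b) (international public law) omitted, as above
    return False
```

Note how the transit carve-out only matters once the equipment limb is engaged; an EU-established controller is caught regardless.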

CJEU case law

Two recent judgments of the Court of Justice of the European Union (“CJEU”) have introduced expansive interpretations of the meaning of “in the context of the activities” and “establishment”:

  1. In Google Spain, the CJEU held that “in the context of the activities” does not mean “carried out by”. The data processing activities by Google Inc are “inextricably linked” with Google Spain’s activities concerning the promotion, facilitation and sale of advertising space. Consequently, processing is carried out “in the context of the activities” of a controller’s branch or subsidiary when the latter is (i) intended to promote and sell ad space offered by the controller, and (ii) orientates its activity towards the inhabitants of that Member State.
  2. In Weltimmo, the CJEU held that the definition of “establishment” is flexible and departs from a formalistic approach under which an “establishment” exists solely where a company is registered. The specific nature of the economic activities and the provision of services concerned must be taken into account, particularly where services are offered exclusively over the internet. The presence of only one representative who acts with a sufficient degree of stability (even if the activity is minimal), coupled with websites that are mainly or entirely directed at that EU Member State, suffices to trigger the application of that Member State’s law.


What will the GDPR require?

The GDPR will apply to the processing of personal data:

  1. in the context of the activities of an establishment of a controller or a processor in the EU; and
  2. of data subjects residing in the EU by a controller not established in the EU, where the processing activities are related to the offering of goods or services to them, or the monitoring of their behaviour in the EU.

It is irrelevant whether the actual data processing takes place within the EU or not.
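The two limbs of the GDPR’s territorial scope, as described above, can be sketched in the same illustrative fashion. Again, this is a simplification for orientation only (names are my own), not a substitute for a proper scoping analysis:

```python
# Illustrative sketch only: simplified GDPR territorial-scope test.
# Where the actual data processing takes place is irrelevant, so it
# does not appear as an input at all.

def gdpr_applies(establishment_in_eu: bool,
                 data_subjects_in_eu: bool,
                 offers_goods_or_services_to_eu: bool,
                 monitors_eu_behaviour: bool) -> bool:
    # Limb 1: processing in the context of the activities of an EU
    # establishment of a controller OR a processor
    if establishment_in_eu:
        return True
    # Limb 2: non-EU controller, where the processing relates to
    # offering goods or services to EU data subjects, or monitoring
    # their behaviour in the EU
    if data_subjects_in_eu and (offers_goods_or_services_to_eu
                                or monitors_eu_behaviour):
        return True
    return False
```

Compared with the Directive sketch earlier, the “equipment” limb is gone and is replaced by the targeting/monitoring limb, which is what pulls many non-EU online businesses into scope.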

As far as the substantive requirements are concerned, compared to the Directive, the GDPR introduces:

  1. new obligations and higher expectations of compliance for controllers, for instance around transparency, consent, accountability, privacy by design, privacy by default, data protection impact assessments, data breach notification, new rights of data subjects, engaging data processors and data processing agreements;
  2. for the first time, direct statutory obligations for processors, for instance around accountability, engaging sub-processors, data security and data breach notification; and
  3. severe sanctions for compliance failures.


What are the practical implications?

Controllers who are established in the EU are already caught by EU data protection law, and will therefore not be materially affected by the broader scope of application of the GDPR. For such controllers, the major change is the new substantive requirements they need to comply with.

Processors (such as technology vendors or other service providers) established in the EU will be subject to the GDPR’s direct statutory obligations for processors, as opposed to just the obligations imposed on them by contract by the controller. Such processors will need to understand their statutory obligations and take the necessary steps to comply. This is a major “cultural” change.

Perhaps the biggest change is that controllers who are not established in the EU but collect and process data on EU residents through websites, cookies and other remote activities are likely to fall within the scope of the GDPR. E-commerce providers, online behavioural advertising networks and analytics companies that process personal data are all likely to be caught.

We still have at least 2 years before the GDPR comes into force. This may sound like a long time, but given the breadth and depth of change in the substantive requirements, it isn’t really! A lot of fact finding, careful thinking, planning and operational implementation will be required to be GDPR ready in 24 months.

So what should you be doing now?

  1. If you are a controller established in the EU, prepare your plan for transitioning to compliance with the GDPR.
  2. If you are a controller not established in the EU, assess whether your online activities amount to offering goods or services to, or monitoring the behaviour of, EU residents. If so, assess the level of awareness of / readiness for compliance with EU data protection law and create a road map for transitioning to compliance with the GDPR. You may need to appoint a representative in the EU.
  3. Assess whether any of your EU-based group companies act as processors. If so, assess the level of awareness of / readiness for compliance with EU data protection law and create a road map for transitioning to compliance with the GDPR.
  4. If you are a multinational business with EU and non-EU affiliates which will or may be caught by the GDPR, you will also need to consider intra-group relationships, how you position your group companies and how you structure your intra-group data transfers.