Archive for the ‘Accountability’ Category

Subject access requests and data retention: two sides of the same coin?

Posted on October 3rd, 2014 by



Over the past year or so, there’s been a decided upswing in the number of subject access requests made by individuals to organizations that crunch their data.  There are a number of reasons for this, but the trend is principally driven by greater public awareness of privacy rights in the post-Snowden era and by the recent Google “Right to be Forgotten” decision.

If you’re unfamiliar with the term “subject access request”, then in simple terms it’s a right enshrined in EU law for an individual to contact an organization and ask it (a) whether it processes any personal information about the individual in question, and (b) if so, to supply a copy of that information.

A subject access request is a powerful transparency tool for individuals: the recipient organization has to provide the requested information within a time period specified by law, and very few exemptions apply.  However, these requests often prove disproportionately costly and time-consuming for the organizations that receive them – think about how much data your organization holds, and then ask yourself how easy it would be to pull all that data together to respond to these types of requests.  Imagine, for example, all the data held in your CRM databases, customer support records, IT access logs, CCTV footage, HR files, building access records, payroll databases, e-mail systems, third party vendors and so on – picture that, and you get the idea.

In addition, while many subject access requests are driven by a sincere desire for data processing transparency, some (inevitably) are made with legal mischief in mind – for example, the disgruntled former employee who makes a subject access request as part of a fishing expedition to try to find grounds for bringing an unfair dismissal claim, or the representative from a competitor business looking for grounds to complain about the recipient organization’s data compliance.  Because of these risks, organizations are often hesitant about responding to subject access requests in case doing so attracts other, unforeseen and unknown, liabilities.

But, if you’re a data controlling business facing this conundrum, don’t expect any regulatory sympathy.  Regulators can only enforce the law as it stands today, and the law expects prompt, comprehensive disclosure.  Not only that, but the fact that subject access requests prove costly and resource-intensive to address serves a wider regulatory goal: namely, applying pressure on organizations to reduce the amount of data they hold, consistent with the data protection principle of “data minimization”.

Therefore, considering that data storage is becoming cheaper all the time and that, in a world of Big Data, data collection is growing at an exponential rate, subject access becomes one of the most important – if not the most important – tools regulators have for encouraging businesses to minimize the data they retain.  The more data you hold, the more data you have to disclose in response to a subject access request – and the more costly and difficult that is to do.  This, in turn, makes adopting a carefully thought-out data retention policy much more attractive, whatever other business pressures there may be to keep data indefinitely.  Retain data for just a year or two, and there’ll be an awful lot less you need to disclose in response to a subject access request.  At the same time, your organization will enhance its overall data protection compliance.

So what does all this mean?  When considering your strategy for responding to subject access requests, don’t consider it in isolation; think also about how it dovetails with your data retention strategy.  If you’re an in-house counsel or CPO struggling to get business stakeholder buy-in to adopt a comprehensive data retention strategy, use subject access risk as a means of achieving this internal buy-in.  The more robust your data retention policies, the more readily you’ll be able to fulfill subject access requests within the timescales permitted by law and with less effort, reducing complaints and enhancing compliance.  Conversely, with weaker (or non-existent) data retention policies, your exposure will be that much greater.

Subject access and data retention are therefore really just two sides of the same coin – and you wouldn’t base your compliance on just a coin toss, would you?

Processor BCR have a bright future

Posted on July 8th, 2014 by



Last month, the Article 29 Working Party sent a letter to the President of the European Parliament about the future of Binding Corporate Rules for processors (BCR-P) in the context of the EU’s ongoing data privacy legislative reform.

The letter illustrates the clear support that BCR-P have – and will continue to have – from the Working Party.  Whilst perhaps not surprising, given that the Working Party originally “invented” BCR-P in 2012 (having initially invented controller BCR way back in 2003), the letter affirms the importance of BCR-P in today’s global data economy.

“Currently, BCR-P offer a high level of protection for the international transfers of personal data to processors” writes Isabelle Falque-Pierrotin, Chair of the Working Party, before adding that they are “an optimal solution to promote the European principles of personal data abroad.” (emphasis added)

As if that weren’t enough, the letter also issues a thinly-veiled warning to the European Parliament, which has previously expressed skepticism about BCR-P: “denying the possibility for BCR-P will limit the choice of organisations to use model clauses or to apply the Safe Harbor if possible, which do not contain such accountability mechanisms to ensure compliance as it is provided for in BCR-P.”

The Working Party’s letter notes that 3 companies have so far achieved BCR-P (and we successfully acted on one of those – see here) with a further 10 applications on the go (and, yes, we’re acting on a few of those too).

Taking the helicopter view, the Working Party’s letter is representative of a growing trend for global organizations to seek BCR approval in preference to other data export solutions: back in 2012, just 19 organizations had secured controller BCR approval; two years later, that figure stands at 53 (covering both controller and processor BCR).

There are several reasons why this is the case:

1.  BCR are getting express legislative recognition:  The Commission’s draft General Data Protection Regulation expressly recognises BCR, including BCR-P, as a valid legal solution to the EU’s strict data export rules.  To date, BCR have had only regulatory recognition, and then not consistently across all Member States, casting a slight shadow over their longer-term future.  Express legislative recognition secures the future of BCR – they’re here to stay.

2.  Safe harbor is under increasing strain:  The ongoing US/EU safe harbor reform discussions, while inching towards a slow conclusion, have arguably stained the framework’s reputation irreparably.  US service providers that rely on safe harbor to export customer data to the US (and sometimes beyond) find themselves stuck in deal negotiations with customers who refuse to contract with them unless they implement a different data export solution.  Faced with the prospect of endless model clauses or a one-off BCR-P approval, many opt for BCR-P.

3.  BCRs have entered the customer lexicon:  If you’d said the letters “B C R” even a couple of years ago, then outside of the privacy community only a handful of well-educated organizations would have known what you were talking about.  Today, customers are much better informed about BCR and increasingly view BCR as a form of trust mark (which, of course, they are), encouraging the service sector to adopt BCR-P as a competitive measure.

4.  BCRs are simpler than ever before:  Gone are the days when a BCR application took 4 years and involved traveling all over Europe to visit data protection authorities.  Today, a well-planned and executed BCR application can be achieved in a period of 12 – 18 months, all managed through a single lead data protection authority.  The simplification of the BCR approval process has been instrumental in increasing BCR adoption.

So if you’re weighing up the pros and cons of BCR against other data export solutions, then deliberate no longer: BCR, and especially BCR-P, will only grow in importance as the EU’s data export regime gets ever tougher.

 

FTC in largest-ever Safe Harbor enforcement action

Posted on January 22nd, 2014 by



Yesterday, the Federal Trade Commission (“FTC“) announced that it had agreed to settle with 12 US businesses over alleged breaches of the US Safe Harbor framework. The companies involved came from a variety of industries and each handled a large amount of consumer data. But aside from the surprisingly large number of companies involved, what does this announcement really tell us about the state of Safe Harbor?

This latest action suggests that the FTC is ramping up its Safe Harbor enforcement in response to recent criticisms from the European Commission and European Parliament about the integrity of Safe Harbor (see here and here) – particularly given that one of the main criticisms about the framework was its historic lack of rigorous enforcement.

Background to the current enforcement

So what did the companies in question do? The FTC’s complaints allege that the companies involved ‘deceptively claimed they held current certifications under the U.S.-EU Safe Harbor framework’. Although participation in the framework is voluntary, if you publicise that you are Safe Harbor certified then you must, of course, maintain an up-to-date Safe Harbor registration with the US Department of Commerce and comply with your Safe Harbor commitments.

Key compliance takeaways

In this instance, the FTC alleges that the businesses involved had claimed to be Safe Harbor certified when, in fact, they weren’t. The obvious message here: don’t claim to be Safe Harbor certified if you’re not!

The slightly more subtle compliance takeaway for businesses who are correctly Safe Harbor certified is that they should have in place processes to ensure:

  • that they keep their self-certifications up-to-date by filing timely annual re-certifications;
  • that their privacy policies accurately reflect the status of their self-certification – and if their certifications lapse, that there are processes to adjust those policies accordingly; and
  • that the business is fully meeting all of its Safe Harbor commitments in practice – there must be actual compliance, not just paper compliance.

The “Bigger Picture” for European data exports

Despite this decisive action by the FTC, European concerns about the integrity of Safe Harbor are likely to persist.  If anything, this latest action may serve only to reinforce concerns that some US businesses are either falsely claiming to be Safe Harbor certified when they are not or are not fully living up to their Safe Harbor commitments. 

The service provider community, and especially cloud businesses, will likely feel this pressure most acutely.  Many customers already perceive Safe Harbor to be “unsafe” for data exports and are insisting that their service providers adopt other EU data export compliance solutions.  So what other solutions are available?

While model contracts have the benefit of being a ‘tried and tested’ solution, the suite of contracts required for global data exports is simply unpalatable to many businesses.  The better solution is, of course, Binding Corporate Rules (BCR) – a voluntary set of self-regulatory policies, adopted by businesses, that satisfy EU data protection standards and which are submitted to, and authorised by, European DPAs.  Since 2012, service providers have been able to adopt processor BCR, and those that do find that this provides them with a greater degree of flexibility to manage their internal data processing arrangements while, at the same time, continuing to afford a high degree of protection for the data they process.

It’s unlikely that Safe Harbor will be suspended or disappear – far too many US businesses are dependent upon it for their EU/CH to US data flows.  However, the Safe Harbor regime will likely change in response to EU concerns and, over time, will come under increasing amounts of regulatory and customer pressure.  So better to consider alternative data export solutions now and start planning accordingly rather than find yourself caught short!

 

Global protection through mutual recognition

Posted on July 23rd, 2013 by



At present, there is a visible mismatch between the globalisation of data and the jurisdiction-by-jurisdiction approach to privacy regulation. Data is global by nature as, regulatory limits aside, it runs unconstrained through wired and wireless networks across countries and continents. Put in a more poetic way, a digital torrent of information flows freely in all possible directions every second of the day without regard for borders, geographical distance or indeed legal regimes and cultures. Data legislation, on the other hand, is typically attached to a particular jurisdiction – normally a country, sometimes a specific territory within a country and occasionally a selected group of countries. As a result, today, there is no such thing as a single global data protection law that follows the data as it makes its way around the world.

However, there is light at the end of the tunnel. Despite the current trend of new laws in different shapes and flavours emerging from all corners of the planet, there is still a tendency amongst legislators to rely on a principles-based approach, even if that translates into extremely prescriptive obligations in some cases – such as Spain’s data security measures, which vary with the category of data concerned, or Germany’s rules requiring certain language in contracts for data processing services. Whether it is lack of imagination or testimony to the sharp brains behind the original attempts to regulate privacy, it is possible to spot a common pedigree in most laws, which is even more visible in the case of any international attempts to frame privacy rules.

When analysed in practice and through the filter of distant geographical locations and moments in time, it is definitely possible to appreciate the similarities in the way privacy principles have been implemented by fairly diverse regulatory frameworks. Take ‘openness’ in the context of transparency, for example. The words may be slightly different – and in the EU Data Protection Directive it may not be expressly named as a principle – but it is consistently everywhere, from the 1980 OECD Guidelines to Safe Harbor and the APEC Privacy Framework. The same applies to the idea of data being collected for specified purposes, being accurate, complete and up to date, and people having access to their own data. Seeing the similarities or the differences between all of these international instruments is a matter of mindset. If one looks at the words, they are not exactly the same. If one looks at the intention, it does not take much effort to see how they all relate.

Being a lawyer, I am well aware of the importance of each and every word and its correct interpretation, so this is not an attempt to brush away the nuances of each regime. But in the context of something like data and the protection of all individuals throughout the world to whom the data relates, achieving some global consistency is vital. The most obvious approach to resolving the data globalisation conundrum would be to identify and put in place a set of global standards that apply on a worldwide basis. That is exactly what a number of privacy regulators backed by a few influential thinkers tried to do with the Madrid Resolution on International Standards on the Protection of Personal Data and Privacy of 2009. Unfortunately, the Madrid Resolution never became a truly influential framework. Perhaps it was a little too European. Perhaps the regulators ran out of steam to press on with the document. Perhaps the right policy makers and stakeholders were not involved. Whatever it was, the reality is that today there is no recognised set of global standards that can be referred to as the one to follow.

So until businesses, politicians and regulators manage to crack a truly viable set of global privacy standards, there is still an urgent need to address the privacy issues raised by data globalisation. As always, the answer is dialogue. Dialogue and a sense of common purpose. The USA and the EU in particular have some important work to do in the context of their trade discussions and review of Safe Harbor. First they must both acknowledge the differences and recognise that an area like privacy is full of historical connotations and fears. But most important of all, they must accept that principles-based frameworks can deliver a universal baseline of privacy protection. This means that efforts must be made by all involved to see what Safe Harbor and EU privacy law have in common – not what they lack. It is through those efforts that we will be able to create an environment of mutual recognition of approaches and ultimately, a global mechanism for protecting personal information.

This article was first published in Data Protection Law & Policy in July 2013.

The conflicting realities of data globalisation

Posted on June 17th, 2013 by



The current data globalisation phenomenon is largely due to the close integration of borderless communications with our everyday comings and goings. Global communications are so embedded in the way we go about our lives that we are hardly aware of how far our data is travelling every second that goes by. But data is always on the move, and we don’t even need to leave home to be contributing to this. Ordinary technology right at our fingertips is doing the job for us, leaving behind an international trail of data – some of it more public than the rest.

The Internet is global by definition. Or more accurately, by design. The original idea behind the Internet was to rely on geographically dispersed computers to transmit packets of information that would be correctly assembled at destination. That concept developed very quickly into a borderless network and today we take it for granted that the Internet is unequivocally global. This effect has been maximised by our ability to communicate whilst on the move. Mobile communications have penetrated our lives at an even greater speed and in a more significant way than the Internet itself.

This trend has led visionaries like Google’s Eric Schmidt to affirm that, thanks to mobile technology, the number of digitally connected people will very soon more than triple – going from the current 2 billion to 7 billion. That is more than three times as many people generating data as today. Similarly, the global leader in professional networking, LinkedIn, which has just celebrated its 10th anniversary, is banking on mobile communications as one of the pillars for achieving its mission of connecting the world’s professionals.

As a result, everyone is global – every business, every consumer and every citizen. One of the realities of this situation has been exposed by the recent PRISM revelations, which highlight very clearly the global availability of digital communications data. Perversely, the news about the NSA programme is set to have a direct impact on current and forthcoming legislative restrictions on international data flows, which are precisely one of the factors disrupting the globalisation of data. In fact, PRISM is already being cited as a key justification for a tight EU data protection framework and strong jurisdictional limitations on data exports, no matter how nonsensical those limitations may otherwise be.

The public policy and regulatory consequences of the PRISM affair for international data flows are pretty predictable. Future ‘adequacy findings’ by the European Commission as well as Safe Harbor will be negatively affected. We can assume that if the European Commission decides to seek a re-negotiation of Safe Harbor, PRISM will be cited as a justification. Things will not end there. Both contractual safeguards and binding corporate rules will be expected to address possible conflicts of law involving data requests for law enforcement or national security reasons in a way that does not allow blanket disclosures. And of course, the derogations from the prohibition on international data transfers will be narrowly interpreted, particularly when they refer to transfers that are necessary on grounds of public interest.

The conflicting realities of data globalisation could not be more striking. On the one hand, everyday practice shows that data is geographically neutral and simply flows across global networks to make itself available to those with access to it. On the other, it is going to take a fair amount of convincing to show that any restrictions on international data flows should be both measured and realistic. To address these conflicting realities we must therefore acknowledge the global nature of the web and Internet communications, the borderless fluidity of the mobile ecosystem and our human ability to embrace the most ambitious innovations and make them ordinary. So since we cannot stop the technological evolution of our time or the increasing value of data, perhaps it is time to accept that regulating data flows should not be about putting up barriers but about applying globally recognised safeguards.

This article was first published in Data Protection Law & Policy in June 2013.

It’s time to dust off that privacy policy…

Posted on May 2nd, 2013 by



The Information Commissioner’s Office (“ICO”) has announced in the latest edition of its e-newsletter that it will be examining the privacy policies of 250 of the UK’s most popular websites during the week of 6 – 11 May 2013 as part of ‘Internet Sweep Day’. Each website will be reviewed to check whether it contains an accessible privacy policy in accordance with relevant UK and international laws.

The Internet Sweep Day initiative isn’t limited to just the UK, as the ICO is working in conjunction with other global data protection authorities. The results of the review will be collected and sent back to the Office of the Privacy Commissioner for Canada and a report of the findings will be published in the Autumn.

There is no word yet on which websites the ICO is set to consider, but this is yet another wake-up call for businesses that haven’t yet started thinking about their public-facing documents and policies: time to get cracking!

The announcement comes hot on the heels of updates to the enforcement section of the ICO’s website – which show that the UK e-privacy enforcement space is certainly heating up – and of Google’s updates to its privacy policy in an attempt to comply with EU cookie consent rules.  Internal stakeholders who might be resistant to yet another review of an often overlooked part of any business’s website should be reminded that transparency is very likely to remain at the heart of the new European data protection framework.  It is most definitely time to get a head start now.

In defence of the privacy policy

Posted on March 29th, 2013 by



Speaking at the Game Developers Conference in San Francisco yesterday on the panel “Privacy by [Game] Design”, I was thrown an interesting question: does the privacy policy have any place in the forward-thinking privacy era?

To be sure, privacy policy bashing has become populist stuff in recent years, and the role of the privacy policy is a topic I’ve heard debated many, many times. The normal conclusion to any discussion around this point is that privacy policies are too long, too complex and simply too unengaging for any individual to want to read them. Critics complain that, although originally intended as fair processing disclosures about what businesses do with individuals’ data, privacy policies have over time become excessively lengthy, defensive, legalistic documents aimed purely at protecting businesses from liability. Just-in-time notices, contextual notices, privacy icons, traffic lights, nutrition labels and gamification are the way forward. See, for example, this recent post by Peter Fleischer, Google’s Global Privacy Counsel.

This is all fair criticism. But that doesn’t mean it’s time to write off privacy policies – we’re not talking about an either/or situation here. They continue to serve an important role in ensuring organisational accountability. Committing a business to put down, in a single, documented place, precisely what data it collects, what it does with that data, who it shares it with, and what rights individuals have, helps keep it honest. More and more, I find that clients put considerable effort into getting their privacy policies right, carefully checking that the disclosures they make actually map to what they do with data – stimulating conversations with other business stakeholders across product development, marketing, analytics and customer relations functions. The days when lawyers were told “just draft something” are long gone, at least in my experience.

This internal dialogue keeps interested stakeholders informed about one another’s data uses and facilitates discussions about good practice that might otherwise be overlooked. If you’re going to disclose what you do in an all-encompassing, public-facing document – one that may, at some point, be pored over by disgruntled customers, journalists, lawyers and regulators – then you want to make sure that what you do is legit in the first place. And, of course, while individuals seldom read privacy policies in practice, if they do have a question or a complaint they want to raise, then a well-crafted privacy policy serves (or, at least, should serve) as a comprehensive resource for finding the information they need.

Is a privacy policy the only way to communicate with your consumers what you do with their data? No, of course not. Is it the best way? Absolutely not: in an age of device and platform fragmentation, the most meaningful way is through creative Privacy by Design processes that build a compelling privacy narrative into your products and services. But is the privacy policy still relevant and important? Yes, and long may this remain the case.

How to solve BCR conflicts with local law

Posted on March 13th, 2013 by



A question frequently asked by clients considering BCR is: “How can we apply BCR on a global basis?  What if non-EU laws conflict with our BCR requirements?”  Normally, this question is raised during an early-stage stakeholder review – typically by local in-house counsel or a country manager who points out, quite reasonably, that BCR are designed to meet EU data protection standards, not their own local laws.

It’s a very good, and perfectly valid, question to ask – but one that can very quickly be laid to rest.  BCR are a voluntary set of self-regulatory standards that can readily be designed to flex to non-EU local law requirements.  Global businesses necessarily have to comply with the myriad of different laws applicable to them, and the BCR policy can address this need in the following way:

  • where local law standards are lower than those in the BCR, then the BCR policy should specify that its standards will apply.  In this way, the local controller not only achieves, but exceeds, local law requirements and continues to meet its commitments under its BCR; and

  • where local law standards are higher than those in the BCR, then the BCR policy should specify that the local law standards will apply.  In this way, the local controller achieves local law compliance and exceeds its commitments under the BCR.

In both cases, the controller manages to fulfill its responsibilities under both applicable local law and the BCR, so a head-on collision between the two almost never arises.  But for those very exceptional circumstances where mandatory local laws do prohibit the controller from complying with the BCR, the group’s EU headquarters or privacy function is simply required to take a “responsible decision” on what action to take and to consult with EU data protection authorities if in doubt.

The net result?  Carefully designed BCR provide a globally consistent data management framework that set an expected baseline level of compliance throughout the organization – exceeded only if and when required by local law.

Position of Spain on the General Data Protection Regulation: flexibility, common sense and self-regulation

Posted on March 7th, 2013 by



As expectations and concerns rise whilst we wait for the final position of the LIBE committee and the European Parliament on the General Data Protection Regulation (the “Regulation”), the report issued by the Spanish Ministry of Justice on the Regulation (the “Report”) and the recent statements of the Spanish Minister of Justice are music to our ears.

A few weeks ago the Spanish Minister of Justice expressed concern that SMEs could be ‘suffocated’ by the new data protection framework. This concern seems to have inspired some of the amendments suggested in the Report which are designed to make the Regulation more flexible. These include substantive changes to reduce the administrative burdens for organisations with a DPO or for those that have adhered to a certification scheme, and the calculation of fines on profits rather than turnover.

Spain favours a Regulation that relies on self-regulation and accountability, clearly steering away from a restrictive ‘one size fits all’ approach which establishes an onerous (and expensive to comply with) framework. The underlying objective of these proposals seems to be the protection of the SMEs at the core of the Spanish economy. A summary of the Spanish position is provided below:

- Regulation v Directive: there is agreement that a Regulation is the best instrument to standardise data protection within the EU. This is despite the fact that this will cause complications under Spanish Constitutional law.

- Data protection principles: the Report favours the language of the Data Protection Directive (which uses the expression “adequate, relevant and not excessive”) as it allows more flexibility than the language of the Regulation, which refers to personal data being “limited to the minimum necessary”. As for keeping personal data up to date, the Report suggests that this should only be required “whenever necessary” and in light of the data’s expected use, as opposed to the general obligation currently set out in the Regulation.

- Information: the requirement to inform individuals about the period during which personal data will be kept is considered excessive and very difficult to comply with. The Report suggests that this should only be required “whenever it is possible”.

- Consent: the requirement of express consent is seen as too onerous in practice and “properly informed consent” is favoured, the focus being on whether individuals understand the meaning of their actions. The adoption of sector by sector solutions in this context is not ruled out.

- Right to be forgotten: this right is considered paramount but the point is made that a balance has to be found between “theoretical technological possibilities” and “real limitations”. Making an organisation solely responsible for the erasure of personal data which has been disseminated to third parties is regarded as excessive.

- Security incidents: various amendments to the articles that regulate breach notification are suggested in order to make the proposed regime less stringent. The suggested amendments remove the 24-hour notification deadline and limit the obligation to notify to serious breaches only. Notifications to data subjects would also be limited to those that would not have a negative impact on the investigations.

- DPOs: it is proposed that the appointment of DPOs should not be compulsory but should be encouraged through incentives such as the removal of certain administrative burdens (as referred to below). Organisations without the resources to appoint a DPO may instead be encouraged to adopt a “flexible and rigorous” certification policy or scheme. Such certifications would be sector-specific, revocable and renewable.

- Documentation, impact assessments and prior authorisation: the suggested amendments propose a solution whereby organisations which hold a valid certificate or which have appointed a DPO would not have to maintain documentation, carry out PIAs or request authorisation from data protection authorities, as provided for by Articles 28.2, 33 and 34 of the Regulation respectively.

- International transfers: Spain favours the current system but suggests that this could be made more flexible by only requiring the authorisation of the data protection authority for contractual clauses (which have not been adopted by the Commission or an authority) when the organisation does not have a DPO or a certificate.

- One-stop-shop: this concept is endorsed in general but the Report proposes that where a corporation is established in more than one Member State, the DPA established in the country of residence of an individual complainant should have jurisdiction to deal with the matter. The consistency mechanism would be used to ensure a coherent decision where there were several similar complaints in different countries.

- Sanctions and alternatives: Spain considers that the current system could be improved by providing less stringent alternatives to the imposition of fines. Furthermore, it is proposed that the way in which sanctions are calculated be reviewed, on the basis that annual turnover does not equal the profit obtained. This is to avoid the imposition of disproportionate sanctions.

- Technological neutrality: technological neutrality is supported although the Report expresses concerns that such neutrality does not provide for adequate solutions for particular challenges, such as those presented by cloud computing or the transfer of personal data over the Internet.

- Cloud computing: the Report suggests that the Regulation should take this “new reality” into account and proposes the adoption of some measures, for example, those aimed at (1) finding a balance between the roles of controllers and processors in order to avoid cloud service providers becoming solely responsible for the processing of personal data; and (2) simplifying the rules on international transfers of personal data, for example by extending binding corporate rules to the network of sub-processors.

Do BCR now, not later.

Posted on February 23rd, 2013 by



BCR are a big feature of the Commission’s proposed General Data Protection Regulation.  BCR were previously a purely regulatory invention (the Article 29 Working Party first established a structure for them back in its 2003 paper, WP74), but the Commission has now sought to put them on a solid legal footing by expressly recognising them as a solution for data exports under Articles 39 and 40 of the proposed Regulation.  The intent is that, by doing so, all EU Member States will uniformly have to recognise and permit global data transfers using BCR, solving the issue presented today where the national legal or regulatory regimes of one or two Member States inhibit their adoption.

As if further proof were needed of the Commission’s support for BCR, Commissioner Viviane Reding has even gone so far as to say: “Indeed, I encourage companies of all size to start working on their own binding corporate rules!  Binding corporate rules are an open instrument: They are open to international interoperability. They are open to your innovations. They are open to improve data protection on a global scale, to foster citizens’ trust in the digital economy and unleash the full potential of our Single Market. And more: they are open to go beyond the geographical borders of Europe.”

High praise indeed, and certainly Ms. Reding’s description of BCR matches our own experience helping clients design and implement them.  Clients who implement BCR substantially simplify their global data movements and embed a culture of respect for privacy that enhances compliance and drives down risk.

What the Regulation will really mean for BCR adoption

But here’s the thing: far from supporting BCR adoption, the Regulation will make authorisation of BCR harder to achieve, and this flies in the face of the Commission’s very express support for BCR.  

Historically, the main barrier to BCR adoption has been the bureaucracy, effort and cost entailed in achieving them – early BCR adopters tell war stories about their BCR approval process taking years and having to address the conflicting requirements of multiple data protection authorities all over Europe.  This burdensome process arose out of a requirement that the BCR applicant needed to have its BCR individually authorised by every data protection authority from whose territory it exported data.

Thankfully, this is an area where huge strides forward have been achieved in recent years, through the implementation of the so-called “mutual recognition” procedure that allows BCR applicants to submit their BCR to a single lead authority;  once the lead authority approves the applicant’s BCR, it then becomes binding across all mutual recognition territories (currently 21 of the 27 EU Member States).  No more trekking around Europe visiting data protection authorities individually then.

Mutual recognition has really lifted BCR out of the dark ages into an age of BCR enlightenment, and has been vital to the upswing in BCR applications all over Europe.  Now, though, the proposed Regulation – despite its intended support for BCR – threatens to actually inhibit their adoption, pushing controllers back to using “check box” solutions like model clauses that provide little in the way of real protection.

Why?  Because under the draft Regulation, any authority wishing to approve BCR must first refer the matter to the European Data Protection Board under the Regulation’s proposed “consistency mechanism” (designed to ensure consistency of decision-making by authorities across Europe).  The European Data Protection Board can be thought of as the “Article 29 Working Party Plus”, and comprises the head of each data protection authority across Europe and the European Data Protection Supervisor.  In effect, the consistency mechanism necessitates that an applicant’s BCR must once again be tabled before every data protection authority before authorisation can be granted – a step backwards, not forwards.  As the ICO noted in its initial analysis of the Regulation: “It is not entirely clear what would happen if, for example, the UK supervisory authority were to approve a set of binding corporate rules but, once informed of the approval, the EDPB takes issue with it.”

To make things worse, it’s not clear how the consistency mechanism will sit with the mutual recognition procedure we have today.  Maybe it will supersede the mutual recognition procedure.  Maybe it will apply in addition.  Or maybe some kind of hybrid process will evolve.  We just don’t know and uncertainty is never a good thing. 

The time for BCR is now

What this means is that while BCR will remain the only realistic solution for multinationals exporting data on a global basis, the process for achieving them once the Regulation comes into effect will become much tougher.  Add to this the fact that, as a whole, the Regulation will impose stricter data protection standards than exist under the Directive, and BCR applications will attract an even greater level of scrutiny once the Regulation comes into effect than they do today.

So given that there is strong regulatory support for BCR, but that the Regulation will create barriers to adoption, what strategy should multinational controllers adopt?

The answer is simple: do BCR now, not later. 

The process for achieving BCR today is more streamlined than it has ever been, and BCR authorised now will remain in effect once the new Regulation becomes law.  When you look at it like that, why not do BCR now?