Archive for the ‘Data security’ Category

Part 1: Cutting through the Internet of Things hyperbole

Posted on October 15th, 2014 by



I’ve held back writing anything about the Internet of Things (or “IoT“) because there are so many developments playing out in the market. Not to mention so much “noise”.

Then something happened: “It’s Official: The Internet Of Things Takes Over Big Data As The Most Hyped Technology” read a Forbes headline. “Big data”, last week’s darling, is condemned to the “Trough of Disillusionment” while Gartner moves IoT to the very top of its 2014 emerging technologies Hype Cycle. Something had to be said.

The key point for me is that the IoT is “emerging”. What’s more, few are entirely sure where they are on this uncharted journey of adoption. IoT has reached an inflexion point – a point where businesses and others realise that identifying with the Internet of Things may drive sales, shareholder value or merely kudos. We all want a piece of this pie.

In Part 1 of this two-part exploration of IoT, I explore what the Internet of Things actually is.

IoT – what is it?

Applying Gartner’s parlance, one thing is clear: when any tech theme hits the “Peak of Inflated Expectations”, the “Trough of Disillusionment” will follow because, as with any emerging technology, it will be some time until there is pervasive adoption of IoT. In fact, for IoT, Gartner says widespread adoption could be 5 to 10 years away. However, this inflexion point is typically the moment when the tech industry’s big guns ride into town and, just as with cloud (remember some folk trying to trademark the word?!), this will only drive further development and adoption. But also further hype.

The world of machine-to-machine (“M2M”) communications involved the connection of different devices which previously did not have the ability to communicate. For many, the Internet of Things is something more. As Ofcom (the UK’s communications regulator) set out in its UK consultation, IoT is a broader term, “describing the interconnection of multiple M2M applications, often enabling the exchange of data across multiple industry sectors”.

“The Internet of Things will be the world’s most massive device market and save companies billions of dollars” shouted Business Week in October 2014, happy to maintain the hype but also acknowledging in its opening paragraph that IoT is “beginning to grow significantly”. No question, IoT is set to enable large numbers of previously unconnected devices to connect and then communicate, sharing data with one another. Today we are mainly contemplating rather than experiencing this future.

But what actually is it?

The emergence of IoT is driving some great debate. When assessing what IoT is and what it means for business models, the law and for commerce generally, arguably there are more questions than there are answers. In an exploratory piece in ZDNET, Richie Etwaru called out a few of these unanswered questions and prompted some useful debate and feedback. The top three questions raised by Richie were:

  • How will things be identified? – believing we have to get to a point where there are standards for things to be sensed and connected;
  • What will the word trust mean to “things” in IoT? – making the point we need to redefine trust in edge computing; and
  • How will connectivity work? – is there something like IoTML (The Internet of Things Markup Language) to enable trust and facilitate this communication?

None of these questions are new, but his piece reinforces that we don’t quite know what IoT is or how some of its technical questions will be addressed. It’s likely that standardisation, or industry practice and adoption around certain protocols and practices, will answer some of these questions in due course. As a matter of public policy we may see lawmakers intervene to shape some of these standards or drive particular kinds of adoption. There will be multiple answers to the “what is IoT?” question for some time. I suspect in time different flavours and business models will come to the fore. Remember when every cloud seminar spent the first 15 minutes defining cloud models and reiterating extrapolations for the future size of the cloud market? Brace yourselves!

I’ve been making the same points about “cloud” for the past five years – like cloud, the IoT is a fungible concept. So, as with cloud, don’t assume IoT has a definitive meaning. As with cloud, don’t expect there to be any specific Internet of Things law (yet?). As Part 2 of this piece will discuss, lawmakers have spotted there’s something new which may need regulatory intervention to cultivate it for the good of all, but they’ve also realised that there’s something which may grow with negative consequences – something that may need to be brought into check. Privacy concerns in particular have raised their head early and we’ve seen early EU guidance in an opinion from the Article 29 Working Party, but there is still no specific IoT law. How can there be when there is still little definition?

Realities of a converged world

For some time we’ve been excited about the convergence of people, business and things. Gartner reminds us that “[t]he Internet of Things and the concept of blurring the physical and virtual worlds are strong concepts in this stage. Physical assets become digitalized and become equal actors in the business value chain alongside already-digital entities”. In other words: a land of opportunity, but an ill-defined “blur” of technology and of what is real and what is merely conceptual within our digital age.

Of course the IoT world is also a world bumping up against connectivity, the cloud and mobility. Of course there are instances of IoT out there today. Or are there? As with anything that’s emerging, the terminology and definition of the Internet of Things is emerging too. Yes there is a pervasiveness of devices, yes some of these devices connect and communicate, and yes devices that were not necessarily designed to interact are communicating, but are these examples of the Internet of Things? And if you break these models down into their constituent parts for applied legal thought, does it necessarily matter?

Philosophical, but for a reason

My point? As with any complex technological evolution, as lawyers we cannot apply laws, negotiate contracts or assess risk or the consequences for privacy without a proper understanding of the complex ecosystem we’re applying these concepts to. Privacy consequences cannot be assessed in isolation and without considering how the devices, technology and data actually interact. Be aware that the IoT badge means nothing legally and probably conveys little factual information around “how” something works. It’s important to ask questions. Important not to assume.

In Part 2 of this piece I will discuss some early signs of how the law may be preparing to deal with all these emerging trends. Of course, the answer is that it probably already does, and that it probably has the flexibility to deal with many elements of IoT yet to emerge.

Creating a successful data retention policy

Posted on April 22nd, 2014 by



With the excitement generated by the recent news that the European Court of Justice has, in effect, struck down the EU’s Data Retention Directive (see our earlier post here), now seems as good a time as any to re-visit the topic of data retention generally.

Whereas the Data Retention Directive required ISPs and telcos to hold onto communications metadata, the Data Protection Directive is sector-blind and pulls in exactly the opposite direction: put another way, it requires all businesses not to hold onto personal data for longer than is “necessary”.

That’s the kind of thing that’s easy for a lawyer to say, but difficult to implement in practice.  How do you know if it’s “necessary” to continue holding data?  How long does “necessary” last?  How do you explain to internal business stakeholders that what they consider “necessary” (i.e. commercially desirable) is not the same thing as what the law considers “necessary”?

Getting the business on-side

For any CPO, compliance officer or in-house lawyer looking to create their company’s data retention policy, the first task is to get the business on-side. Suggesting to the business that it deletes valuable company data after set periods of time may not initially be well-received but, for your policy to be a success, you’ll ultimately need the business’s support.

To get this buy-in, you need to communicate the advantages of a data retention policy and, fortunately, these are numerous.  Consider, for example:

  • Reduced IT expenditure: By deleting data at defined intervals, you reduce the overall amount of data you’ll be storing. That in turn means you need fewer systems to host that data and less archiving, back-up and offsite storage, delivering significant cost savings and keeping your CFO happy.
  • Improved security: It seems obvious, but it’s amazing how often this is overlooked. The less you hold, the less – frankly – you have to lose. Nobody wants to be making a data breach notification to a regulator AND explaining why they were continuing to hold on to 20-year-old records in the first place.
  • Minimised data disclosures:  Most businesses are familiar with the rights individuals have to request access to their personal information, as well as the attendant business disruption these requests can cause.  As with the above point, the less data you hold, the less you’ll need to disclose in response to one of these requests (meaning the less effort – and resource – you need to put into finding that data).  This holds true for litigation disclosure requests too.
  • Legal compliance:  Last, but by no means least, you need a data retention policy for legal compliance – after all, it’s the law not to hold data for longer than “necessary”.  Imagine a DPA contacting you and asking for details of your data retention policy.  It would be a bad place to be in if you didn’t have something ready to hand over.  

Key considerations

Once you have persuaded the business that creating a data retention policy is a good idea, the next task is then to go off and design one!  This will involve input from various internal stakeholders (particularly IT staff) so it’s important you approach them with a clear vision for how to address some of the critical retention issues.

Among the important points to consider are:

  • Scope of the policy:  What data is in-scope?  Are you creating a data retention policy just for, say, HR data or across all data processed by the business?  There’s a natural tension here between achieving full compliance and keeping the project manageable (i.e. not biting off more than you can chew).  It may be easier to “prove” that your policy works on just one dataset first and then roll it out to additional, wider datasets later.
  • One-size-fits-all vs. country-by-country approach:  Do you create a policy setting one-size-fits-all retention limits across all EU (possibly worldwide) geographies, or set nationally-driven limits with the result that records kept for, say, 6 years in one country must be deleted after just two in another?  Again, the balance to be struck here is between one of compliance and risk versus practicality and ease of administration.
  • Records retention vs. data retention: Will your policy operate at the “record” level or the “data” level? The difference is this: a record (such as a record of a customer transaction) may comprise multiple data elements (e.g. name, cardholder number, item purchased, date, etc.). A crucial decision then is whether your policy should operate at the “record” level (so that the entire customer transaction record is deleted after [x] years) or at the “data” level (so that, e.g., the cardholder number is deleted after [x] years but other data elements are kept for a longer period). This is a point where it is particularly important to discuss with IT stakeholders what is actually achievable (see the sketch after this list).
  • Maximum vs minimum retention periods:  Apart from setting maximum data retention periods, there may be  commercial, legal or operational reasons for the business to want to set minimum retention periods as well – e.g. for litigation defence purposes.  At an early stage, you’ll need to liaise with colleagues in HR, IT, Accounting and Legal teams to identify whether any such reasons exist and, if so, whether these should be reflected in your policy.
  • Other relevant considerations:  What other external factors will impact the data retention policy you design? Aside from legal and commercial requirements, is the business subject to, for example, sector-specific rules, agreements with local Works’ Councils, or even third party audit requirements (e.g. privacy seal certifications – particularly common in Germany)?  These factors all need to be identified and their potential impact on your data retention policy considered at an early stage.   
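To make the “record” level versus “data” level distinction concrete, here is a minimal sketch in Python. The field names, retention periods and schedule below are entirely hypothetical and for illustration only – in practice they would be driven by the legal, commercial and operational analysis described above.

```python
from datetime import date

# Hypothetical retention schedule: None means "keep for as long as the record itself".
FIELD_RETENTION_DAYS = {
    "name": None,
    "cardholder_number": 365,        # e.g. delete card data after 1 year
    "item_purchased": None,
    "transaction_date": None,
}
RECORD_RETENTION_DAYS = 6 * 365      # e.g. delete the whole record after roughly 6 years


def apply_retention(record, today):
    """Return the record with expired fields removed, or None if the whole record has expired."""
    age_days = (today - record["transaction_date"]).days
    if age_days > RECORD_RETENTION_DAYS:
        return None                  # record-level deletion
    retained = {}
    for field, value in record.items():
        limit = FIELD_RETENTION_DAYS.get(field)
        if limit is not None and age_days > limit:
            continue                 # data-level deletion of this element only
        retained[field] = value
    return retained


# A two-year-old transaction keeps most elements but loses the cardholder number.
record = {
    "name": "A. Customer",
    "cardholder_number": "4111111111111111",
    "item_purchased": "Widget",
    "transaction_date": date(2012, 4, 22),
}
print(apply_retention(record, today=date(2014, 4, 22)))
```

A country-by-country policy could be layered on top by keying the schedule by jurisdiction – which is exactly the administration-versus-compliance trade-off noted above.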

Getting it right at the beginning means that the subsequent stages of your data retention policy design and roll out should become much smoother – you’ll get the support you need from the business and you’ll have dealt with the difficult questions in a considered, strategic way upfront rather than in a piecemeal (and likely, inconsistent) fashion as the policy evolves.

And with so much to benefit from adopting a retention policy, why would you wait any longer?

Beware: Europe’s take on the notification of personal data breaches to individuals

Posted on April 10th, 2014 by



The Article 29 Working Party (“WP 29”) has recently issued an Opinion on Personal Data Breach Notification (the “Opinion”). The Opinion focuses on the interpretation of the criteria under which individuals should be notified about breaches that affect their personal data.

Before we analyse the takeaways from the Opinion, let’s take a step back: are controllers actually required to notify personal data breaches?

In Europe, controllers have, for a while now, been either legally required or otherwise advised to consider notifying personal data breaches to data protection regulators and/or subscribers or individuals.

Today, the only EU-wide personal data breach notification requirement derives from Directive 2002/58/EC, as amended by Directive 2009/136/EC (the “e-Privacy Directive”), and applies to providers of publicly available electronic communications services. In some EU member states (for example, in Germany), this requirement has been extended to controllers in other sectors or to all controllers. Similarly, some data protection regulators have issued guidance whereby controllers are advised to report data breaches under certain circumstances.

Last summer, the European Commission adopted Regulation 611/2013 (the “Regulation”) (see our blog regarding the Regulation here), which sets out the technical implementing measures concerning the circumstances, format and procedure for data breach notification required under Article 4 of the e-Privacy Directive.

In a nutshell, providers must notify individuals of breaches that are likely to adversely affect their personal data or privacy without undue delay, taking account of: (i) the nature and content of the personal data concerned; (ii) the likely consequences of the personal data breach for the individual concerned (e.g. identity theft, fraud, distress, etc.); and (iii) the circumstances of the personal data breach. Providers are exempt from notifying individuals (but not regulators) if they have demonstrated to the satisfaction of the data protection regulator that they have implemented appropriate technological protection measures to render the data unintelligible to any person who is not authorised to access it.
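Purely as an illustration of the structure of that test – not a statement of how the rules apply in any particular case – the decision described above could be sketched roughly as follows, using hypothetical inputs that a breach-response team would have to assess:

```python
def must_notify_individuals(likely_adverse_effect,
                            data_rendered_unintelligible,
                            regulator_satisfied_with_measures):
    """Rough sketch of the individual-notification logic under Article 4 of the
    e-Privacy Directive as described above.

    likely_adverse_effect: is the breach likely to adversely affect the individuals'
        personal data or privacy, judged on (i) the nature and content of the data,
        (ii) the likely consequences for the individual and (iii) the circumstances
        of the breach?
    The exemption covers notification to individuals only (the regulator must still
    be notified) and applies only where appropriate technological measures
    (e.g. strong encryption) rendered the data unintelligible, to the regulator's
    satisfaction.
    """
    if not likely_adverse_effect:
        return False
    if data_rendered_unintelligible and regulator_satisfied_with_measures:
        return False  # exempt from notifying individuals, not from notifying the regulator
    return True
```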

The Opinion provides guidance on how controllers may interpret this notification requirement by analysing seven practical scenarios of breaches that will meet the ‘adverse effect’ test. For each of them, the WP 29 identifies the potential consequences and adverse effects of the breach and the security safeguards which might have reduced the risk of the breach occurring in the first place or, indeed, might have exempted the controller from notifying the breach to individuals altogether.

From the Opinion, it is worth highlighting:

The test. The ‘adverse effect’ test is interpreted broadly to include ‘secondary effects’. The WP 29 clearly states that all the potential consequences and potential adverse effects are to be taken into account. This interpretation may be seen as a step too far, since not all ‘potential’ consequences are ‘likely’ to happen, and it will probably lead to a conservative interpretation of the notification requirement across Europe.

Security is key. Controllers should put in place security measures that are appropriate to the risk presented by the processing, with emphasis on the implementation of those controls rendering data unintelligible. Compliance with data security requirements should result in the mitigation of the risks of personal data breaches and even, potentially, in the application of the exemption from notifying individuals about the breach. Examples of security measures identified as likely to reduce the risk of a breach occurring are: encryption (with a strong key); hashing (with a strong key); back-ups; physical and logical access controls; and regular monitoring of vulnerabilities.
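By way of illustration only, the two controls singled out for rendering data unintelligible – encryption with a strong, secret key and keyed hashing – might look something like the sketch below in Python. It assumes the third-party cryptography package for AES-GCM, uses placeholder data, and is no substitute for a properly designed key-management regime.

```python
import hashlib
import hmac
import os

from cryptography.hazmat.primitives.ciphers.aead import AESGCM  # pip install cryptography

# 1. Encryption with a strong key: without the key, the stored ciphertext is unintelligible.
key = AESGCM.generate_key(bit_length=256)    # store this key separately from the data
aesgcm = AESGCM(key)
nonce = os.urandom(12)                       # must be unique per message
ciphertext = aesgcm.encrypt(nonce, b"jane.doe@example.com", None)

# 2. Keyed hashing: an HMAC with a secret key rather than a plain hash, so values
#    cannot be recovered or confirmed by brute force without the key.
hmac_key = os.urandom(32)
pseudonym = hmac.new(hmac_key, b"jane.doe@example.com", hashlib.sha256).hexdigest()

# Only a holder of the keys can decrypt or reproduce the values.
assert aesgcm.decrypt(nonce, ciphertext, None) == b"jane.doe@example.com"
```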

Procedure. Controllers should have procedures in place to manage personal data breaches. This will involve a detailed analysis of the breach and its potential consequences. In the Opinion, data breaches fall into three categories, namely availability, integrity and confidentiality breaches. The application of this model may help controllers analyse the breach too.

How many individuals? The number of individuals affected by the breach should not have a bearing on the decision of whether or not to notify them.

Who must notify? It is explicitly stated in the Opinion that breach notification constitutes good practice for all controllers, even for those who are currently not required to notify by law.

There is a growing consensus in Europe that it is only a matter of time before an EU-wide personal data breach notification requirement that applies to all controllers (regardless of the sector they are in) is in place. Indeed, this will be the case if/when the proposed General Data Protection Regulation is approved. Under it, controllers would be subject to strict notification requirements both to data protection regulators and individuals. This Opinion provides some insight into  how the  European regulators may interpret these requirements under the General Data Protection Regulation.

Therefore, controllers would be well advised to prepare for what is coming their way (see previous blog here). Focus should be on the application of security measures (in order to prevent a breach and to limit the adverse effects on individuals once a breach has occurred) and on putting procedures in place to effectively manage breaches. Start today: burying your head in the sand is just no longer an option.

Progress update on the EU Cybersecurity Strategy

Posted on March 13th, 2014 by



Background

On 28 February 2014, the European Commission hosted a “High Level Conference on the EU Cybersecurity Strategy” in Brussels.  The conference provided an opportunity for EU policy-makers, industry representatives and other interested parties to assess the progress of the EU Cybersecurity Strategy, which was adopted by the European Commission on 7 February 2013.

Keynote speech by EU Digital Agenda Commissioner Neelie Kroes

The implementation of the EU Cybersecurity Strategy comes at a time when public and private actors face escalating cyber threats. During her keynote speech at the conference, Commissioner Kroes reiterated the dangers of weak cybersecurity measures by asserting that “without security, there is no privacy”.

She further highlighted the reputational and financial impact of cyber threats, commenting that over 75% of small businesses and 93% of large businesses have suffered a cyber breach, according to a recent study. However, Commissioner Kroes also emphasised that effective EU cybersecurity practices could constitute a commercial advantage for the 28 Member State bloc in an increasingly interconnected global marketplace.

Status of the draft EU Cybersecurity Directive

The EU Cybersecurity Strategy’s flagship legal instrument is draft Directive 2013/0027 concerning measures to ensure a high common level of network and information security across the Union (“draft EU Cybersecurity Directive”).  In a nutshell, the draft EU Cybersecurity Directive seeks to impose certain mandatory obligations on “public administrations” and “market operators” with the aim of harmonising and strengthening cybersecurity across the EU. In particular, it includes an obligation to report security incidents to the competent national regulator.

The consensus at the conference was that further EU institutional reflection is required on some aspects of the draft EU Cybersecurity Directive, such as (1) the scope of obligations, i.e., which entities are included as “market operators”; (2) how Member State cooperation would work in practice; (3) the role of the National Competent Authorities (“NCAs”); and (4) the criminal dimension and the notification requirement to law enforcement authorities by NCAs. The scope of obligations is a particularly contentious issue as EU decision-makers consider whether to include certain entities, such as software manufacturers, hardware manufacturers and internet platforms, within the scope of the Directive.

The next few months will be a crucial period for the legislative passage of the draft law. Indeed, the European Parliament voted on 13 March 2014 in the Plenary session to adopt its draft Report on the Directive. The Council will now spend March – May 2014 working on the basis of the Parliament’s report to achieve a Council “common approach”. The dossier will then likely be revisited after the European Parliament elections in May 2014. The expected timeline for adoption remains “December 2014”, but various decision-making scenarios are possible depending on the outcome of the elections.

Once the Directive is adopted, Member States will have 18 months to transpose it into national law (meaning an approximate deadline of mid-2016). As a minimum harmonisation Directive, Member States could go beyond the provisions of the adopted Directive in their national transpositions, for instance by reinstating internet platforms within the definition of a “market operator”.

One of the challenges for organisations will be achieving compliance with possibly conflicting notification requirements between the draft EU Cybersecurity Directive (i.e., the obligation to report security incidents to the competent national regulator), the existing ePrivacy Directive (i.e., the obligation for telecom operators to notify personal data breaches to the regulator and to affected individuals) and, if adopted, the EU Data Protection Regulation (i.e., the obligation for all data controllers to notify personal data security breaches to the regulator and to affected individuals). So far, EU legislators have not provided any guidance as to how these legal requirements would coexist in practice.

Industry’s perspective on the EU Cybersecurity Strategy

During the conference, representatives from organisations such as Belgacom and SWIFT highlighted the real and persistent threat facing companies. Calls were made for international coordination on cybersecurity standards and laws to avoid conflicting regulatory requirements.  Interventions also echoed the earlier sentiments of Commissioner Kroes in that cybersecurity offers significant growth opportunities for EU industry. 

Businesses spoke of the need to “become paranoid” about the cyber threat and implement “security by design” to protect data. Finally, trust, collaboration and cooperation between Member States and between public and private actors were viewed as essential to ensure EU cyber resilience.

The Privacy Regulatory Bear Market and playing political football with business

Posted on January 23rd, 2014 by



2014 has kicked off in very dramatic fashion on the privacy law regulatory enforcement front. The French data protection regulator, CNIL, has just fined Google €150,000 for alleged Privacy Policy failings, an amount described as ‘pocket money’ by the EU politician in charge of toughening up European data protection law, European Commissioner Reding. Meanwhile, the FTC, the US consumer protection regulator, has just taken disciplinary action against a number of US companies that have breached the ‘Safe Harbor’ agreement between the EU and US on the export of personal data from Europe to the US. So, what’s going on here?

Looking at the bigger picture of privacy law enforcement, penalties and sanctions, the climate has been getting worse for businesses year-on-year; the cycle of tougher regulatory responses to privacy problems began around 2006. The regulatory rhetoric has also been getting stronger and darker over the cycle.

The bigger picture tells us that there is a ‘Regulatory Bear Market’ right at the beating heart of the international privacy law system. Like a financial bear market, this is the consequence of negative sentiment, pessimism and a loss of confidence, in the sense that privacy law regulators are downbeat about the performance of businesses when it comes to compliance with their privacy law obligations. This leads to negative and adverse outcomes, including the imposition of large financial penalties and negative rhetoric in press statements, television appearances and guidance and policy documents.

In Europe, the most visible fruit of the Regulatory Bear Market is the current law reform process led by Commissioner Reding, which will toughen up data protection law in ways that most businesses have not yet adjusted to. For instance, fines of up to 5% of the annual worldwide turnover of the business may be imposed. Translating this threatened change into real monetary values has been hard up until now, but Commissioner Reding has just said that the Google fine might be as much as $1bn under the new regime – a staggering, eye-watering sum for Chief Financial Officers everywhere.

If that wasn’t bad enough, it seems that the business community may be forced to pay the price for the government failings revealed by Edward Snowden. There is plenty of evidence out there already to suggest that the corporate world is becoming the football in the political game that is being played out between the EU, other countries and the US as a result of Snowden’s disclosures.

One piece of evidence is the ‘Euro Cloud’ idea, which seems to be very popular in certain parts of the European Parliament. This idea says that, in order to prevent US snooping on online activities and electronic communications, personal data of European citizens should be kept in European data centres. Regardless of whether Euro Cloud could ever stop snooping, which many experts doubt, the key significance of the idea is that businesses will have to change their business models because of the actions of governments over which they have had no control. The capital cost of doing this will be borne by business, not the politicians who back the idea or the governments who are carrying out the snooping. The underlying threat, of course, is that businesses that do not play ball will be faced with sanctions. Governments commit the crimes, businesses pay the fines.

Another example is the FTC action mentioned earlier. How very convenient it is to make examples of businesses at exactly the time when, due to the Snowden disclosures, the Safe Harbor data export rules that they are accused of breaching are being re-examined by EU politicians for fitness for purpose. It might look to some observers as if the US regulator is willing to sacrifice some US companies on the altar of European political opinion simply to sate the lust for blood.

The corporate world has always been the football in critical political games and business leaders will be resigned to this as being a natural and inevitable facet of being in business. What they may not have factored in to their business plans and balance sheets is that the game is now playing out over personal data and privacy. If not, they need to re-adjust quickly, otherwise the Regulatory Bear Market will bite them.

FTC in largest-ever Safe Harbor enforcement action

Posted on January 22nd, 2014 by



Yesterday, the Federal Trade Commission (“FTC“) announced that it had agreed to settle with 12 US businesses for alleged breaches of the US Safe Harbor framework. The companies involved were from a variety of industries and each handled a large amount of consumer data. But aside from the surprise of the large number of companies involved, what does this announcement really tell us about the state of Safe Harbor?

This latest action suggests that the FTC is ramping up its Safe Harbor enforcement in response to recent criticisms from the European Commission and European Parliament about the integrity of Safe Harbor (see here and here) – particularly given that one of the main criticisms about the framework was its historic lack of rigorous enforcement.

Background to the current enforcement

So what did the companies in question do? The FTC’s complaints allege that the companies involved ‘deceptively claimed they held current certifications under the U.S.-EU Safe Harbor framework‘. Although participation in the framework is voluntary, if you publicise that you are Safe Harbor certified then you must, of course, maintain an up-to-date Safe Harbor registration with the US Department of Commerce and comply with your Safe Harbor commitments.

Key compliance takeaways

In this instance, the FTC alleges that the businesses involved had claimed to be Safe Harbor certified when, in fact, they weren’t. The obvious message here is don’t claim to be Safe Harbor certified if you’re not!  

The slightly more subtle compliance takeaway for businesses that are correctly Safe Harbor certified is that they should have in place processes to ensure the following (a simple illustrative check is sketched after this list):

  • that they keep their self-certifications up-to-date by filing timely annual re-certifications;
  • that their privacy policies accurately reflect the status of their self-certification – and if their certifications lapse, that there are processes to adjust those policies accordingly; and
  • that the business is fully meeting all of its Safe Harbor commitments in practice – there must be actual compliance, not just paper compliance.
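To illustrate the first two points only – this is a hypothetical internal aid, not an FTC or Department of Commerce tool – a compliance team could run a simple check like the sketch below. The dates and the privacy-policy flag are assumed inputs that the team would maintain itself.

```python
from datetime import date, timedelta


def safe_harbor_health_check(certification_expiry, policy_claims_certification,
                             today, warn_days=60):
    """Flag lapsed (or soon-to-lapse) self-certifications and mismatched privacy policy claims."""
    issues = []
    if today > certification_expiry:
        issues.append("Self-certification has lapsed: re-certify or stop claiming certification.")
        if policy_claims_certification:
            issues.append("Privacy policy still claims a current Safe Harbor certification.")
    elif today > certification_expiry - timedelta(days=warn_days):
        issues.append("Annual re-certification is due soon: file with the Department of Commerce.")
    return issues


# Hypothetical example: the certification has lapsed but the policy still advertises it.
print(safe_harbor_health_check(certification_expiry=date(2013, 12, 31),
                               policy_claims_certification=True,
                               today=date(2014, 1, 22)))
```

The third point – actual rather than paper compliance – is, of course, something no script can verify.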

The “Bigger Picture” for European data exports

Despite this decisive action by the FTC, European concerns about the integrity of Safe Harbor are likely to persist.  If anything, this latest action may serve only to reinforce concerns that some US businesses are either falsely claiming to be Safe Harbor certified when they are not or are not fully living up to their Safe Harbor commitments. 

The service provider community, and especially cloud businesses, will likely feel this pressure most acutely.  Many customers already perceive Safe Harbor to be “unsafe” for data exports and are insisting that their service providers adopt other EU data export compliance solutions.  So what other solutions are available?

While model contracts have the benefit of being a ‘tried and tested’ solution, the suite of contracts required for global data exports is simply unpalatable to many businesses. The better solution is, of course, Binding Corporate Rules (BCR) – a voluntary set of self-regulatory policies, adopted by businesses, that satisfy EU data protection standards and which are submitted to, and authorised by, European DPAs. Since 2012, service providers have been able to adopt processor BCR, and those that do find that this provides them with a greater degree of flexibility to manage their internal data processing arrangements while, at the same time, continuing to afford a high degree of protection for the data they process.

It’s unlikely that Safe Harbor will be suspended or disappear – far too many US businesses are dependent upon it for their EU/CH to US data flows.  However, the Safe Harbor regime will likely change in response to EU concerns and, over time, will come under increasing amounts of regulatory and customer pressure.  So better to consider alternative data export solutions now and start planning accordingly rather than find yourself caught short!

 

Cyber: Safety first!

Posted on November 12th, 2013 by



In case you haven’t noticed, the European Institutions (as well as the UK Government and those on the other side of the pond) have been ramping up their digital agendas in recent months, each seeking to instil the importance of cyber security in citizens and businesses alike.

It’s all about raising cyber security awareness, but essentially the message is this: companies must understand their systems and data, and must take a proportionate, risk-based approach to keeping them secure.  They must build resilient networks and communications systems, and protect our critical infrastructures. As the threats against this landscape continue to increase, there is a corresponding decline in consumer trust, so what is important is to demonstrate you have the ability and agility to counter those threats and show you are committed to data and cyber security.  Ultimately that will build trust.

Raising cyber security awareness has no doubt been assisted somewhat by the recent “Snowden revelations” but it is very easy to get distracted by all the sensationalist headlines.  Despite what goes on in the law enforcement and intelligence worlds, we shouldn’t lose sight of the importance of building trust and building a strong and resilient digital economy.

This week in Germany the 2nd Cyber Security Summit took place, with a notable Keynote speech given by Neelie Kroes (the Vice President of the European Commission, Digital Agenda) about how to make Europe the world’s safest online environment.  A copy of the speech is available here.

Ms Kroes highlighted three trends that have appeared in the digital age. Firstly, the recognition that the online world provides us all with huge benefits – let’s face it, we all use and rely on technologies every minute of every day. But with these benefits comes the second trend: risks. Cyber attacks, data loss, identity theft – the list goes on.

The third trend, then, is that these risks ultimately lead to significant costs (both in terms of mitigating risk and dealing with the problems that those risks cause). Indeed, Ms Kroes points out the frequency of data security breaches suffered each year and says that the resulting costs (particularly for major incidents) “could amount to over a quarter of a trillion dollars”.

That, I’m sure, isn’t an exaggeration. The UK Information Commissioner can fine companies up to £500,000, and businesses must be shuddering at the thought of the €100m / 5% of annual worldwide turnover fines proposed under the latest draft of the EU Data Protection Regulation. But that is just the fines themselves; what about all the other stuff? The reality is that there are all sorts of other expenses, such as outlays for detection of breaches, escalation, notification, after-event mitigation, containment and response, not to mention legal and other professional fees.
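To put the proposed cap into context, here is a back-of-the-envelope sketch. It assumes the commonly reported formulation in the draft – the greater of €100m and 5% of annual worldwide turnover – and uses an entirely hypothetical turnover figure.

```python
def proposed_max_fine_eur(annual_worldwide_turnover_eur):
    """Sketch of the fine cap reportedly proposed in the draft Regulation:
    the greater of EUR 100m and 5% of annual worldwide turnover (AWWT)."""
    return max(100_000_000, 0.05 * annual_worldwide_turnover_eur)


# Hypothetical example: a business with EUR 20bn AWWT would face a cap of EUR 1bn.
print(proposed_max_fine_eur(20_000_000_000))  # 1000000000.0
```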

But with all that in mind, let’s also go back to Neelie Kroes’ “first trend” and think about all the benefits the digital world can offer.  Let’s make sure we can reap those benefits by building effective cyber defences into our business strategies.  That’s going to involve some investment, but it’s also going to provide a level of protection against many of the significant costs associated with a security incident.  And perhaps most importantly of all, demonstrating you are ahead of the game will help build trust; a vital commodity in today’s digital world.

If I’m not mistaken, that’s a breach….

Posted on November 4th, 2013 by



Last year the UK Information Commissioner (ICO) issued 25 fines (22 of which were for data security breaches). This year the ICO has issued 16 fines so far. We’ll have to see what happens in the next two months, but my guess is we’ll be seeing a fair few more fines in the run-up to Christmas.

In a recent blog post on the ICO’s website, we are told that the local government sector has received fines totalling more than £2 million since the ICO’s fining powers began in 2010. That’s a staggering amount of money which ultimately is paid out of the public purse (presumably to the detriment of the public services it was there to support).

We are also told in the blog that “all these breaches” could have been prevented if the Data Protection Act had been correctly complied with.  I’m not sure I entirely agree with that statement; can total compliance really eliminate all risk of incidents occurring?

While it is true that organisations should implement rigorous data protection and information governance frameworks to help safeguard the data they handle (think “technical and organisational measures” required by the DPA), surely no amount of policies, guidance or training is going to prevent an accidental slip-up from occurring.  The unfortunate reality is that we humble human beings do make the occasional mistake.  We all know – or can imagine – how easy it is to misdial a number or click on ‘send’ and inadvertently send something to the wrong person.   Indeed, our 2012 ICO Enforcement Tracker (please get in touch for a copy) revealed that of all the fines issued by ICO last year the overwhelming majority were for breaches involving misdirected communications.

So practically speaking, what is the answer? 

Well, the best solution is surely to assess and manage the risks in the hope that you can ensure no harm or damage is suffered in the event that an incident occurs. The best things you and your organisation can do are to: (i) sit up and pay a bit of attention to the types of data you handle; (ii) get fully up to speed with what your legal obligations are in relation to it; and (iii) implement a robust system to demonstrate not only that you are doing everything possible to avoid a breach occurring in the first place, but also so that you can be confident you have a proper action plan in place to manage an incident if and when it arises.

It also goes without saying that we can learn an awful lot from the mistakes that others have already made; we know that the “hot spots” for regulatory action include things like misdirected communications, lack of policies and training, and the failure to encrypt portable media that contains personal information.  Organisations should exploit that knowledge and use it to build better and more effective breach management strategies.

The Commission combats the EU Data Residency rumours

Posted on October 21st, 2013 by



Last week, the European Commission published a memo entitled ‘What does the Commission mean by secure Cloud computing services in Europe?‘. The memo stems from the Commission’s 2012 strategy ‘Unleashing the Potential of Cloud Computing in Europe‘ and addresses the growing concerns about the implications for the European cloud computing market following the PRISM revelations. It also provides insight into the hot topic of whether the Commission will introduce requirements for cloud providers to keep EU citizens’ data within European borders.

The Commission has made it clear that its vision is for Europe to become the global leader in the cloud computing market, particularly in relation to data protection and security. One of the Commission’s aims is to align the cloud market with the proposals contained in the EU data protection regulation by establishing a single market for cloud computing. The Commission also strongly opposes the ‘Fortress Europe‘ approach to cloud computing and stresses the need for a uniform approach, since undertaking separate national or regional initiatives threatens to fragment the market and weaken the EU’s strength in this area. The Commission’s memo also reiterates that ‘the fundamental principle at stake is the need to look beyond borders when it comes to cloud computing‘ – meaning that although it aims to promote a European single market for cloud services, its intention is not to require providers to host EU citizens’ data in Europe but to work across borders. It seems cloud providers who feared unachievable plans to keep data within Europe can now breathe a sigh of relief.

As well as confirming its stance on EU data residency, the Commission’s memo recognises the increased importance of encouraging smaller European businesses and consumers to use the cloud with the aim of increasing productivity. Although Europe is not yet recognised as a leader in this area, it is hoped that the Commission will be able to leverage the EU’s reputation for ‘relatively high standards of data protection, security, interoperability and transparency about service levels and government access to information‘ to help increase the use of the cloud within and outside of Europe. As a way of tackling the slow adoption of the cloud in Europe, the Commission plans to encourage EU-wide voluntary certification schemes to increase transparency and security in the cloud. In other words, the Commission is looking to pro-competitive measures to help promote the European cloud market, rather than trying to ‘force’ European cloud development through onerous rule-making.

How achievable the Commission’s plans are to establish Europe as the world’s leading trusted cloud region will inevitably be impacted by the implementation of the EU data protection regulation (with the LIBE Committee’s vote on its amendment proposals taking place today – see here). But at least, for now, cloud providers have some much-needed comfort that the Commission has no plans to force them to start building additional data centres in the EU anytime soon.    

Belgian DPA overhauls enforcement strategy

Posted on October 21st, 2013 by



Belgium has long been one of the low-risk EU Member States in terms of data protection enforcement. Aside from the fact that pragmatism can be considered part of a Belgian’s nature, this view was also due to the fact that the Belgian DPA, the Privacy Commission, could be termed one of those so-called ‘toothless tigers’.

As De Standaard reports, it seems this is now about to change, with the Privacy Commission set to follow the example of the Dutch DPA by adopting a more severe enforcement strategy.

Until now, the Privacy Commission did not pro-actively investigate companies or sectors, despite the fact that the Belgian Privacy Act grants it such powers. However, the Privacy Commission has recently decided to establish a team of inspectors who will actively search for companies that process personal data in a non-compliant manner. It seems the Privacy Commission is finally adopting an approach which the CNIL has been applying for a number of years, with the idea being that each year a specific sector would be subject to increased scrutiny.

In addition, anticipating the adoption of the Regulation, the Privacy Commission has called upon the Belgian legislator to grant it more robust enforcement powers. Currently, if a company is found to be in breach of the Belgian data protection laws, the Privacy Commission has a duty to inform the public prosecutor. However, in practice criminal prosecution for data protection non-compliance is virtually non-existent, which leads to de facto impunity. This could drastically change if greater enforcement powers are granted to the Privacy Commission.

In the wake of the coming Regulation, this new enforcement strategy does not come as a surprise. In addition, earlier this year, Belgium faced a number of high-profile, widely reported data breach cases for the first time: the Ministry of Defence, the Belgian railway company and the recruitment agency Jobat all suffered massive data leaks. More recently, the massive hacking of Belgacom’s affiliate BICS gave rise to a lot of controversy. It would appear that these cases highlighted to the Privacy Commission the limits of its current powers.

However, if even a pragmatic DPA such as the Privacy Commission starts adopting a more repressive enforcement strategy, it is clear that the days of complacency are fading. Organisations processing personal data really cannot afford to wait until the Regulation becomes effective in the next few years. They will have to make sure they have done their homework immediately, as it seems the DPAs won’t wait until the Regulation becomes effective to show their teeth.