Archive for January, 2013

Killing the Internet

Posted on January 25th, 2013

The beginning of 2013 could not have been more dramatic for the future of European data protection.  After months of deliberations, veiled announcements and guarded statements, the rapporteur of the European Parliament’s committee responsible for taking forward the ongoing legislative reform has revealed his position loudly and clearly.  Jan Albrecht’s proposal is by no means the final say of the Parliament, but it is an indication of where an MEP who has thought long and hard about what the new data protection law should look like stands.  The reactions have been equally loud.  The European Commission has calmly welcomed the proposal, whilst some Member States’ governments have expressed serious concerns about its potential impact on the information economy.  Amongst the stakeholders, opinions vary quite considerably – Albrecht’s approach is praised by regulators, whilst industry leaders have massive misgivings about it.  So who is right?  Is this proposal the only possible way of truly protecting our personal information, or have the bolts been tightened too much?

There is nothing more appropriate than a dispassionate legal analysis of some key elements of Albrecht’s proposal to reveal the truth: if the current proposal were to become law today, many of the most popular and successful Internet services we use daily would automatically become unlawful.  In other words, there are some provisions in Albrecht’s draft proposal that, when combined, would not only cripple the Internet as we know it, but would also stall one of the most promising building blocks of our economic prosperity: the management and exploitation of personal information.  Sensationalist?  Consider this:

*     Traditionally, European data protection law has required that, in order to collect and use personal data at all, one must meet a lawful ground for processing.  The European Commission had intended to carry on with this tradition while ensuring that the so-called ‘legitimate interests’ ground, which permits data uses that do not compromise the fundamental rights and freedoms of individuals, remained available.  Albrecht proposes to replace this balancing exercise with a list of what qualifies as a legitimate interest and a list of what does not.  The combination of the two lists has the effect of ruling out any data uses which involve either data analytics or simply the processing of large amounts of personal data, so the obvious outcome is that the ‘legitimate interests’ ground can no longer be applied to common data collection activities on the Internet.

*     Albrecht’s aim of relegating reliance on the ‘legitimate interests’ ground to very residual cases stems from the fact that he sees individuals’ consent as the primary basis for all data uses.  However, the manner and circumstances under which consent may be obtained are strictly limited.  Consent is not valid if the party obtaining it is in a dominant market position.  Nor is consent for the use of data valid if it is presented as a condition of the terms of a contract and the data is not strictly necessary for the provision of the relevant service.  All of that means that if a service is offered to the consumer for free – like many of the most valuable things on the Internet – but its provider relies on the value of the information generated by users to operate as a business, there will be no lawful way for that information to be used.

*     To finish things off, Albrecht delivers a killing blow through the concept of ‘profiling’.  Defined as automated processing aimed at analysing things like preferences and behaviour, profiling covers what has become the pillar of e-commerce and is set to change the commercial practices of every single consumer-facing business going forward.  Under Albrecht’s proposal, however, such practices are banned by default and only permissible with the consent of the individual, which, as shown above, is pretty much mission impossible.

The collective effect of these provisions is truly devastating.  This is not an exaggeration.  It is the outcome of a simple legal analysis of a proposal deliberately aimed at restricting activities seen as a risk to people.  The decision that needs to be made now is whether such a risk is real or perceived and, in any event, sufficiently great to merit curtailing the development of the most sophisticated and widely used means of communication ever invented. 

 
This article was first published in Data Protection Law & Policy in January 2013.

UK Government’s take on the Regulation: Much to negotiate about

Posted on January 15th, 2013

Back in November 2012, we reported on the UK Justice Committee’s opinion on the European Commission’s proposals to reform the data protection legal framework. It was pretty clear from that opinion that the Justice Committee had significant reservations about the proposed Regulation. Now the UK Government (through the Ministry of Justice) has issued its response to the Committee’s opinion.

The response picks up on the conclusions set out in the Committee’s reports and provides the UK Government’s view. Overwhelmingly, the Government shares the concerns of the Committee. For instance, the Government argues that the proposed Regulation should be re-cast as a Directive, which would provide greater flexibility for Member States where necessary. While supporting the aspiration of harmonisation and new principles in the draft Regulation, such as the consistency mechanism, the Government states that data protection law should ‘secure individuals’ privacy without placing constraints on business practices that harm innovation and growth’.

The Government also has serious concerns about the potential economic consequences of the proposed Regulation and, given the additional administrative and compliance measures it introduces, urges that a full assessment of the impact of the draft Regulation be carried out. In that vein, the Government agrees with the Information Commissioner’s assessment that the system set out in the draft Regulation won’t work. It actively encourages interested parties to use the Government’s Impact Assessment to analyse the impact of the Regulation themselves and to provide any feedback to the Ministry of Justice.

Elsewhere, the Government shares the Committee’s concerns around the right to be forgotten and the need for data protection authorities to have discretion when issuing sanctions, but disagrees with the Committee on fees for subject access requests, arguing that organisations should continue to be able to charge a small fee.

Overall, the Government emphasises the need for a risk-based data protection legislative model that moves away from the over-prescription in the Regulation and delivers a more proportionate and balanced approach. It stresses that the data protection framework should focus on regulating outcomes, not processes.

This response suggests that the UK Government is gearing up to take a tough negotiating stance on the proposed changes to the data protection legal framework. However, in view of the recent publication from the European Parliament’s rapporteur Jan Philipp Albrecht, whose proposed changes to the draft Data Protection Regulation are ‘stricter, thicker and tougher’, negotiating changes to the proposed framework in line with the UK Government’s preferred position is likely to be hard work.

European Parliament’s take on the Regulation: Stricter, thicker and tougher

Posted on January 9th, 2013

If anyone thought that the European Commission’s draft Data Protection Regulation was prescriptive and ambitious, then prepare yourselves for the European Parliament’s approach. The much-awaited draft report by the LIBE Committee with its revised proposal (as prepared by its rapporteur Jan Philipp Albrecht) has now been made available, and what was already a very complex piece of draft legislation has become by far the strictest, most wide-ranging and potentially most difficult-to-navigate data protection law ever to be proposed.

This is by no means the end of the legislative process, but here are some of the highlights of the European Parliament’s proposal currently on the table:

*     The territorial scope of application to non-EU-based controllers has been expanded, in order to catch those collecting data of EU residents with the aim of (a) offering goods or services (even if they are free) or (b) monitoring those individuals (not just their behaviour).

*     The concept of ‘personal data’ has also been expanded to cover information relating to someone who can be singled out (not just identified).

*     The Parliament has chosen to give an even bigger role to ‘consent’ (which must still be explicit), since this is regarded as the best way for individuals to control the uses made of their data. In turn, relying on the so-called ‘legitimate interests’ ground to process personal data has become much more onerous, as controllers must then inform individuals about such specific processing and the reasons why those legitimate interests override the interests or fundamental rights and freedoms of the individual.

*     Individuals’ rights have been massively strengthened across the board. For example, the right of access has been expanded by adding to it a ‘right to data portability’ and the controversial ‘right to be forgotten’ potentially goes even further than originally drafted, whilst profiling activities are severely restricted.

*     All of the so-called ‘accountability’ measures imposed on data controllers are either maintained or reinforced. For example, the obligation to appoint a data protection officer will kick in when personal data relating to 500 or more individuals is processed per year, and new principles such as data protection by design and by default are now set to apply to data processors as well.

*     The ‘one stop shop’ concept that made a single authority competent in respect of a controller operating across Member States has been considerably diluted, as the lead authority is now restricted to just acting as a single contact point.

*     Many of the areas that had been left for the Commission to deal with via ‘delegated acts’ are now either specifically covered by the Regulation itself (hence becoming more detailed and prescriptive) or left for the proposed European Data Protection Board to specify, therefore indirectly giving a legislative power to the national data protection authorities.

*     An area of surprising dogmatism is international data transfers, where the Parliament has added further conditions to the criteria for adequacy findings, placed a time limit of two years on previously granted adequacy decisions or authorisations for specific transfers (it’s not clear what happens afterwards – is Safe Harbor at risk?), slightly reinforced the criteria for BCR authorisations, and limited transfers to non-EU public authorities and courts.

*     Finally, with regard to monetary fines, whilst the Parliament gives data protection authorities more discretion to impose sanctions, more instances of possible breaches have been added to the most severe categories of fines.

All in all, the LIBE Committee’s draft proposal represents a significant toughening of the Commission’s draft (which was already significantly tougher than the existing Data Protection Directive). Once it is agreed by the Parliament, heated negotiations with the Council of the EU and other stakeholders (including the Commission itself) will then follow, and we have just over a year to get the balance right. Much work no doubt awaits.

 

2013 to be the year of mobile regulation?

Posted on January 4th, 2013

After a jolly festive period (considerably warmer, I’m led to understand, for me in Palo Alto than for my colleagues in the UK), the New Year is upon us and privacy professionals everywhere will no doubt be turning their minds to what 2013 has in store for them.  Certainly, there are plenty of developments to keep abreast of, ranging from the ongoing EU regulatory reform process through to the recent formal recognition of Binding Corporate Rules for processors.  My partner, Eduardo Ustaran, has posted an excellent blog outlining his predictions here.

But one safe bet for greater regulatory attention this year is mobile apps and platforms.  Indeed, with all the excitement surrounding cookie consent and EU regulatory reform, mobile has remained largely overlooked by EU data protection authorities to date.  Sure, we’ve had the Article 29 Working Party opine on geolocation services and on facial recognition in mobile services.  The Norwegian Data Protection Inspectorate even published a report on mobile apps in 2011 (“What does your app know about you?“).  But really, that’s been about it.  Pretty uninspiring, not to mention surprising, when consumers are fast abandoning their creaky old desktop machines and accessing online services through shiny new smartphones and tablets: Forbes even reports that mobile access now accounts for 43% of total minutes spent on Facebook by its users.

Migration from traditional computing platforms to mobile computing is not, in and of itself, enough to guarantee regulator interest.  But there are plenty of other reasons to believe that mobile apps and platforms will come under increased scrutiny this year:

1.  First, meaningful regulatory guidance is long overdue.  Mobiles are inherently more privacy-invasive than any other computing platform.  We entrust more data to our mobile devices (in my case, my photos, address books, social networking, banking and shopping account details, geolocation patterns, and private correspondence) than to any other platform, and generally with far less security – that 4-digit PIN really doesn’t pass muster.  We download apps from third parties we’ve often scarcely heard of, with no idea as to what information they’re going to collect or how they’re going to use it, and grant them all manner of permissions without even thinking – why, exactly, does that flashlight app need to know details of my real-time location?  Yet despite the huge potential for privacy invasion, there persists a broad lack of understanding as to who is accountable for compliance failures (the app store, the platform provider, the network provider or the app developer) and what measures they should be implementing to avoid privacy breaches in the first place.  Such uncertainty and confusion make regulatory involvement inevitable.

2.  Second, regulators are already beginning to get active in the mobile space – were this not the case, the point above would be pure speculation.  It’s not, though.  On my side of the Pond, we’ve recently seen the California Attorney General file suit against Delta Air Lines for its failure to include a privacy policy within its mobile app (this action itself following letters sent by the AG to multiple app providers warning them to get their acts together).  Then, a few days later, the FTC released a report on children’s data collection through mobile apps, in which it indicated that it was launching multiple investigations into potential violations of the Children’s Online Privacy Protection Act (COPPA) and the FTC Act’s unfair and deceptive practices regime.  The writing is on the wall, and it’s likely EU regulators will begin following the FTC’s lead.

3.  Third, the Article 29 Working Party intends to do just that.  In a press release in October, the Working Party announced that “Considering the rapid increase in the use of smartphones, the amount of downloaded apps worldwide and the existence of many small-sized app-developers, the Working Party… [will] publish guidance on mobile apps… early next year.” So guidance is coming and, bearing in mind that the Article 29 Working Party is made up of representatives from national EU data protection authorities, it’s safe to say that mobile privacy is riding high on the EU regulatory agenda.

In 2010, the Wall Street Journal reported: “An examination of 101 popular smartphone “apps”—games and other software applications for iPhone and Android phones—showed that 56 transmitted the phone’s unique device ID to other companies without users’ awareness or consent. Forty-seven apps transmitted the phone’s location in some way. Five sent age, gender and other personal details to outsiders… Many apps don’t offer even a basic form of consumer protection: written privacy policies. Forty-five of the 101 apps didn’t provide privacy policies on their websites or inside the apps at the time of testing.”  Since then, there hasn’t been a great deal of improvement.  My money’s on 2013 being the year that this will change.