Technology issues that will shape privacy in 2013

Posted on December 13th, 2012

Making predictions as we approach a new year has become a bit of a tradition.  The degree of error is typically proportional to the level of boldness of those predictions, but as in the early days of weather forecasting, the accuracy expectations attached to big statements about what may or may not happen in today’s uncertain world are pretty low.  Having said that, it wouldn’t be particularly risky to assume that during 2013, the EU legislative bodies will be thinking hard about things like whether the current definition of personal data is wide enough, what kind of security breach should trigger a public disclosure, the right amount for monetary fines or the scope of the European Commission’s power to adopt ‘delegated acts’.  But whilst it is easy to get distracted by the fascinating data protection legislative developments currently taking place in the EU, next year’s key privacy developments will be significantly shaped by the equally fascinating technological revolution of our time.

A so far low profile issue from a regulatory perspective has been the ever growing mobile app phenomenon.  Like having a website in the late 90s, launching a mobile app has become a ‘must do’ for any self-respecting consumer-facing business.  However, even the simplest app is likely to be many times more sophisticated than the early websites and will collect much more useful and clever data about its users and their lifestyles.  That is a fact and, on the whole, apps are a very beneficial technological development for the 21st century homo-mobile.  The key issue is how this development can be reconciled with the current data protection rules dealing with information provision, grounds for processing and data proportionality.  Until now, technology has as usual led the way and the law is clumsily trying to follow, but in the next few months we are likely to witness much more legal activity on this front than what we have seen to date.

Mobile data collection via apps has been a focus of attention in the USA for a while, but recent developments are a clue to what is about to happen.  The spark may well have been ignited by the California Attorney General who, in the first ever legal action under the state’s online privacy law, is suing Delta Air Lines for distributing a mobile application without a privacy policy.  Delta had reportedly been operating its mobile app without a privacy policy since at least 2010 and did not manage to post one after being ordered by the authorities to do so.  On a similar although slightly more alarming note, children’s mobile game company Mobbles is being accused by the Center for Digital Democracy of violating COPPA, which establishes strict parental consent rules affecting the collection of children’s data.  These are unlikely to be isolated incidents given that app operators tend to collect more data than is necessary to run the app.  In fact, these cases are almost certainly the start of a trend that will extend to Europe in 2013 and lead EU data protection authorities and mobile app developers to lock horns on how to achieve a decent degree of compliance in this environment.
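The parental-consent mechanics at the heart of the Mobbles complaint can be illustrated with a minimal sketch (the function and field names below are hypothetical, not drawn from any real compliance library): under COPPA, an app should not collect personal data from a user under 13 until verifiable parental consent has been recorded.

```python
from dataclasses import dataclass, field

COPPA_AGE_THRESHOLD = 13  # COPPA applies to children under 13


@dataclass
class UserSession:
    age: int
    parental_consent: bool = False
    collected: dict = field(default_factory=dict)


def may_collect_personal_data(session: UserSession) -> bool:
    """Collection from a child is only permissible with parental consent."""
    if session.age >= COPPA_AGE_THRESHOLD:
        return True
    return session.parental_consent


def collect(session: UserSession, key: str, value: str) -> bool:
    """Record a data point only when collection is permissible."""
    if not may_collect_personal_data(session):
        return False
    session.collected[key] = value
    return True
```

Real COPPA compliance involves far more than a gate like this (notice, deletion rights, limits on conditioning participation on data collection), but the sketch captures the basic rule the complaint alleges was broken.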

Speaking of locking horns, next year (possibly quite early on) we will see the first instances of enforcement of the cookie consent requirement.  What is likely to be big about this is not so much the amount of the fines or the volume of enforcement actions, but the fact that we will see for real what the regulators’ compliance expectations actually are.  Will ‘implied consent’ become the norm or will websites suddenly rush to present their users with hard opt-in mechanisms before placing cookies on their devices?  Much would need to change for the latter to prevail but at the same time, the ‘wait and see’ attitude that has ruled to date will be over soon, as the bar will be set and the decision to comply or not will be based purely on risk – an unfortunate position to be in, caused by an ill-drafted law.  Let that be a lesson for the future.
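The practical gap between ‘implied consent’ and a hard opt-in can be made concrete with a small sketch (hypothetical names; a real implementation would sit in the site’s web framework): under a strict opt-in reading, no non-essential cookie is written until the user affirmatively accepts, whereas under an implied-consent reading, simply continuing to browse is enough.

```python
from enum import Enum


class ConsentState(Enum):
    UNKNOWN = "unknown"    # user has not interacted with the banner
    ACCEPTED = "accepted"  # explicit opt-in
    DECLINED = "declined"


# Strictly necessary cookies are exempt from the consent requirement
STRICTLY_NECESSARY = {"session_id"}


def cookies_to_set(requested: dict, consent: ConsentState,
                   implied_consent_allowed: bool = False) -> dict:
    """Return only the cookies the current consent state permits.

    Under a hard opt-in regime, non-essential cookies are dropped unless
    consent is ACCEPTED.  Under an implied-consent reading, continuing to
    browse (state UNKNOWN) is treated as consent.
    """
    if consent == ConsentState.ACCEPTED:
        return dict(requested)
    if consent == ConsentState.UNKNOWN and implied_consent_allowed:
        return dict(requested)
    return {k: v for k, v in requested.items() if k in STRICTLY_NECESSARY}
```

The single boolean flag is doing all the regulatory work here, which is precisely why the first enforcement actions matter: they will decide which branch of this logic the law actually requires.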

The other big technological phenomenon that will impact on privacy and security practices – probably in a positive way – will be the cloud.  Much has been written on the data protection implications of cloud computing in the past months.  Regulators have given detailed advice.  Policy makers have made grand statements.  But the real action will be seen in 2013, when a number of leaders in the field start rolling out Binding Safe Processor Rules programmes and regulators are faced with the prospect of scrutinising global cloud vendors’ data protection offerings.  Let us hope that we can use this opportunity to listen to each other’s concerns, agree a commercially realistic set of standards and get the balance right.  That would be a massive achievement.


This article was first published in Data Protection Law & Policy in December 2012.

Privacy’s greatest threat and how to overcome it

Posted on October 22nd, 2012

After some erroneous newspaper reports in 1897 that he had passed away, Mark Twain famously said that the reports of his death were greatly exaggerated.  The same might also be said of privacy.  Scott G. McNealy, former CEO of Sun Microsystems, reportedly once said: “You already have zero privacy. Get over it.”  However, if last week’s IAPP Privacy Academy in San Jose was anything to go by, privacy is very much alive and kicking.

It’s easy to understand why concerns about the death of privacy arise though.  Today’s data generation, processing and exploitation are simply vast – way beyond a level any of us could meaningfully hope to comprehend or, dare I suggest, control.  The real danger to privacy though is not the scale of data processing that goes on – that’s simply a reality of living in a modern day, technology-enabled, society; a Pandora’s box that, once opened, cannot be closed.  Instead, the real danger to privacy is excessive and unrealistic regulation.

Better regulation drives better compliance

From many years of working in privacy, it’s been my experience that most businesses work hard to be compliant.  Naturally, there are outliers, but these few cases should not drive the regulation that determines how the majority conduct their business.  It’s also been my experience that compliance is most often achieved where the standards applied by legislators and regulators are accurate, proportionate and not excessive – the same standards they expect data controllers to apply when processing personal data.  In other words, legislation and regulation drive the best behaviour when compliance is achievable.

By contrast, excessive, disproportionate regulation that does not accurately reflect the way that technology works or recognise the societal benefits that data processing can deliver often brings about the opposite effect.  By making compliance impossible, or at least, disproportionately burdensome to achieve, businesses, unsurprisingly, often find themselves falling short of expected regulatory standards – in many cases, wholly unintentionally.

The recent “cookie law” is a good example of this: a law that, though well-intentioned, is effectively seen as regulating a technology (cookies) rather than a purpose (tracking), leading to widespread confusion about the standards that apply and – let’s be honest – non-compliance currently on an unprecedented scale throughout the EU.

Why the Regulation mustn’t make the same mistake

In its current form, the proposed General Data Protection Regulation also runs this risk.  The reform of Europe’s data protection laws is a golden, once-in-a-generation opportunity to re-visit how we do privacy and build a better, more robust framework that fosters new technologies and business innovation, while still protecting against unwarranted privacy intrusions and harm.

But instead of focussing on the “what”, the legislation focuses too much on the “how”: rather than looking to the outputs we should strive to achieve (namely, ensuring that ever-evolving technologies do not make unwarranted intrusions into our private lives) the draft legislation instead mandates excessive accountability standards that do not take proper account of context or actual likelihood of harm.

For example:

*  How, exactly, does an online business ensure that its processing of child data is predicated only on parental or guardian consent (Article 8)?  My prediction: many websites will build meaningless terms into their website privacy policies that children must not use the site – delivering no “real” protection in practice.

*  Why is it necessary for an organisation transferring data internationally to inform individuals “on the level of protection afforded by that third country … by reference to an adequacy decision of the Commission” (Article 14)? Do data subjects really care where their data goes and whether the Commission has made an adequacy decision – or do they just want assurance that their data will be used for legitimate purposes and at all times kept safe and secure, wherever it is?  How does this work in a technology environment that is increasingly shifting to the cloud?

*  Why should controllers be required to provide data portability to data subjects in an “electronic and structured format which is commonly used” (Article 18)?  Surely confidentiality and data security are best achieved through the use of proprietary systems whose technology is not “commonly used”, and therefore less understood and less vulnerable to external attack?  Are we legislating for a future of security weakness?

*  Why should data controllers and processors maintain such extensive levels of data processing documentation (Article 28)?  How will smaller businesses cope with this burden?  Yes, an exemption applies for businesses employing fewer than 250 persons, but only if their data processing is “ancillary” to the main business activities – immediately ruling out most technology start-ups.

*  And how can we still, in this day and age, operate on a misguided assumption that model contracts provide a sound basis for protecting international exports of data (Article 42)?  Wouldn’t it make more sense to require controllers to make their own adequacy assessment and to hold them to account if they fall short of the mark?
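On the Article 18 point above, it is worth noting that an “electronic and structured format which is commonly used” is, in purely technical terms, a light requirement that does not conflict with proprietary internal systems.  A minimal sketch (hypothetical record shape, with JSON standing in as the “commonly used” format) shows how a controller can store data however it likes and still serialise a portable export on request:

```python
import json


def export_subject_data(internal_record: dict) -> str:
    """Serialise an internal user record into a portable, commonly used
    format (JSON), independent of how the controller stores it."""
    portable = {
        "subject": internal_record.get("name"),
        "email": internal_record.get("email"),
        "preferences": internal_record.get("prefs", {}),
    }
    return json.dumps(portable, indent=2, sort_keys=True)
```

The security concern in the bullet therefore attaches to the export channel and to authenticating the requester, not to the storage technology itself – the internal system can remain as proprietary as the controller wishes.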

Make your voice heard!

For the past 17 years, the European Union has been a standard-bearer in operating an effective legal and regulatory framework for privacy.  That framework is now showing its age and, if not reformed in a way that understands, respects and addresses the range of different (and competing) stakeholder interests, risks being ruinous to the privacy advancements Europe has achieved to date.

The good news is that reforming an entire European legal framework doesn’t happen overnight, and the process through to approval and adoption of the General Data Protection Regulation is a long one.  While formal consultation periods are now closed, there remain many opportunities to get involved in reform discussions through legislative and regulatory liaisons at both a European and national level.

To make their voices heard, businesses throughout the data processing spectrum must seize this opportunity to get involved.  Only through informed dialogue with stakeholders can Europe hope to output technology-neutral, proportionate legislation that delivers meaningful data protection in practice.  If it does this, then Europe stands the best chance of remaining a standard-bearer for privacy for the next 17 years too.