Posts Tagged ‘Enforcement’

CNIL Enforces Cookies Rules in France

Posted on July 2nd, 2015 by

On 30th June 2015, the CNIL issued a press release stating that, following its online cookies audits conducted last October (see our previous blog article), it has sent out a formal letter of enforcement (“lettre de mise en demeure”) to approximately 20 companies requesting that they comply with the cookie rules in France. Under French law, a letter of enforcement does not constitute a sanction, although the delivery of such a letter is a required first step before the CNIL can pronounce an administrative sanction against a company (unless the CNIL chooses to pronounce a simple warning).

Enforcement letters are delivered by the CNIL after inspecting a company’s data processing activities, for example, by conducting on-site inspections or online audits (click here for more information about the CNIL’s enforcement powers). The Chairwoman of the CNIL may decide to make public the formal notice to comply that is served against the data controller. In the past, the CNIL has used this measure to name and shame companies that either committed serious violations of the Data Protection Act or acted in bad faith.

The formal notice to comply must state: 1) the provisions of the Data Protection Act that the data controller has failed to comply with and 2) the period of time within which the data controller must cease such failure(s). This period may not be less than ten (10) days (except in urgent cases) and must not exceed three (3) months. If the company complies within the given period, the case is closed by decision of the Chairwoman of the CNIL. Conversely, if the data controller does not comply with the notice served, the CNIL may pronounce, after due hearing of the parties, either a fine of up to EUR 150,000 (or, in the event of a second breach within five years, EUR 300,000 or 5% of the company’s gross revenue for legal entities) or an injunction to cease the processing.

In this context, the CNIL carried out 24 on-site inspections, 27 online inspections and 2 hearings specifically targeting how companies comply with the cookie requirements in France. The outcome of these inspections indicates that, in general, websites do not sufficiently inform their users about the use of cookies (or other tracking technologies) and do not obtain their prior consent before setting cookies. While many companies have posted a cookie banner on their websites to inform users about the use of cookies, all the websites inspected by the CNIL appear to install cookies on the user’s computer or device before obtaining the user’s consent. In France, implied consent is recognised as a valid form of consent. In practice, this means that the user must actively continue to navigate the website (e.g., by clicking on a web link, an image or a search button) after having been served the cookie banner. If the user closes the web page without navigating the website, no cookies may be set.
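The implied-consent rule described above can be sketched as a simple gating check. This is purely an illustrative model (the function and event names are assumptions, not from any real library or the CNIL's own guidance): cookies may only be set once the banner has been displayed and the user has then actively continued navigating.

```python
# Hypothetical sketch of the implied-consent logic for cookies in France:
# cookies may be set only after the banner is shown AND the user performs
# an active navigation event. All names here are illustrative.

NAVIGATION_EVENTS = {"click_link", "click_image", "click_search"}

def may_set_cookies(banner_shown, user_events):
    """Return True only if the banner was displayed and the user then
    actively navigated (implied consent). Closing the page without
    navigating never counts as consent."""
    if not banner_shown:
        return False
    return any(event in NAVIGATION_EVENTS for event in user_events)

print(may_set_cookies(True, ["close_page"]))   # False - no cookies allowed
print(may_set_cookies(True, ["click_link"]))   # True - implied consent given
print(may_set_cookies(False, ["click_link"]))  # False - banner never shown
```

The point of the sketch is the ordering: consent is inferred from an action taken after the banner is displayed, never from the mere presence of the banner.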

Furthermore, the CNIL noticed that many websites invite their users to opt out of cookies by changing their web browser settings. However, the CNIL considers that browser settings are not a valid opt-out mechanism because, in their current form, they only apply to HTTP cookies and do not enable users to activate or deactivate other tracking technologies such as pixels, Flash cookies or device fingerprinting.

Since the entry into force of the CNIL’s new online audit powers, the CNIL has been more active in enforcing the cookie rules in France. Cookie compliance remains a high priority on the CNIL’s enforcement agenda, and companies should therefore make sure that they comply with the cookie requirements in France and the rest of Europe.

For more information regarding cookie compliance requirements in Europe, you may download Fieldfisher’s Cookie Consent Table from our website, as well as the Whitepaper on “EU cookie audits: are you compliant?”, which was co-authored by Fieldfisher and TRUSTe.


By Olivier Proust, Of Counsel

CNIL unveils its 2015 inspections plan. Are you ready for what’s coming?

Posted on May 26th, 2015 by

In 2014, I warned about the French data protection authority (“CNIL”) being a regulator to watch. One year down the road, CNIL has not failed to deliver. A few weeks ago, CNIL released its Annual Activity Report for 2014 revealing that in the past year it had conducted 421 inspections (including 58 online audits), issued 62 enforcement notices and pronounced 18 sanctions. As the current chair of the Article 29 Working Party, CNIL continues to play an active role on the European and international scene on topics such as the General Data Protection Regulation, the on-going discussions between the US and EU on Safe Harbor and the recent online sweeps organized by GPEN.

What are the CNIL’s top priorities?

The CNIL intends to conduct 550 inspections divided between 350 on-site or off-site inspections and 200 online audits. Specifically, CNIL will prioritize its actions in the following key sectors:

  • Ecommerce: following its guidance on the processing of bank card details, CNIL will now focus on contactless payment cards (i.e., bank cards that have an integrated chip and enable cardholders to make wireless payments via “near field communication” or “NFC” technology). In particular, CNIL will verify whether adequate security measures are designed around the use of such cards and whether the financial institutions that offer these types of cards inform their customers and enable them to object to using these cards (e.g., by deactivating the integrated chip or by ordering a traditional card that is not compatible with NFC technology). CNIL is also preparing for the next evolution: entirely digitalized payments by smartphone.
  • Employee privacy in the workplace: Employee privacy continues to be high on the CNIL’s agenda due to the rising number of employees who file complaints with the CNIL each year. In particular, CNIL will inspect private and public organizations that have recently conducted surveys on psychosocial risks for employees.
  • mHealth: Following the Article 29 Working Party’s opinion on mobile apps and its letter to the European Commission on the meaning of “health data” in the context of mobile apps and devices (see our previous blog), CNIL will audit interconnected objects and online services in the area of health and well-being to verify (amongst other things) whether users are provided with notice and their consent is obtained.
  • Public sector: With the French Parliament currently debating a new law to broaden the online investigation powers of the French law enforcement and national security agencies, CNIL will continue to monitor the compliance of public sector databases with the Data Protection Act. This time, CNIL will focus on the National Register for Drivers’ Licenses (“Fichier National des Permis de Conduire”) held by the Ministry of Interior, which centralises all the data about registered drivers, including fines and traffic offences.
  • Public Wi-Fi connections: Another growing area receiving particular attention is publicly available Wi-Fi hotspots (such as those available in department stores, train stations or airports), which capture data transmitted by a user’s mobile phone (e.g., type of device, MAC address, location data) and are being used more frequently to track users, to send them advertisements or offers, or to analyse their behaviour.
  • Binding Corporate Rules: Last but not least, CNIL has announced its intention to begin enforcing against companies with BCR. Since their introduction in 2003, approximately 60 organizations have had their BCR approved, but so far no enforcement measures have been taken against companies with BCR. However, a few months ago, the lead DPAs across Europe started contacting organizations with a view to verifying and completing the information about their BCR that is posted on the European Commission’s website, thus implying that this grace period is over. CNIL could verify, for example, whether a BCR policy is easily accessible on the organization’s website and whether companies have implemented the internal measures that are required for BCR compliance.

What are the CNIL’s enforcement powers?

The CNIL can carry out four types of enforcement actions, namely:

  • On-site control: the CNIL may access the buildings and premises used to process personal data and inspect the data processing applications and databases;
  • Off-site control: the CNIL may organize a hearing in its offices and require the data controller or its data protection officer to provide explanations;
  • Long distance control: the CNIL may communicate with the data controller by postal mail or email and, for example, may conduct routine surveys; and
  • On-line inspections: CNIL may conduct on-line inspections of personal data that is available on websites or mobile apps.

What sanctions can the CNIL pronounce?

If the CNIL finds that a company has failed to comply with the Data Protection Act, it can either pronounce a warning or issue a formal notice to comply within a given deadline. If the controller fails to comply with the notice served, the CNIL may then pronounce a fine of up to EUR 150,000 (or, in the event of a second breach within five years, EUR 300,000 or 5% of the company’s gross revenue for legal entities) or an injunction to cease the processing.

Are you prepared for a CNIL inspection?

In recent years, I have helped many companies respond to CNIL inspections. Too often, companies are caught by surprise when the CNIL comes knocking on their door unannounced because they haven’t put in place any internal process for handling this kind of situation. As with any regulator, dealings with the CNIL require a minimum amount of awareness and preparation.

While a CNIL inspection does not necessarily end with the CNIL pronouncing a fine or sanction against the company, it inevitably has a disruptive effect on the company being investigated because it reveals the flaws that the company may have with regard to privacy compliance. Therefore, companies are in a better position if they tackle privacy issues at an early stage, rather than leaving them for later and risking having to fire-fight their way through a CNIL inspection.


By Olivier Proust, Of Counsel

Data protection authorities to audit websites and apps targeting minors

Posted on May 13th, 2015 by

On May 11th, 2015, several Data Protection Authorities (“DPAs”), including the French Data Protection Authority (“CNIL”), the UK’s Information Commissioner’s Office (“ICO”) and Canada’s Office of the Privacy Commissioner, issued a press release announcing an imminent Internet Sweep Day of websites and mobile apps specifically targeting minors.

When will the Internet sweep take place?

The Internet Sweep is due to take place from May 12 to 15 this year.

Who will carry out the sweep?

This Internet Sweep is an initiative of the Global Privacy Enforcement Network (“GPEN”) and will be conducted in a combined and coordinated manner by 29 data protection authorities around the world in 20 different countries. This Internet Sweep follows previous actions that were taken by GPEN last year against websites and mobile apps.

Each DPA taking part in the sweep will verify a number of websites available in its respective country. For example, both the CNIL and the ICO announced that they would each verify 50 websites and apps targeting a young audience.

Which websites and apps are being targeted?

The types of websites and mobile apps that will be audited are those that either specifically target minors and children, or are frequently used by individuals from this age group. For example, the CNIL announced that it would specifically target child-directed websites, such as gaming sites or mobile apps, social networking websites, educational websites and school tutoring websites.

What is the purpose of the Internet Sweep?

The purpose of the Internet Sweep is to assess whether such websites and mobile apps collect any personal data from children, and if so, what measures are put in place to protect their privacy.

The aim of the Internet Sweep is also to raise the public and businesses’ awareness on privacy-related issues concerning minors, to encourage compliance with existing privacy legislation, to identify concerns that may be addressed through targeted education or enforcement programs, and to enhance cooperation among the DPAs.

In particular, what will DPAs verify?

The DPAs will verify the types of personal data that are collected and whether the websites/ mobile apps:

– provide notice and explain the purposes for collecting personal data;

– contain privacy communications that are tailored to the age group at which they are directed (e.g., simple language, large print, audio and animation);

– raise their audience’s awareness to privacy-related issues;

– seek parental consent; and

– facilitate the deletion of personal data that is provided by children.

What is the expected outcome of the Internet sweep?

As with previous Internet Sweeps, the DPAs will use a common grid to analyse the results of the sweep. The DPAs are expected to release a report of this Internet Sweep in the fall of 2015, which will summarize their findings and provide a global overview of privacy practices on websites and mobile apps, as well as specific issues in some countries.

Furthermore, the information gathered during this Internet Sweep may be used by the DPAs to conduct enforcement actions in their respective jurisdictions. The manner in which a DPA verifies compliance and carries out enforcement measures against companies (such as reaching out to a company or conducting an on-site inspection) will vary depending on the enforcement powers of each DPA under national law.

What this tells us is that DPAs are better organized and better coordinated at an international level to inspect companies. Although GPEN has no enforcement powers at a global level, it is being used increasingly as a platform by DPAs from all parts of the world to communicate amongst themselves, share information about privacy practices and coordinate their enforcement actions at a national level. This also tells us that online activities continue to be the number one priority for DPAs in terms of privacy compliance as illustrated by the increasing number of Internet Sweeps in the last two years, whether they involved websites, mobile apps or cookies.

What should companies do to remediate the risk of enforcement actions?

Companies should not wait until they are being investigated to put their house in order. Some basic steps can be taken to make sure you comply with the privacy requirements:

  • Audit your websites/apps to find out what types of personal data are being collected;
  • Make sure the data is being processed for clearly defined and limited purposes;
  • Publish a clear, understandable and accessible privacy policy on your website/mobile app that is both tailored to the audience and to the device on which it is being read; and
  • Make sure you obtain prior consent from a parent or legal guardian where a website/app is specifically targeting a young audience.
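The last step in the checklist above, gating data collection on parental consent for young audiences, can be sketched as follows. This is an illustrative model only: the age threshold for parental consent is jurisdiction-specific, and 13 is used here purely as an assumption for the example, as are all function names.

```python
# Illustrative sketch of a parental-consent gate for a child-directed
# website or app. The age threshold of 13 is an assumption for the
# example; the applicable age varies by jurisdiction.

def needs_parental_consent(user_age, age_threshold=13):
    """True when the user is below the threshold, meaning a parent or
    legal guardian must consent before any personal data is collected."""
    return user_age < age_threshold

def may_collect_data(user_age, parental_consent):
    """Collection is allowed for users at or above the threshold, or for
    younger users only when a parent/guardian has consented."""
    return not needs_parental_consent(user_age) or parental_consent

print(may_collect_data(10, parental_consent=False))  # False
print(may_collect_data(10, parental_consent=True))   # True
print(may_collect_data(16, parental_consent=False))  # True
```

The design point is that consent is checked before collection starts, not retrofitted after data has already been gathered.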

By Olivier Proust, Of Counsel


For further information, the CNIL’s press release is available (in French) here.

The ICO’s press release is available here.

The Office of the Privacy Commissioner’s press release is available here.

Spam texts: “substantially distressing” or just annoying?

Posted on November 11th, 2014 by

The Department for Culture, Media and Sport (“DCMS”) recently launched a consultation to reduce or even remove the threshold of harm the Information Commissioner’s Office (“ICO”) needs to establish in order to fine nuisance callers, texters or emailers.


In 2010 ICO was given powers to issue Monetary Penalty Notices (“MPNs”, or fines to you and me) of up to £500,000 to companies that breach the Data Protection Act 1998 (“DPA”).  In 2011 these were extended to cover breaches of the Privacy and Electronic Communications Regulations 2003 (“PECR”), which sought to control the scourge of nuisance calls, texts and emails.

At present the standard ICO has to establish before issuing an MPN is a high one: that there was a serious, deliberate (or reckless) contravention of the DPA or PECR which was of a kind likely to cause substantial damage or substantial distress.  Whilst unsolicited marketing calls are certainly irritating, can they really be said to cause “substantial distress”?  Getting a text from a number you didn’t know about a PPI claim is certainly annoying, but could it seriously be considered “substantial damage”?  Not exactly; and therein lies the problem.


In the first big case where ICO used this power, it issued an MPN of £300,000 to an individual who’d allegedly sent millions of spam texts for PPI claims to users who had not consented to receive them.  Upon appeal the Information Rights Tribunal overturned the fine.  The First Tier Tribunal found that whilst there was a breach of PECR (the messages were unsolicited, deliberate, with no opt-out link and for financial gain), the damage or distress caused could not be described as substantial.  Every mobile user knew what a PPI spam text meant and was unlikely to be concerned for their safety or have false expectations of compensation.  A short tut of irritation and then deleting the message solved the problem.  The Upper Tribunal agreed: a few spam texts did not cause substantial damage or distress.  Interestingly, the judge pointed out that the “substantial” requirement had come from the UK government, was stricter than that required by the relevant EU Directive and suggested the statutory test be revisited.

This does not however mean that ICO has not been able to use the power.  Since 2012 it has issued nine MPNs totalling £1.1m to direct marketers that breached PECR.  More emphasis is placed on the overall level of distress suffered by hundreds or thousands of victims, which can be considered substantial.  ICO concentrates on the worst offenders: cold callers who deliberately and repeatedly call numbers registered with the Telephone Preference Service (“TPS” – Ofcom’s “do not call” list), even when asked to stop, and those that attract hundreds of complaints.

In fact, in this particular case there were specific problems with the MPN document (which will not necessarily come as a surprise to those familiar with ICO MPNs).  The Tribunal criticised ICO on a number of grounds: not being specific about the Regulation contravened, omitting important factual information, including in the period of contravention time when ICO did not yet have the power to fine, and changing the claim from the initial few hundred complaints to the much wider body of texts that may have been sent.  Once all this was taken into consideration, only 270 unsolicited texts were sent to 160 people.


ICO has been very vocal about having its hands tied in this matter and has long pushed for a change in the law (which is consistent with ICO’s broader campaigning for new powers).  Nuisance calls are a cause of great irritation for the public and currently only the worst offenders can be targeted.  Statistics compiled by ICO and TPS showed that the most nuisance is caused by large numbers of companies making a smaller number of calls.  Of the 982 companies that TPS received complaints about, 80% received fewer than 5 complaints and only 20 received more than 25 complaints.

Following a select committee enquiry, an All Party Parliamentary Group and a backbench debate, DCMS has launched the consultation, which invites responses on whether the threshold should be lowered to “annoyance, inconvenience or anxiety”.  This would bring it in line with the threshold Ofcom must consider when fining telecoms operators for persistent misuse through silent/abandoned calls. ICO estimates that had this threshold been in place since 2012, a further 50 companies would have been investigated/fined.

The three options being considered are: to do nothing, to lower the threshold or to remove it altogether.  Both ICO and DCMS favour complete removal.  ICO would thus only need to prove a breach was serious and deliberate/reckless.


I was at a seminar last week with the Information Commissioner himself, Chris Graham, at which he announced the consultation.  It was pretty clear he is itching to get his hands on these new powers to tackle rogue callers/emailers/texters, but emphasised any new powers would still be used proportionally and in conjunction with other enforcement actions such as compliance meetings and enforcement notices.  Even the announcement of any new law should act as a deterrent: typically whenever a large MPN is announced, the number of complaints about direct marketers reduces the following month.

The consultation document is squarely aimed at unsolicited calls, texts and emails and is consistently stated to only apply to certain regulations of PECR.  There is no suggestion that the threshold be reduced for other breaches of the PECR or the DPA.  It will be interesting to see how any reform will work in practice as the actual threshold is contained within the DPA and so will require its amendment.

The consultation will run until 7 December 2014, the document can be found here.  Organisations that are concerned about these proposals now have an opportunity to make their voices heard.

Update 27 February 2015

Following the consultation, DCMS announced that the majority of responses favoured the complete removal of the threshold.  As a result, from 6 April 2015 section 55A(1) of the DPA will be amended to remove the need to prove “substantial harm or substantial distress” in respect of regulations 19 to 24 of PECR.  ICO will still need to establish that the breach was serious and intentional or reckless, however this reform removes a huge hurdle in the fight against spammers.

What does EU regulatory guidance on the Internet of Things mean in practice? Part 1

Posted on October 31st, 2014 by

The Internet of Things (IoT) is likely to be the next big thing, a disruptive technological step that will change the way in which we live and work, perhaps as fundamentally as the ‘traditional’ Internet did. No surprise then that everyone wants a slice of that pie and that there is a lot of ‘noise’ out there. This is so despite the fact that to a large extent we’re not really sure about what the term ‘Internet of Things’ means – my colleague Mark Webber explores this question in his recent blog. Whatever the IoT is or is going to become, one thing is certain: it is all about the data.

There is also no doubt that the IoT triggers challenging legal issues that businesses, lawyers, legislators and regulators need to get their heads around in the months and years to come. Mark discusses these challenges in the second part of his blog (here), where he considers the regulatory outlook and briefly discusses the recent Article 29 Working Party Opinion on the Internet of Things.

Shortly after the WP29 Opinion was published, Data Protection and Privacy Commissioners from Europe and elsewhere in the world adopted the Mauritius Declaration on the Internet of Things. It is aligned to the WP29 Opinion, so it seems that privacy regulators are forming a united front on privacy in the IoT. This is consistent with their drive towards closer international cooperation – see for instance the latest Resolution on Enforcement Cooperation and the Global Cross Border Enforcement Cooperation Agreement (here).

The regulatory mind-set

You only need to read the first few lines of the Opinion and the Declaration to get a sense of the regulatory mind-set: the IoT can reveal ‘intimate details’; ‘sensor data is high in quantity, quality and sensitivity’ and the inferences that can be drawn from this data are ‘much bigger and sensitive’, especially when the IoT is seen alongside other technological trends such as cloud computing and big data analytics. The challenges are ‘huge’, ‘some new, some more traditional, but then amplified with regard to the exponential increase of data processing’, and include ‘data losses, infection by malware, but also unauthorized access to personal data, intrusive use of wearable devices or unlawful surveillance’.

In other words, in the minds of privacy regulators, it does not get much more intrusive (and potentially unlawful) than this, and if the IoT is left unchecked, it is the quickest way to an Orwellian dystopia. Not a surprise then that the WP29 supports the incorporation of the highest possible guarantees, with users remaining in complete control of their personal data, which is best achieved by obtaining fully informed consent. The Mauritius Declaration echoes these expectations.

What the regulators say

Here are the main highlights from the WP29 Opinion:

  1. Anyone who uses an IoT object, device, phone or computer situated in the EU to collect personal data is captured by EU data protection law. No surprises here.
  2. Data that originates from networked ‘things’ is personal data, potentially even if it is pseudonymised or anonymised (!), and even if it does not relate to individuals but rather relates to their environment. In other words, pretty much all IoT data should be treated as personal data.
  3. All actors who are involved in the IoT or process IoT data (including device manufacturers, social platforms, third party app developers, other third parties and IoT data platforms) are, or at least are likely to be, data controllers, i.e. responsible for compliance with EU data protection law.
  4. Device manufacturers are singled out as having to take more practical steps than other actors to ensure data protection compliance (see below). Presumably, this is because they have a direct relationship with the end user and are able to collect ‘more’ data than other actors.
  5. Consent is the legal basis that should principally be relied on in the IoT. In addition to the usual requirements (specific, informed, freely given and freely revocable), end users should be enabled to provide (or withdraw) granular consent: for all data collected by a specific thing; for specific data collected by anything; and for a specific data processing. However, in practice it is difficult to obtain informed consent, because it is difficult to provide sufficient notice in the IoT.
  6. Controllers are unlikely to be able to process IoT data on the basis that it is in their legitimate interests to do so, because it is clear that this processing significantly affects the privacy rights of individuals. In other words, in the IoT there is a strong regulatory presumption against the legitimate interests ground and in favour of consent as the legitimate basis of processing.
  7. IoT devices constitute ‘terminal devices’ for EU law purposes, which means that any storage of information, or access to information stored, on an IoT device requires the end user’s consent (note: the requirement applies to any information, not just personal data).
  8. Transparency is absolutely essential to ensure that the processing is fair and that consent is valid. There are specific concerns around transparency in the IoT, for instance in relation to providing notice to individuals who are not the end users of a device (e.g. providing notice to a passer-by whose photo is taken by a smart watch).
  9. The right of individuals to access their data extends not only to data that is displayed to them (e.g. data about calories burnt that is displayed on a mobile app), but also the raw data processed in the background to provide the service (e.g. the biometric data collected by a wristband to calculate the calories burnt).
  10. There are additional specific concerns and corresponding expectations around purpose limitation, data minimisation, data retention, security and enabling data subjects to exercise their rights.


It is also worth noting that some of the expectations set out in the Opinion do not currently have an express statutory footing, but rather reflect provisions of the draft EU Data Protection Regulation (which may or may not become law): privacy impact assessments, privacy by design, privacy by default, security by design and the right to data portability feature prominently in the WP29 Opinion.

The regulators’ recommendations

The WP29 makes recommendations regarding what IoT stakeholders should do in practice to comply with EU data protection law. The highlights include:

  1. All actors who are involved in the IoT or process IoT data as controllers should carry out Privacy Impact Assessments and implement Privacy by Design and Privacy by Default solutions; should delete raw data as soon as they have extracted the data they require; and should empower users to be in control in accordance with the ‘principle of self-determination of data’.
  2. In addition, device manufacturers should:
    1. follow a security by design principle;
    2. obtain consents that are granular (see above), and the granularity should extend to enabling users to determine the time and frequency of data collection;
    3. notify other actors in the IoT supply chain as soon as a data subject withdraws their consent or opposes a data processing activity;
    4. limit device fingerprinting to prevent location tracking;
    5. aggregate data locally on the devices to limit the amount of data leaving the device;
    6. provide users with tools to locally read, edit and modify data before it is shared with other parties;
    7. provide interfaces to allow users to extract aggregated and raw data in a structured and commonly used format; and
    8. enable privacy proxies that inform users about what data is collected, and facilitate local storage and processing without transmitting data to the manufacturer.
  3. The Opinion sets out additional specific expectations for app developers, social platforms, data platforms, IoT device owners and additional data recipients.
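Recommendation 5 for device manufacturers, aggregating data locally so that less data leaves the device, can be illustrated with a small sketch. This is a hedged example under assumed names (the function and the sample data are illustrative, not from the Opinion): a wearable summarises raw sensor readings on-device and transmits only the summary.

```python
from statistics import mean

# Illustrative sketch of on-device aggregation: raw sensor samples are
# reduced to a small summary, and only the summary is transmitted.
# All names and sample values are assumptions for the example.

def aggregate_locally(raw_readings):
    """Summarise raw readings on-device; the raw samples themselves
    never leave the device, only this summary does."""
    return {
        "count": len(raw_readings),
        "mean": mean(raw_readings),
        "min": min(raw_readings),
        "max": max(raw_readings),
    }

heart_rate_samples = [62.0, 64.0, 63.0, 70.0]
summary = aggregate_locally(heart_rate_samples)
# Transmit `summary` (4 numbers) instead of every raw sample.
print(summary)  # {'count': 4, 'mean': 64.75, 'min': 62.0, 'max': 70.0}
```

The design choice is data minimisation by construction: because aggregation happens before transmission, the manufacturer never holds the fine-grained raw data in the first place.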



I have no doubt that there are genuinely good intentions behind the WP29 Opinion and the Mauritius Declaration. What I am not sure about is whether the approach of the regulators will encourage behaviours that protect privacy without stifling innovation and impeding the development of the IoT. I am not even sure if, despite the good intentions, in the end the Opinion will encourage ‘better’ privacy protections in the IoT. I explain why I have these concerns and how I think organisations should be approaching privacy compliance in the IoT in Part 2 of this piece.

Global apps sweep: should developers be worried?

Posted on October 24th, 2014 by

A recent sweep with participation from 26 data protection authorities (“DPAs”) across the world revealed that a high proportion of mobile apps access large amounts of personal data without meeting their data privacy obligations.

How did they do?

Not well. Out of the 1,200 apps surveyed, 85% failed to clearly explain how they were collecting and using personal information, 59% did not display basic privacy information and one in three requested excessive personal information. Another common finding was that many apps fail to tailor privacy information to the small screen.

The Information Commissioner’s Office (“ICO”), as the UK’s DPA, surveyed 50 apps, including many household names, which produced results in line with the global figures.

Rare examples of good practice included pop-up notifications asking permission prior to additional data being collected and basic privacy information with links to more detailed information for users who wish to know more.

What are they told to do?

As a result, Ofcom and ICO have produced some guidance for users on how to use apps safely. It is written in consumer-friendly language and contains straightforward advice on some common pitfalls such as checking content ratings or always logging out of banking apps.

Contrast this with the ICO’s 25-page guidance aimed at app developers, which has drawn some criticism for being overly lengthy and complex. Rather than participating in research and pointing to long guidance documents, it might be more effective to promote simple rules (e.g. requiring pop-up notifications) and to hold app stores accountable for non-compliance. However, usability tests show that users irritated by constant pop-ups will stop using a site, so developers are reluctant to implement them.

Why is it important?

The lack of compliance is all the more alarming when read in conjunction with Ofcom research surveying a range of UK app users. Users viewed the app environment as a safer, more contained space than browser-based internet access. Many believed apps were discrete pieces of software with little interconnectivity, and were unaware of the virus threat or that apps can continue to run in the background. There was implicit trust in established brands and recognised app stores, which users felt must monitor and vet all apps before selling them. Peer recommendation also played a significant role in deciding whether to download an app.

This means little, if any, attention is paid to privacy policies and permission requests. Users interviewed generally felt full permission had to be granted before using an app and were frustrated by the inability to accept some permissions while refusing others.

So what’s the risk for developers?

ICO has the power to fine those companies who breach the relevant laws up to £500,000. The threshold for issuing a fine is high, however, and this power has not yet been used in the context of mobile apps. Having said this, we know that ‘internet and mobile’ are one of ICO’s priority areas for enforcement action.

Perhaps a more realistic, and potentially more damaging, risk is the reputational and brand damage associated with being publicly named and shamed. Where a lower level of harm has been caused, ICO is more likely to seek undertakings that the offending company will change its practices. As we know, ICO publishes its enforcement actions on its website. For a company whose business model relies on processing data, and on peer recommendations as the main way to grow its user base and its brand, the trust of its users is paramount and hard to rebuild once lost.

ICO has said it will be contacting app developers who need to improve their data collection and processing practices. The next stage for persistent offenders would be enforcement action.

Developers would be wise to pay attention. Even if enforcement action is not in itself a concern, ICO’s research showed almost half of app users have decided against downloading an app due to privacy concerns. If that’s correct, privacy is important to mobile app users and could make or break a new app.

Update 12 December 2014

The 26 DPAs who took part in the global sweep have since written to seven major app stores, including those of Apple, Google and Microsoft. In their letter of 9 December, they urge the marketplaces to make links to privacy policies mandatory, rather than optional, for apps that collect personal data.

The EU “cookies sweep day” and national cookie audits

Posted on September 22nd, 2014 by

Cookies have recently become a hot topic again, following a press release by the French Data Protection Authority (CNIL) on July 11th, 2014, announcing an EU “cookies sweep day” and enforcement actions in France. Here’s an update on what has happened and what to expect.

1. EU Cookies Sweep Day: 15 – 19 September

When did the EU “cookies sweep day” take place?

From 15 to 19 September, the Article 29 Working Party (“WP29”) conducted a coordinated online audit of the main websites operating in Europe to verify compliance with the EU cookie requirements. The CNIL and other Data Protection Authorities (“DPAs”) spent a couple of days assessing the level of compliance on some of the most visited websites.

Did the “cookies sweep day” concern all websites?

No, the EU “cookies sweep day” only concerned websites that target European consumers. Potentially any website (operated either within or outside the EU) that uses cookies or other tracking technologies to collect personal data from users in Europe may have been audited. Websites that do not provide services to European consumers, or that do not collect personal data via cookies from European users, were generally not concerned. According to the CNIL, the main sectors audited were e-commerce platforms and media websites.

Where did the “cookies sweep day” take place?

The EU “cookies sweep day” was an initiative of the WP29, and any DPA could take part in it. Therefore, potentially any website available in the European Union may have been audited.

How many websites were audited?

The WP29 did not release any official number of websites that were audited. However, the CNIL announced that it had audited 100 websites.

What did the DPAs verify?

The EU “cookies sweep day” offered an opportunity for all DPAs to verify together whether websites comply with the EU cookie requirements (namely the notice and consent rules) and to produce a comparative review of their practices with regard to cookies. In particular, the DPAs verified the number and types of cookies used, the manner in which users are informed about the use of cookies, and the process for obtaining consent.

What is the outcome of the “cookies sweep day”?

The DPAs will share the results of their respective audits with a view to comparing these results among Member States and possibly harmonising their positions with regard to cookies compliance in Europe. Furthermore, it is likely the WP29 will release a public statement about the results of the “cookies sweep day” in the near future.

Is there a risk that non-compliant companies may be sanctioned?

The purpose of the EU “cookies sweep day” was not to conduct enforcement actions. However, the results of the audits may be used by each DPA to enforce compliance with the cookie provisions under national law. Some data protection authorities have already begun enforcing cookie rules in their respective jurisdictions (see our previous blog).

For more information about the EU Cookies Sweep Day, click here.


2. Cookie audits in France: October 2014

In its July 2014 press release, the CNIL also announced that it would audit websites in France to verify compliance with French cookie provisions. The CNIL issued guidance on how to comply with cookie requirements in France in December 2013, and it now expects companies to be compliant. This enforcement program will enable the CNIL to test its new online investigatory powers, which came into force following a revision of the French Data Protection Act in March 2014 (see our previous blog). This is in line with the CNIL’s inspections plan published earlier this year, which announced at least 200 online inspections.

What will the CNIL verify?

The CNIL will focus its investigation on:

  • The types of cookies and other tracking technologies used (e.g., HTTP cookies, local shared objects (Flash cookies), fingerprinting, etc.)
  • The purposes of the cookies used and whether the owner of the website knows and understands the purposes of all the cookies (including third-party cookies) used on its website.

Furthermore, where prior consent is required, the CNIL will verify:

  • The method used to obtain consent from the user
  • The quality, accessibility and clarity of the information provided to users
  • The consequences of a user’s refusal of cookies. As an example, the CNIL refers to users of an e-commerce website whose only option is to refuse all cookies via the cookie settings of their web browser; as a result, such users may not be able to use the website at all.
  • The possibility to withdraw user consent at any time
  • The duration of cookies.
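The last point, cookie duration, lends itself to a mechanical check. Below is a minimal, hedged sketch in Python, assuming the 13-month lifetime cap recommended in the CNIL’s December 2013 guidance (approximated here as 396 days); the function name and the reliance on the Max-Age attribute alone are illustrative choices for this example, not a CNIL-endorsed tool:

```python
from http.cookies import SimpleCookie

# Approximation of the CNIL's recommended 13-month cap on
# consent-based cookies (13 months ~ 396 days).
CNIL_MAX_DAYS = 396

def audit_cookie_lifetime(set_cookie_header: str, max_days: int = CNIL_MAX_DAYS):
    """Return (name, lifetime_in_days) pairs for cookies in a
    Set-Cookie header whose Max-Age exceeds the limit.
    Session cookies (no Max-Age) are not flagged."""
    jar = SimpleCookie()
    jar.load(set_cookie_header)
    offenders = []
    for name, morsel in jar.items():
        max_age = morsel["max-age"]
        if max_age:
            days = int(max_age) / 86400  # seconds per day
            if days > max_days:
                offenders.append((name, days))
    return offenders
```

For example, a header setting `Max-Age=63072000` (two years) would be flagged, whereas a session cookie with no Max-Age would pass. A real audit would also need to handle the Expires attribute and cookies set via JavaScript.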

What are the risks for companies?

In France, the CNIL has the power to conduct on-site and on-line inspections that can be followed by administrative sanctions. In particular, the CNIL can issue a public warning or an enforcement notice asking the company to comply within a given period of time. If the company fails to comply with the terms of this notice, the CNIL may then initiate administrative proceedings which ultimately can lead to a fine or an obligation to cease the processing.

What should companies do in advance of this enforcement action?

Cookie compliance is still very much a hot topic in Europe, with different countries amending their laws and DPAs issuing guidance or conducting enforcement actions. Companies should therefore not wait until they are investigated to put their house in order. Some basic steps can be taken to make sure you comply with the cookie requirements:

  • Audit your websites to find out what types of cookies (or other tracking devices) you use
  • Analyse the purposes of the cookies
  • Assess the level of intrusiveness of cookies and verify which cookies require prior consent
  • Publish a clear, understandable and accessible cookie policy on your website
  • Implement an adequate cookie consent mechanism
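The first two steps above can be partly automated. As a hedged illustration (not an official audit method), the snippet below sorts a cookie inventory, such as one exported from browser developer tools, into first-party and third-party cookies; the function name and the (name, domain) input format are assumptions made for this example:

```python
from urllib.parse import urlparse

def classify_cookies(site_url, cookies):
    """Split a cookie inventory into first-party and third-party
    lists based on the registering domain. `cookies` is a list of
    (name, domain) pairs, e.g. gathered from browser dev tools."""
    site_host = urlparse(site_url).hostname
    first_party, third_party = [], []
    for name, domain in cookies:
        d = domain.lstrip(".")  # ".example.com" and "example.com" match alike
        if site_host == d or site_host.endswith("." + d):
            first_party.append(name)
        else:
            third_party.append(name)
    return first_party, third_party
```

Running this against an inventory for `https://www.example.com/` would place a `.example.com` session cookie in the first-party list and an ad-network cookie in the third-party list, giving a starting point for the purpose analysis in the second bullet.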

For more information on cookie audits in France, the CNIL’s press release is available (in French) here.

For more information about cookie consent requirements in Europe, click here.

CNIL: a regulator to watch in 2014

Posted on March 18th, 2014 by

Over the years, the number of on-site inspections by the French DPA (CNIL) has risen constantly. Based on the CNIL’s latest statistics (see the CNIL’s 2013 Annual Activity Report), 458 on-site inspections were carried out in 2012, a 19 percent increase compared with 2011. The number of complaints also rose, to 6,000 in 2012, the largest share of which (31 percent) related to telecom/Internet services. In 2012, the CNIL served 43 formal notices asking data controllers to comply. In total, the CNIL pronounced 13 sanctions, eight of which were made public. In the majority of cases the sanction pronounced was a simple warning (56 percent), while fines were pronounced in only 25 percent of cases.

The beginning of 2014 was marked by a landmark decision of the CNIL. On January 3, 2014, the CNIL pronounced a record fine of €150,000 ($204,000) against Google on the grounds that the terms of use available on its website since March 1, 2012 allegedly did not comply with the French Data Protection Act. Google was also required to publish this sanction on the homepage of google.fr within eight days of it being pronounced. Google appealed this decision; however, on February 7th, 2014, the State Council (“Conseil d’Etat”) rejected Google’s request to suspend the publication order.

Several lessons can be learnt from the CNIL’s decision. First, the CNIL is politically motivated to hit the Internet giants hard, especially those who claim that their activities do not fall within the remit of French law. No, says the CNIL: your activities target French consumers, and thus you must comply with the French Data Protection Act even if you are based outside the EU. This debate has been going on for years and was recently discussed in Brussels within the EU Council of Ministers’ meeting in the context of the proposal for a Data Protection Regulation. As a result, Article 4 of Directive 95/46/EC could soon be amended to allow for a broader application of European data protection laws to data controllers located outside the EU.

Second, despite being the highest sanction ever pronounced by the CNIL, this is hardly a dissuasive financial sanction against a global business with large revenues. Currently, the CNIL cannot pronounce sanctions above €150,000 (or €300,000 ($410,000) in the case of a second breach within five years of the first sanction), whereas some of its counterparts in other EU countries can pronounce much heavier sanctions; e.g., last December the Spanish DPA pronounced a €900,000 ($1,230,000) fine against Google. This could soon change, however, in light of an announcement by the French government that it intends to introduce a bill this year on “the protection of digital rights and freedoms,” which could significantly increase the CNIL’s enforcement powers.

Furthermore, it seems that the CNIL’s lobbying efforts within the French Parliament are finally beginning to pay off. A new law on consumer rights came into force on 17 March 2014, which amends the Data Protection Act and grants the CNIL new powers to conduct online inspections in addition to the existing on-site inspections. This provision gives the CNIL the right, via an electronic communication service to the public, “to consult any data that are freely accessible, or rendered accessible, including by imprudence, negligence or by a third party’s action, if required, by accessing and by remaining within automated data processing systems for as long as necessary to conduct its observations.” This new provision opens up the CNIL’s enforcement powers to the digital world and, in particular, gives it stronger powers to inspect the activities of major Internet companies. The CNIL says that this law will allow it to verify online security breaches, privacy policies and consent mechanisms in the field of direct marketing.

Finally, the Google case is a good example of the EU DPAs’ recent efforts to conduct coordinated cross-border enforcement actions against multinational organizations. In early 2013, a working group led by the CNIL was set up in Paris for a simultaneous and coordinated enforcement action against Google in several EU countries. As a result, Google was inspected and sanctioned in multiple jurisdictions, including Spain and the Netherlands. Google is appealing these sanctions.

As the years pass by, the CNIL continues to grow and to become more resourceful. It is also more experienced and better organized. The CNIL is already very influential within the Article 29 Working Party, as recently illustrated by the Google case, and Isabelle Falque-Pierrotin, the chairwoman of the CNIL, was recently elected chair of the Article 29 Working Party. Thus, companies should pay close attention to the actions of the CNIL as it becomes a more powerful authority in France and within the European Union.

This article was first published in the IAPP’s Privacy Tracker on 27 February 2014 and was updated on 18th March 2014.

UK e-privacy enforcement ramps up

Posted on April 29th, 2013 by

The days when one could call the UK ICO a fluffy, toothless regulator are over. The ICO has recently been going through its most prolific period of enforcement activity: by the end of 2012 it had imposed 25 fines, issued 3 enforcement notices, secured 6 prosecutions and obtained 31 undertakings, and 2013 looks set to bring similar activity. In March, for example, the ICO issued its first monetary penalty for a serious breach of the Privacy and Electronic Communications Regulations 2003 (‘PECR’) relating to live marketing calls: a £90,000 fine against Glasgow-based DM Design for unwanted marketing calls.

To coincide with such activities, the ICO has recently updated the enforcement section of its website. What this tells us is that whilst data security breaches will continue to be a significant area of focus for the ICO, PECR breaches will also figure highly in the ICO’s enforcement agenda. In this regard, the ICO tells us that it has already been active in the areas of ‘spam texts’, sales calls and cookies.

Spam texts are identified as ‘one of the biggest concerns to consumers’ (the ICO refers to texts about accident and ‘PPI’ claims in particular), and the ICO refers to the work it has carried out with members of the mobile phone industry to identify an organisation which is now the subject of enforcement action. The ICO also identifies ‘live’ sales calls and automated calls as other priority areas, and has explicitly identified (and published) the names of a number of companies it has either met to discuss compliance issues or is actively monitoring for ‘concerns’ about compliance with a view to considering enforcement action. This relates not only to UK-based companies, but also to those based overseas that target UK-based consumers. The ICO tells us that it is actively working with the FTC in the US and with other regulators based in Ireland, Belgium and Spain through Consumer Protection Co-operation arrangements.

Finally, the ICO tells us that between January and March 2013 it received a further 87 reported concerns about cookies via its website from individuals (far fewer, it has to be said, than the number of concerns about unwanted marketing communications). The ICO will continue to focus on those websites that are doing nothing to raise awareness of cookies or obtain users’ consent, and also on those sites it receives complaints about or that are ‘visited most by consumers’. However, the ICO also says it has ‘maintained a consumer threat level of ‘low’ in this area due to the low level of concerns reported’.

It is obvious that as consumer technologies such as tablets and smartphones continue to develop, so too will the ICO’s enforcement strategy in this area. Compliance with PECR should therefore figure highly in any business’s data protection compliance strategy.

CNIL unveils 2012 annual activity report

Posted on April 29th, 2013 by

On April 23rd, 2013, the French data protection authority (the “CNIL”) unveiled its 2012 Annual Activity Report (the “Report”). The CNIL’s Report gives an overview of the actions and initiatives undertaken in the past year, and is also a good indicator for what to expect in the coming year.

The CNIL has adopted a three-year strategic orientation program for the period 2012-2015. This action plan sets out three priorities, namely:

– To adopt a policy of openness and consultation towards stakeholders;
– To raise the level of awareness among data controllers (particularly companies) and to help them develop tools that allow them to implement the data protection principles; and
– To increase the level of compliance through a more targeted and efficient enforcement policy.

Focusing on the CNIL’s enforcement strategy, the summary below highlights some of the key points in the CNIL’s Report:

– Complaints: The number of complaints rose to 6,000 in 2012, 46% of which concerned the right to object to data processing. The constant rise in complaints over recent years indicates that citizens are increasingly aware of their data protection rights and are taking action more frequently. The telecoms/internet sector triggered the most complaints (31%).

– Inspections: The CNIL conducted 458 on-site inspections in 2012, a 19% increase compared to 2011. 285 of the inspections were carried out under the Data Protection Act, while 173 concerned the use of videosurveillance equipment. With regard to the Data Protection Act, 23% of the inspections were triggered by complaints and another 26% were initiated by events picked up in the news, which shows that the CNIL often takes action when a particular event or situation makes the headlines. 40% of the inspections were in line with the priorities set out in the CNIL’s annual inspections plan, which shows some consistency in how the CNIL operates within a particular sector or business activity.

– Sanctions: In 2012, the CNIL served 43 formal notices asking data controllers to comply. In most cases, the CNIL did not pronounce any sanction because the data controller had complied. In total, the CNIL pronounced 13 sanctions, eight of which were made public. The publicity of sanctions follows a recent amendment to the Data Protection Act, which authorizes the CNIL to publish the sanctions it pronounces. In the majority of cases the sanction pronounced was a simple warning (56%), while fines were pronounced in only 25% of cases. The CNIL pronounced only one injunction to cease processing. The low number of fines can be explained by the fact that they have little deterrent effect on companies in France (by law, the maximum fine for a first violation is EUR 150,000). By contrast, a warning can cause serious reputational damage to the data controller, particularly when it is made public, which may explain why the CNIL chose to publish its sanctions in 60% of cases.

– Videosurveillance: In 2012, the CNIL carried out over 170 inspections of videosurveillance systems. In this context, the CNIL received more than 300 complaints, 75% of which concerned the use of video cameras in the workplace. The CNIL notes a lack of clarity surrounding the current legal framework for videosurveillance, insufficient or non-existent information provided to individuals, inappropriate use of cameras, and insufficient security measures. In 2012, the CNIL published six practical guidebooks explaining how to use video cameras in compliance with the law.

– Data breach notifications: Following the implementation of the revised ePrivacy directive into French law, the CNIL received the first notifications for data breaches in the telecoms sector. While the total number of notifications for 2012 remains fairly low, the CNIL expects to receive more notifications in the coming year.

It is also worth noting that the CNIL’s budget and manpower also increased in 2012. As the years pass by, the CNIL continues to grow and to become more resourceful. It is also more experienced and better organized. Thus, data controllers should pay close attention to the actions of the CNIL as it becomes a more powerful authority in France and within the European Union.

The CNIL’s 2012 Annual Activity Report is available (in French) on the CNIL’s website.