Archive for the ‘Uncategorized’ Category

Belgian Data Protection Day to focus on “privacy in the workplace”

Posted on October 16th, 2014



Two years ago, the Belgian Data Protection authority (“Privacy Commission”) published its guidelines on the monitoring of e-mail and internet usage in the workplace. Its aim was to clarify previous recommendations and highlight any potential conflicts of law between the Collective Bargaining Agreement n° 81 on the monitoring of electronic communications of employees and various other laws prohibiting unlawful monitoring (see our previous blog post at http://www.fieldfisher.com/publications/2011/08/belgian-privacy-commission-clarifies-employee-monitoring#sthash.oScWaq6x.dpbs).

While these recommendations addressed many of the common issues faced by businesses, they still did not provide answers to all the questions posed. A recurring problem faced by employers is whether or not they can access an employee’s e-mail account. In certain circumstances, whether in the context of a dismissal or of alleged harassment, for example, an employer may require access to ensure the continuity of services to clients or to investigate the allegations that have been made.

The most frequently asked questions in these situations are:

  • If suspicious behaviour is identified, what level of forensic examination is justified?
  • Should the rules be interpreted differently depending on whether or not the person behaving suspiciously is still employed by the investigating organisation?
  • How should the “reasonable expectations of privacy” doctrine be interpreted and applied?

It is clear this is a hot topic in Belgium and remains very divisive.

Accordingly, the Privacy Commission intends to devote its entire Data Protection Day on January 28th 2015 to “privacy in the workplace”.

On the same day, the Privacy Commission will also be providing additional guidance on this topic, with the aim of educating employers and employees about the necessity (and importance) of the protection of privacy and personal data in the workplace.

The Privacy Commission has invited Fieldfisher (as part of a select group of firms) to share our experience and knowledge of this topic with the delegates. If you have specific concerns, observations or comments which you would like to see addressed by the Privacy Commission, feel free to contact tim.vancanneyt@fieldfisher.com or aagje.degraeve@fieldfisher.com by early November.

New Belgian government increases focus on privacy

Posted on October 15th, 2014



More than four months after the federal elections, the new Belgian government has finally been sworn in. In many respects, this centre-right government is unique and unlike any previous administration. For example, it’s the first government in 25 years not to feature a socialist party and the first coalition ever with only one French-speaking party. In this post, I’ll be discussing its unprecedented focus on privacy.

In addition to specific chapters on the protection of privacy and on cyber security, the Coalition Agreement contains numerous other references to privacy throughout. What’s more, for the first time ever, a ‘secretary of state’ for Privacy has been appointed (a member of the cabinet who is assigned and reports to one of the ministers).

So what does this mean in practice? As always the text of the coalition agreement remains vague. However, the salient points are:

Increased privacy protection versus increased data mining

Interestingly, reference to privacy protection is mostly made in relation to government decisions to increase data mining and profiling. The government intends to extend the police’s powers for monitoring CCTV; increase the Judiciary Service and National Security’s powers to proactively investigate crimes; better monitor asylum seekers; and enable cross-access to different public databases.

On the one hand, it is obviously positive that the new government wants to reinforce the protection of individuals’ privacy. However, reading between the lines, the focus on privacy also seems somewhat misleading. Indeed, the increased monitoring resulting from many of the new governmental measures risks turning the authorities into an even more powerful Big Brother than before.

Modernisation of the Belgian Data Protection Act versus need for robust harmonised EU privacy laws

The coalition agreement states (in a rather non-committal fashion) that, ‘where necessary’, the Belgian Data Protection Act will be modernised. In the absence of further detail, it currently remains unclear what this will mean in practice. It is unfortunate, however, to note that the new government states that, as a rule, data processing should rely on informed consent. If the Belgian Data Protection Act were modified to reflect this, it would make life more difficult for businesses without really improving the level of protection of data subjects. Some may fear that it would only increase the so-called ‘consent fatigue’.

More generally, one could ask why the government has chosen to update the Belgian Data Protection Act when the Data Protection Regulation is on its way. While the coalition agreement still expressly states the need for solid harmonised EU privacy laws, it makes you wonder whether the new government is of the opinion that the Data Protection Regulation will not be adopted anytime soon, or even at all.

Privacy Commission will be reformed

The Belgian data protection authority, also known as the Privacy Commission, will be reformed by increasing the independence of its members, especially in the light of the new e-government initiatives that will be implemented. The coalition agreement remains silent as to whether the Privacy Commission will also be vested with fining powers. It does refer to the fact that appropriate sanctions must be applied in cases of infringement of data protection laws. This might suggest that, in addition to the criminal sanctions, which are currently hardly ever applied, administrative sanctions will be adopted. In this context, it should be noted that the Privacy Commission itself requested such powers earlier this year. We would therefore not be surprised if the Privacy Commission is granted more robust enforcement powers.

Cyber security high on the agenda

Since the Snowden revelations, cyber security has remained high on the agenda in Belgium. The difficulties faced in resolving the cyber attack on Belgacom, Belgium’s incumbent and majority state-owned telco, and the hacking of the Ministry of Foreign Affairs’ systems demonstrated how relatively unprepared Belgium was for this type of situation. It therefore comes as no surprise that the new government will be investing in the recently created Belgian Centre for Cyber Security. This centre will act as a coordinator, provide advice to public authorities and take initiatives to help protect businesses and the general public. This is considered absolutely essential to increasing trust in the ‘digital society’ and allowing Belgium to play a leading role in e-government, the digital society and the internet of things.

Conclusion

The increased focus on privacy and data protection is of course to be applauded. The real question, however, remains whether this coalition agreement will result in any real, tangible improvements, or whether it is simply window-dressing. Much will depend on the priority given to these topics by the secretary of state (who is also in charge of Social Fraud and the North Sea) and on how he will translate these high-level principles into actual legal texts.

In this context it is noteworthy that a roundtable is going to be organised with all stakeholders to refine and apply the high-level principles of the coalition agreement. We will report back on this as soon as more details become available.

Part 1: Cutting through the Internet of Things hyperbole

Posted on October 15th, 2014



I’ve held back writing anything about the Internet of Things (or “IoT”) because there are so many developments playing out in the market. Not to mention so much “noise”.

Then something happened: “It’s Official: The Internet Of Things Takes Over Big Data As The Most Hyped Technology” read a Forbes headline. “Big data”, last week’s darling, is condemned to the “Trough of Disillusionment” while Gartner moves IoT to the very top of its 2014 emerging technologies Hype Cycle. Something had to be said.

The key point for me is that the IoT is “emerging”. What’s more, few are entirely sure where they are on this uncharted journey of adoption. IoT has reached an inflexion point: the moment when businesses and others realise that identifying with the Internet of Things may drive sales, shareholder value or merely kudos. We all want a piece of this pie.

In Part 1 of this two-part exploration of IoT, I explore what the Internet of Things actually is.

IoT – what is it?

Applying Gartner’s parlance, one thing is clear: when any tech theme hits the “Peak of Expectations”, the “Trough of Disillusionment” will follow because, as with any emerging technology, it will be some time until there is pervasive adoption of IoT. In fact, for IoT, Gartner says widespread adoption could be 5 to 10 years away. However, this inflexion point is typically the moment when the tech industry’s big guns ride into town and, just as with cloud (remember some folk trying to trademark the word?!), this will only drive further development and adoption. But also further hype.

The world of machine-to-machine (“M2M”) communications involved the connection of different devices which previously did not have the ability to communicate. For many, the Internet of Things is something more. As Ofcom (the UK’s communications regulator) set out in its UK consultation, IoT is a broader term, “describing the interconnection of multiple M2M applications, often enabling the exchange of data across multiple industry sectors”.

“The Internet of Things will be the world’s most massive device market and save companies billions of dollars” shouted Business Week in October 2014, happy to maintain the hype but also acknowledging in its opening paragraph that IoT is “beginning to grow significantly”. No question, IoT is set to enable large numbers of previously unconnected devices to connect and then communicate, sharing data with one another. Today we are mainly contemplating, rather than experiencing, this future.

But what actually is it?

The emergence of IoT is driving some great debate. When assessing what IoT is and what it means for business models, the law and for commerce generally, arguably there are more questions than there are answers. In an exploratory piece in ZDNet, Richie Etwaru called out a few of these unanswered questions and prompted some useful debate and feedback. The top three questions raised by Richie were:

  • How will things be identified? – believing we have to get to a point where there are standards for things to be sensed and connected;
  • What will the word trust mean to “things” in IoT? – making the point we need to redefine trust in edge computing; and
  • How will connectivity work? – is there something like IoTML (The Internet of Things Markup Language) to enable trust and facilitate this communication?

None of these questions are new, but his piece reinforces that we don’t quite know what IoT is and how some of its technical questions will be addressed. It’s likely that standardisation, or industry practice and adoption around certain protocols and practices, will answer some of these questions in due course. As a matter of public policy, we may see law makers intervene to shape some of these standards or drive particular kinds of adoption. There will be multiple answers to the “what is IoT?” question for some time. I suspect that, in time, different flavours and business models will come to the fore. Remember when every cloud seminar spent the first 15 minutes defining cloud models and reiterating extrapolations for the future size of the cloud market? Brace yourselves!

I’ve been making the same points about “cloud” for the past 5 years – like cloud, the IoT is a fungible concept. So, as with cloud, don’t assume IoT has a definitive meaning. As with cloud, don’t expect there to be any specific Internet of Things law (yet?). As Part 2 of this piece will discuss, law makers have spotted there’s something new which may need regulatory intervention to cultivate it for the good of all, but they’ve also realised that there’s something which may grow with negative consequences – something that may need to be brought into check. Privacy concerns in particular have raised their head early, and we’ve seen early EU guidance in an opinion from the Article 29 Working Party, but there is still no specific IoT law. How can there be when there is still little definition?

Realities of a converged world

For some time we’ve been excited about the convergence of people, business and things. Gartner reminds us that “[t]he Internet of Things and the concept of blurring the physical and virtual worlds are strong concepts in this stage. Physical assets become digitalized and become equal actors in the business value chain alongside already-digital entities”. In other words: a land of opportunity, but an ill-defined “blur” between technology, what is real, and what is merely conceptual within our digital age.

Of course the IoT world is also a world bumping up against connectivity, the cloud and mobility. And of course there are instances of IoT out there today. Or are there? As with anything that’s emerging, the terminology and definition of the Internet of Things is emerging too. Yes, there is a pervasiveness of devices; yes, some of these devices connect and communicate; and yes, devices that were not necessarily designed to interact are communicating. But are these examples of the Internet of Things? Break these models down into their constituent parts for applied legal analysis, and does it necessarily matter?

Philosophical, but for a reason

My point? As with any complex technological evolution, as lawyers we cannot apply laws, negotiate contracts or assess risk or the consequences for privacy without a proper understanding of the complex ecosystem we’re applying these concepts to. Privacy consequences cannot be assessed in isolation and without considering how the devices, technology and data actually interact. Be aware that the IoT badge means nothing legally and probably conveys little factual information around “how” something works. It’s important to ask questions. Important not to assume.

In Part 2 of this piece I will discuss some early signs of how the law may be preparing to deal with all these emerging trends. Of course, the answer is that it probably already does, and it probably has the flexibility to deal with many elements of IoT yet to emerge.

Event alert: The new mechanism for international data transfers – APEC’s CBPRs demystified

Posted on October 2nd, 2014



On Friday 3 October, Fieldfisher will host an afternoon event entitled “The new mechanism for international data transfers – APEC’s CBPRs demystified” at our new offices in London.

The event is designed to demonstrate how Cross Border Privacy Rules (“CBPRs”) and Binding Corporate Rules (“BCRs”) can be utilised to facilitate global data protection compliance. Hazel Grant, Fieldfisher’s new Head of Privacy, will chair the event which will also feature presentations from Anick Fortin-Cousens of IBM and Myriam Gufflet of the French data protection regulator (“CNIL”).

  • Ms Fortin-Cousens, leader of IBM’s Corporate Privacy Office and IBM’s CPO for Canada, Latin America and MEA, will provide a practical insight into CBPR. Earlier this year IBM became the first organisation to obtain Asia-Pacific Economic Cooperation’s (“APEC”) CBPR certification.
  • Ms Gufflet, BCR Division Manager at the CNIL, will tell us about the potential interoperability between BCRs and CBPRs. The CNIL has been closely involved in the work of the joint EU-APEC committee on this topic and was appointed as the Article 29 Working Party’s rapporteur in this matter.

This event is aimed at legal counsel and privacy/compliance professionals in organisations with a global reach who would be interested in understanding how CBPR certification may improve their organisation’s global data protection compliance.

Networking drinks will follow the event and will allow attendees to meet privacy, e-commerce and technology law experts from a number of European countries (Austria, Belgium, Czech Republic, Denmark, Finland, France, Germany, Hungary, Italy, the Netherlands, Poland, Portugal, Spain, Sweden, Switzerland and the UK) who form the Ecomlex network (www.ecomlex.com).

A limited number of places remain available for the event. If you would like to attend, please register your interest by clicking on the following link: http://www.fieldfisher.com/events/2014/10/the-new-mechanism-for-international-data-transfers-–-apecs-cbprs-demystified#sthash.978Xa0wU.dpbs


German Federal Court further strengthens review platforms

Posted on September 24th, 2014



With the ever increasing relevance of online review platforms, the discussion about platforms’ red lines is becoming more and more heated in Germany. The Federal Court of Justice has now issued its second decision in this area within only a couple of months. This time, a medical practitioner demanded that his profile be completely deleted from a review platform focusing on health care professionals, arguing on the basis of unlawful processing of his personal data.

The case concerned a typical review platform where users may search for information about health care professionals. Aside from the review content, information such as name, address, expertise, contact details and opening hours is accessible on the platform. Users have to register with their email address before posting a review.

The Federal Court dismissed the claim. The court held that the platform’s freedom of communication outweighs the claimant’s right to informational self-determination, which forms the constitutional basis for privacy rights under German law. According to the court, it is legitimate for the platform provider to publish the practitioner’s profile and the review content based on Sec. 29 German Data Protection Act. This result does not come as a surprise, as the Federal Court had already decided in a similar case back in 2008 that a teacher cannot request to be deleted from a review platform dedicated to teachers.

What is slightly more surprising is that the court made some remarks emphasizing that the practitioner would be “not insignificantly” burdened by the publication of reviews on the portal, as he may face adverse economic effects caused by negative reviews. However, the court attached even greater weight to the public’s interest in information about medical services, in particular as the publication would only concern the “social sphere” of the claimant, rather than his private or intimate sphere.

In July 2014, the Federal Court also dismissed a claim for disclosure of contact details of a reviewer who repeatedly posted defamatory statements on a review platform.


Germany: Federal Court stops disclosure claims against review platforms

Posted on August 1st, 2014



In Germany, the Federal Court of Justice has pulled the rug from under claims for the disclosure of user data against the providers of online services. The court ruled that statutory law does not permit a service provider to disclose user data to persons and businesses concerned by a negative and potentially unlawful review posted on a review platform (judgement of 1 July 2014, court ref. VI ZR 345/13). Only if the review constitutes a criminal act in itself, such as defamation or slander, may the prosecution request disclosure in the course of a criminal investigation. The judgement finally ended a debate that had been simmering for a long time.

Background

The case concerned a medical practitioner who sued a review platform dedicated to medical services. A user had posted a review on the platform alleging that patients’ files were kept in clothes baskets, that average waiting times were extraordinarily long, that follow-up appointments were not offered in due time, and that a thyroid hyperfunction had not been identified and had been treated contraindicatively. Shortly afterwards, further reviews were posted which were identical in places to the first review. The claimant repeatedly notified the platform provider of these reviews, and the platform provider took the reviews down. In July 2012, another review was posted with the same allegations. The claimant then sued the platform provider for cessation and desistance and for disclosure of the name and address of the user who posted the reviews. The defendant never denied that the facts stated in the reviews were untrue.

The Judgement

The claim was dismissed. The court’s decision is based on Sec. 12 (2) German Telemedia Act (“TMG”), which stipulates that a service provider may only disclose user data where a specific statute exists that permits such a disclosure and expressly references “Telemedia” services, i.e. online services. The court argued that the general civil law claim for disclosure of third-party data, which is based on bona fide aspects (Sec. 242 German Civil Code), does not fulfil the requirements of Sec. 12 (2) TMG. Further, the requirements of Sec. 14 (2) TMG, which allows for a disclosure of user data if this is necessary for the purposes of criminal prosecution, protection of the constitution, averting public dangers and national security, or for the enforcement of copyright, do not apply. According to the court, there is also no room for an analogous application of Sec. 14 (2) TMG, because there is no unintentional gap in the statutes, as required for an analogy. In this regard, the court highlighted that the question of whether an individual whose personality rights were unlawfully affected by a user posting should have a claim for disclosure of that user’s data was debated in the legislative process without further consequences.

The court noted, however, that the result of the legal assessment may be regarded as unbalanced against the statutory right for disclosure of user data in the event of a copyright infringement, and that it deems the extension of this statutory right desirable. However, the court emphasized that this decision is up to the legislator, not the court.

Comment

The question of whether a claim for disclosure of user data is supported by German civil law had long been debated in legal literature, and courts of lower instances had issued conflicting decisions in similar cases. The appellate court (Higher Regional Court of Stuttgart) had likewise decided in favour of the claimant. This debate has now been ended by the Federal Court, for the time being at least. The judgement is clear and leaves no room for interpretation or loopholes. This is good news both for providers of online platforms, who can safely assure their users that their identity is protected, and for users, who will not need to fear de-anonymisation, which could result in pre-emptive self-limitation when posting comments.

However, the question remains whether the court duly considered constitutional law aspects, as the German-law concept of personality rights is rooted in the German constitution (right to human dignity, right to personal freedoms). This has been the main reason why some courts of lower instances had obvious concerns about the result of their legal assessment and tried to find a way out of the dilemma by applying analogies, or considerations of interest, on dubious legal grounds to overcome a statutory position which had been deemed inappropriate in some cases. The Federal Court has not touched on constitutional law issues, so it can be concluded that at least it did not see a blatant violation of constitutional law. However, the Federal Court articulated concerns about the outcome too, by declaring a revision of the statutes desirable and by emphasizing the responsibility of the legislator to consider respective amendments of the law. Even though the judgement is final and binding, the claimant may seek additional relief by lodging a constitutional complaint.

The decision does not affect the right of the competent authorities to request a disclosure of user data in the case of criminal prosecution, i.e. in cases where the content of a user review does not only constitute a violation of personality rights as protected by civil law, but reaches the threshold of criminal offences such as defamation and slander.

UK to introduce emergency data retention measures

Posted on July 15th, 2014



The UK Prime Minister David Cameron announced last week that the Government is taking emergency measures to fast track new legislation, the Data Retention and Investigatory Powers Bill, which will force communications service providers (i.e. telecommunications companies and internet service providers, together “CSPs”) to store communications data (including call and internet search metadata) for 12 months.

This announcement follows the CJEU’s ruling in April that the Data Retention Directive 2006/24/EC (the “Directive”), which requires companies to store communications data for up to two years, is invalid because it contravenes the right to privacy and data protection and the principle of proportionality under the EU Charter of Fundamental Rights (the “Charter”). The CJEU was particularly concerned about the lack of restrictions on how, why and when data could be used. It called for a measure which was more specific in terms of crimes covered and respective retention periods.

The PM said that the emergency law was necessary to protect existing interception capabilities, and that without it, the Government would be less able to protect the country from paedophiles, terrorists and other serious criminals. Cameron said the new legislation will respond to the CJEU’s concerns and provide a clear legal basis for companies to retain such communications data. He also stressed that the new measures would cover the retention of only metadata, such as the time, place and frequency of communications, and would not cover the content of communications. The emergency Bill is intended as a temporary measure and is to expire in 2016. The Government intends that the legislation will ensure that, in the short term, UK security and law enforcement agencies can continue to function whilst Parliament has time to examine the Regulation of Investigatory Powers Act 2000 (RIPA) and make recommendations on how it could be modernised and improved. Whilst Cameron stressed that the measures did not impose new obligations on CSPs and insisted they would not authorise new intrusions on civil liberties, the Bill faces criticism that it extends the already far-reaching interception rights under RIPA, and also that, in light of the CJEU decision, the temporary measure itself contravenes the Charter.

At present, in order to comply with their obligations under the Directive, CSPs already operate significant storage and retrieval systems to retain data from which they can derive no further use or revenue. If the draft Bill is enacted with little further amendment, the UK’s Secretary of State could be issuing new retention notices later this year. Those CSPs subject to retention obligations today will be reading carefully as these arrive. It is not yet clear whether the legislative burden and cost of compliance is likely to spread to additional CSPs not previously notified under the current retention regime. From the Bill’s drafting it appears this could conceivably happen. It is equally clear that there is no mechanism to recoup these costs other than from their general business operations.

Britain is the first EU country to seek to rewrite its laws to continue data retention since the CJEU decision, and the Government said it was in close contact with other European states on the issue.

By comparison, in Germany, when the Directive was initially implemented, the German courts took the view that the German implementation far exceeded the limits set by the German constitutional right of informational self-determination of the individual, in that it did not sufficiently narrow down the scope of use of the retained data, e.g. by not limiting it to the prosecution or prevention of certain severe criminal acts. In Germany’s new Telecommunication Act, enacted in 2012, the provisions pertaining to data retention were deleted and not replaced by the compulsory principles of the Directive. Treaty violation proceedings against Germany by the EU Commission ensued; however, those proceedings have now lost their basis entirely as a result of the CJEU ruling.

Meanwhile, the Constitutional Court of Austria last month declared that Austrian data retention laws were unconstitutional. Austria is the first EU Member State to annul data retention laws in response to the CJEU decision. Austrian companies are now only obliged to retain data for specific purposes provided by law, such as billing or fault recovery.

Whether other EU countries will now follow the UK’s lead, potentially introducing a patchwork of data retention standards for CSPs throughout the EU, remains to be seen. If this happens, then equally uncertain is the conflict this will create between, on the one hand, nationally-driven data retention standards and, on the other, EU fundamental rights of privacy and data protection.


European Parliament votes in favour of data protection reform

Posted on March 21st, 2014



On 12 March 2014, the European Parliament (the “Parliament”) overwhelmingly voted in favour of the European Commission’s proposal for a Data Protection Regulation (the “Data Protection Regulation”) in its plenary assembly. In total 621 members of Parliament voted for the proposals and only 10 against. The vote cemented the Parliament’s support of the data protection reform, which constitutes an important step forward in the legislative procedure. Following the vote, Viviane Reding – the EU Justice Commissioner – said that “The message the European Parliament is sending is unequivocal: This reform is a necessity, and now it is irreversible”. While this vote is an important milestone in the adoption process, there are still several steps to go before the text is adopted and comes into force.

So what happens next?

Following the Civil Liberties, Justice and Home Affairs (LIBE) Committee’s report published in October 2013 (for more information on this report, see this previous article), this month’s vote means that the Council of the European Union (the “Council”) can now formally conduct its reading of the text based on the Parliament’s amendments. Since the EU Commission made its proposal, preparatory work in the Council has been running in parallel with the Parliament’s. However, the Council can only adopt its position after the Parliament has acted.

In order for the proposed Data Protection Regulation to become law, both the Parliament and the Council must adopt the text in what is called the “ordinary legislative procedure” – a process in which the decisions of the Parliament and the Council have the same weight. The Parliament can only begin official negotiations with the Council once the Council presents its position. It seems unlikely that the Council will simply accept the Parliament’s position; on the contrary, it will want to put forward its own amendments.

In the meantime, representatives of the Parliament, the Council and the Commission will probably organise informal meetings, the so-called “trilogue” meetings, with a view to reaching a first reading agreement.

The EU Justice Ministers have already met several times in Council meetings in the past months to discuss the data protection reform. Although there seems to be broad support among Member States for the proposal, they have not yet reached an agreement on some of the key provisions, such as the “one-stop shop” rule. The next meeting of the Council ministers is due to take place in June 2014.

Will there be further delays?

As the Council has not yet agreed its position, the pace at which the proposed regulation develops in the coming months largely depends on this being finalised. Once the Council has reached a position, there is also the possibility that the proposals will be amended further. If this happens, the Parliament may need to vote again before the process is complete.

Furthermore, with the European Parliament elections coming up this May, the whole adoption process will be put on hold until a new Parliament is in place and a new Commission is approved in the autumn of this year. Given these important political changes, it is difficult to predict when the Data Protection Regulation will finally be adopted.

It is worth noting, however, that the European heads of state and government publicly committed themselves to the ‘timely’ adoption of the data protection legislation by 2015 – though, with the slow progress made to date and work still remaining to be done, this looks a very tall order indeed.

CNIL issues new guidelines on the processing of bank card details

Posted on February 27th, 2014 by



On February 25, 2014, the French Data Protection Authority (“CNIL”) issued a press release regarding new guidelines, adopted last November, on the processing of bank card details relating to the sale of goods and the provision of services at a distance (the “Guidelines”). Due to the increase in online transactions and the higher number of complaints received by the CNIL from customers in recent years, the CNIL decided to repeal and replace its previous guidelines, which dated from 2003. The new guidelines apply to all types of bank cards, including private payment cards and credit cards.

Purposes of processing

The CNIL defines the main purpose of using a bank card number as processing a transaction with a view to delivering goods or providing a service in return for payment. In addition, bank card details may be processed for the following purposes:

  • to reserve a good or service;
  • to create a payment account to facilitate future payments on a merchant’s website;
  • to enable payment service providers to offer dedicated payment solutions at a distance (e.g., virtual cards or wallets, rechargeable accounts, etc.); and
  • to combat fraud.

Types of data collected

As a general rule, the types of data that are strictly necessary to process online payments should be limited to:

  • the bank card number;
  • the expiry date; and
  • the 3-digit cryptogram on the back of the card.

The cardholder’s identity must not be collected, unless it is necessary for a specific and legitimate purpose, such as to combat fraud.

Period of retention

Bank card details may only be stored for the duration that is necessary to process the transaction, and must be deleted once the payment has taken place (or, where applicable, at the end of the period corresponding to the right of withdrawal). Following this period, the bank card details may be archived and kept for 13 months (or 15 months in the case of a deferred debit card) for evidence purposes (e.g., in case of a dispute over a transaction).

Beyond this period, the bank card details may be kept only if the cardholder’s prior consent is obtained or to prevent fraudulent use of the card. In particular, the merchant must obtain the customer’s prior consent in order to create a payment account that remembers the customer’s bank card details for future payments.

However, the CNIL considers that the 3-digit cryptogram on the card is meant to verify that the cardholder is in possession of his/her card, and thus, it is prohibited to store this number after the end of the transaction, including for future payments.

Security measures

Due to the risk of fraud, controllers must implement appropriate security measures, including measures preventing unauthorized access to, or use of, the data. These security measures must comply with applicable industry standards and requirements, such as the Payment Card Industry Data Security Standard (PCI DSS), which applies to all organizations that handle payment card data.

The CNIL recommends that the customer’s bank card details are not stored on his/her terminal equipment (e.g., computer, smartphone) due to the lack of appropriate security measures. Furthermore, bank card numbers cannot be used as a means of customer identification.

For security reasons (including those imposed on the cardholder), the controller (or processor) must not request a copy of the bank card in order to process a payment.

Finally, the CNIL recommends notifying the cardholder if his/her bank card details are breached in order to limit the risk of fraudulent use of the bank card details (e.g., to ask the bank to block the card if there is a risk of fraud).

Future legislation

In light of the anticipated adoption of the Data Protection Regulation, organizations will face more stringent obligations, including privacy-by-design, privacy impact assessments and more transparent privacy policies.

CNIL amends legal framework for whistleblowing schemes in France

Posted on February 25th, 2014 by



In France, the legal framework for whistleblowing schemes is based on a 2005 decision of the French Data Protection Authority (the “CNIL”) adopting a “single authorization” AU-004 for the processing of personal data in the context of whistleblowing schemes. In principle, companies must obtain the CNIL’s approval prior to implementing a whistleblowing scheme. The CNIL’s single authorization AU-004 allows companies to do so simply via a self-certification procedure, whereby they formally undertake that their whistleblowing hotline complies with the pre-established conditions set out in that authorization.

Initially, companies could only self-certify to the CNIL’s single authorization AU-004 if they were required to adopt a whistleblowing scheme to comply with legal or regulatory requirements in specific and limited areas (i.e., finance, accounting, banking and the fight against corruption), or if they could demonstrate a legitimate purpose, which at the time was limited to complying with Section 301(4) of the U.S. Sarbanes-Oxley Act. In 2010, the CNIL broadened the scope of its single authorization by expanding the “legitimate purpose” condition to cover two new areas: compliance with the Japanese Financial Instruments Act and the prevention of anti-competitive practices (i.e., anti-trust matters). On January 30th, 2014, the CNIL amended its single authorization AU-004 for a second time, essentially to add the following areas to the scope of whistleblowing schemes: the fight against discrimination and workplace harassment, compliance with health, hygiene and safety measures in the workplace, and the protection of the environment.

These successive amendments show that the CNIL’s view on whistleblowing schemes has evolved over time and that it has adopted a more realistic and pragmatic approach, given that, in today’s world, many multinational organizations require their affiliates to implement a streamlined, globalized whistleblowing scheme across multiple jurisdictions. Under the revised framework, whistleblowing schemes are still limited to pre-defined areas and cannot be used for general and unlimited purposes. Nevertheless, the broadened scope allows companies and their employees to act more in line with an organization’s internal code of business conduct and the various areas it covers. The CNIL’s decision should therefore enable companies to use their whistleblowing schemes more consistently across jurisdictions and to streamline the reporting process in areas that are commonly recognized as fraudulent or unethical.

The CNIL also clarified its position on anonymous reporting. Historically, the CNIL has considered that anonymous reporting creates a high risk of slanderous reports and can have a disruptive effect on companies. In its decision of January 30, 2014, the CNIL states that organizations must not encourage individuals to make anonymous reports and that, on the contrary, anonymous reporting should remain exceptional. The CNIL also specifies the conditions that apply to anonymous reporting, namely:

  • the seriousness of the reported facts must be established and the facts must be sufficiently precise; and
  • the anonymous report must be handled with specific precautions. For example, the initial recipient of the report should assess whether it is appropriate to disclose the facts within the whistleblowing framework prior to doing so.

The CNIL’s intention here is to limit the risk of slanderous reporting by encouraging companies to establish a clear and transparent system for employees, while ensuring that the appropriate security and confidentiality measures have been implemented, particularly to protect the identity of the whistleblower.

Effectively, the revision of the single authorization AU-004 can also be viewed as a tactical move by the CNIL to funnel companies through the self-certification procedure rather than the ad hoc approval process. It also encourages companies to be more transparent about the purposes for which their whistleblowing schemes are used and allows the CNIL to enforce compliance with the Data Protection Act more efficiently.

The CNIL’s decision does not specify a date of entry into force. These amendments therefore came into force on January 30th, 2014, the date of publication of the decision in the Official Journal. Nor does the decision specify any grace period for complying with the new conditions; companies are therefore required to comply with them immediately.

This article was initially published in the March 2014 edition of The Privacy Advisor.