Europe’s new data protection law, the General Data Protection Regulation (or “GDPR” for short), is an undeniably complex piece of legislation. Privacy professionals everywhere, this one included, have a lot to learn and - thankfully - there have been many excellent articles written on the topic. For the most part, these focus on the changes that the GDPR will bring about and, specifically, the compliance actions that organisations must take.
By contrast, less has been said about what the new law will NOT require. This might sound unsurprising (why would anyone want to know about things they don’t need to do?) but it’s important to remember that, during the course of its adoption, the text of the GDPR changed many times. As a result, some provisions that were originally proposed were dropped from the final law (or otherwise changed beyond recognition), and this inevitably created a certain amount of confusion. Then throw in a sprinkling of occasional misreporting, together with a dash of Chinese whispers, and suddenly knowing what the law does NOT require becomes almost as important as knowing what it does require.
Because of that, I thought I’d dispel a few of the most common misconceptions the Fieldfisher Privacy, Security and Information team have heard. In no particular order, these are:
1. Controllers don’t need data processing agreements with processors because the GDPR imposes direct obligations on processors: WRONG! You wouldn’t think it, but we’ve heard this one quite a few times. Let’s set the record straight: data processing agreements are definitely still needed, and there’s a whole host of contractual terms that must be put in place between a controller and its processor(s). If anyone tells you otherwise, sigh heavily and tell them to go and read Art. 28 of the GDPR. Watch with satisfaction as their eyes widen.
2. When relying on consent to process personal data, consent must be explicit: WRONG! This was a hotly debated topic during the passage of the GDPR, but the final text requires that consent must be “unambiguous”, not “explicit” (Art. 4(11)). Explicit consent is required only for processing sensitive personal data - in this context, nothing short of “opt in” will suffice (Art. 9(2)). But for non-sensitive data, “unambiguous” consent will do - and this allows the possibility of implied consent if an individual’s actions are sufficiently indicative of their agreement to processing.
3. Everyone needs a Data Protection Officer: WRONG! A DPO must be appointed only by: (a) public authorities, (b) organisations that engage in large-scale systematic monitoring, or (c) organisations that engage in large-scale processing of sensitive personal data (Art. 37). If you don’t fall into one of these categories, then you don’t have to appoint a DPO - though appointing one is, of course, still to be encouraged in the interests of good practice!
4. Controllers and processors will only have to answer to a single data protection authority: WRONG! This may have been the original intent when the draft GDPR was published back in 2012, but it’s not where the final law ended up. While it’s true that organisations will have a ‘lead’ supervisory authority, other supervisory authorities can intervene if an issue relates to a controller or processor established in their Member State or if data subjects in their Member State are otherwise substantially affected (Art. 56).
5. Biometric data is sensitive data under the GDPR: WRONG(ISH)! You can be forgiven for thinking this. Biometric data can be sensitive data under the GDPR - but only if used for the purpose of “uniquely identifying” someone (Art. 9(1)). A bunch of photographs uploaded onto a cloud service would not be considered sensitive data, for example, unless used for identification purposes - think, for instance, of airport security barriers that recognise you from your passport photograph.
6. Individuals have an absolute right to be forgotten: WRONG! The GDPR refers to the ‘right to be forgotten’ as the ‘right to erasure’ (Art. 17). However, unlike the right to opt out of direct marketing, it’s not an absolute right. Organisations may continue to process data if the data remains necessary for the purposes for which it was originally collected, and the organisation still has a legal ground for processing the data under Art. 6 (and, if sensitive data is concerned, Art. 9 too).
7. Parental consent is always required when collecting personal data from children: WRONG! Parental consent is required only if the processing itself is legitimised on the basis of consent. If the processing is based on another lawful processing ground (for example, compliance with a legal obligation, vital interests, or possibly even legitimate interests), then parental consent is not required. See Art. 8(1) if you don’t believe us!
8. Every business will be subject to new data portability rules: WRONG! Data portability requirements apply only when processing is based on consent or contractual necessity (Art. 20(1)). The right does not apply when, for example, processing is based on legitimate interests. This is an important strategic point for businesses to consider when deciding upon the lawful grounds on which they will process personal data.
9. Profiling activities always require consent: WRONG! Consent is only required if the profiling activities in question “produce legal effects” or “significantly affect” a data subject (Art. 22(1)). The obvious implication here is for the targeted advertising industry - whether you like or loathe targeted advertising, it’s a bit of a stretch to say that data processing for the purpose of serving targeted ads has these consequences. Put another way, the GDPR does not generally mandate consent for the profiling activities of ad tech companies.
10. Pseudonymised data (e.g. hashed data) are treated exactly like any other personal data under the GDPR: WRONG(ISH)! The GDPR makes clear that data protection rules apply to pseudonymised data, but pseudonymised data implicitly benefits from certain relaxations under the GDPR - for example, mandatory data breach reporting may arguably not apply if data has been securely pseudonymised (Art. 33 - on the basis that securely pseudonymised data is “unlikely” to create risk). See also Art. 11, which seemingly relaxes certain data subject rights for pseudonymised data.
That’s it for our top 10 list. If you’ve heard other good ones, drop us a line and maybe we’ll update this post at a future point. For now, though, consider yourself informed - and next time you hear any of these come up in conversation, be sure to show off your data protection prowess and set the record straight!