It was back in 2000, during the heady days of the original ‘dot com boom’ and when Mission Impossible II, Gladiator and What Women Want were riding high at the box office, that the ICO released its first CCTV Code of Practice. However, as the ICO points out, a lot has changed since then: ‘surveillance systems’ are no longer a camera on top of a pole recording images of a town centre via video tape. Despite a further update in 2008, the ICO has rightly felt it necessary to revisit the data protection considerations for organisations in light of intelligent CCTV systems, Automatic Number Plate Recognition (ANPR), body worn video (BWV) and drones (referred to as ‘Unmanned Aerial Systems’ (UAS)). For this reason, the new Code is to be welcomed, as it addresses the myriad considerations and challenges presented by these technologies.
The new CCTV Code
The Code recognises the intrusiveness that such technologies pose and emphasises the need for proportionality and, particularly, ‘necessity’ when an organisation is deciding whether to implement a surveillance system. The Code therefore asserts the need for tools such as Privacy Impact Assessments (PIAs) and Privacy by Design solutions, both at the initial stages of a project and throughout its lifecycle. Indeed, in the case of more intrusive recording systems such as BWV, which offer significant mobility advantages for the recording of images and sounds, the Code emphasises that in addition to the system being a ‘necessary’ and ‘proportionate’ response to an issue, there must also exist ‘a pressing social need’ for it. The bottom line is that the ICO considers that the intrusiveness of such systems means they should not be implemented simply because doing so is possible, affordable or has public support.
Interestingly, when stressing the need for such tools, the Code also reminds developers and vendors of surveillance systems that whilst they may not be ‘data controllers’ under the Data Protection Act, they should nevertheless be aware of and build in PIAs and Privacy by Design tools when taking their products to market. This also extends to purchasers of systems, who are reminded that just because a system is available for purchase or comes with a large storage facility, this does not mean that it is necessarily privacy-compliant. Users must therefore tailor the system accordingly, including by updating the standard ‘factory’ settings for the recording and storage of information.
The nature of these systems means that the volume and intrusiveness of the data collected is now potentially much greater. CCTV cameras, for example, are no longer a passive technology that only records and retains images; they are now proactive and can be used to identify people of interest and keep detailed records of people’s activities – as with ANPR cameras. Additionally, camera feeds may be shared between different public authorities via a common control room operated by a third party in order to cut running costs. Not only does this raise considerations around existing data protection concepts, such as the need to identify ‘controller and processor’ relationships and to apportion responsibilities via appropriate contractual agreements, but it also presents further issues around ‘Big Data’ and individual profiling. This requires the drafting of appropriate data-sharing agreements and data retention policies, and the implementation of data security practices. The ICO also points out that the principles of the Data Protection Act and the corresponding recommendations in the Code should be followed throughout the lifecycle of the system, including system upgrades.
And of course, more ‘traditional’ data protection concepts such as providing effective ‘notice’, being ready to fulfil Subject Access Requests and dealing with FOI requests continue to apply. In particular, the Code highlights the need for appropriate staff training and documented processes, and considers the exemptions that may apply in the case of SARs. Finally, the Code sets out the regulator’s expectation that appropriate signs and audio announcements should be used to inform individuals when they are entering an area where surveillance may be taking place. This will require creative thinking by organisations as they grapple with how to handle these issues; for example, in the case of drones the guidance recommends that the drone operator may wear ‘highly visible clothing’ to identify themselves as such in order to satisfy the fair processing requirements under the DPA.
Clearly the development of new surveillance technologies is creating privacy challenges. The ICO Code and the current sensitivity of the public in relation to surveillance emphasise the importance of ensuring that the use of surveillance systems by organisations does not cross any red lines. At the time of writing, UK news articles have focused on figures showing that the number of organisations given permits to use drones in the skies over Britain, including police forces and film makers, has increased by 80% since the beginning of the year. This ICO guidance is therefore very timely, as calls for increased regulation of surveillance systems are growing. Whilst it is impossible to anticipate all the challenges presented by these nascent technologies, the Code provides useful guidance on the key data protection issues, along with practical examples and check-lists that organisations should draw on when deciding whether to use these new technologies.
The technologies may be more complex, but the ICO stresses that it still expects full compliance with the DPA. Organisations that are contemplating new uses of surveillance should therefore carry out a Privacy Impact Assessment to identify the privacy issues and decide what steps they should take to ensure that their use of a new system is compliant and that any privacy risks are effectively managed.