This may sound like an overstatement, but privacy impact assessments (PIAs) are likely to become the most vital item in the privacy professional’s toolkit. One of the earliest guidance documents on PIAs, the New Zealand Office of the Privacy Commissioner’s PIA Handbook, describes the concept as a systematic process that evaluates a proposal in terms of its impact upon privacy. That is a slightly abstract description, but it captures some of the crucial elements that make this tool so useful. The reference to a PIA being a systematic process means that those who put it into practice should ideally follow an established approach that suits the culture and operations of the organisation. In other words, whatever a PIA is designed to deliver, it is essential that it is embedded in the workings of the organisation and is seen as sufficiently meaningful and constructive.
Being constructive is in fact more than an aspiration; it is the essence of the whole idea. Fans of PIAs are quick to point out that a PIA should be distinguished from a privacy compliance audit. The reason for this is that audits look at whether and how effectively compliance is being achieved. PIAs, on the other hand, look at a proposed new system, operation or product and tell us how it will fare from a privacy perspective. The emphasis is on the future, which, if anything, makes PIAs ideal for assessing the privacy implications of ever-evolving technology. But in addition to being as dynamic as the technological developments and proposed activities assessed through them, PIAs are also meant to make a privacy-friendly contribution to such developments and activities.
PIAs are also constructive because they seek to allow the aims of the proposed activity to be met as far as possible. This feature makes the tool particularly useful for privacy professionals. There is not much point in trying to defend privacy protection as something that adds value if the outcome of a privacy assessment is to close doors to innovation and progress. Privacy professionals need to be seen as being on the side of the organisation – whilst remaining robust in their outlook – and PIAs are an effective tool for doing that. PIAs should be rigorous yet simple to execute, and meaningful as well as flexible. But above all, they should send a powerful message within the organisations where they take place: assessing the impact of an intended development on people’s privacy, and coming up with sensible ways of preventing unjustifiable risks, is for everyone’s benefit, from software developers to customers and from suppliers to the CEO.
The point about preventing risks should not be underestimated. This is something that all of the available guidance seems to emphasise. Even when looked at from a European perspective, the justification for doing a PIA rests on minimising risks to privacy. The UK Information Commissioner’s new draft Code of Practice on PIAs makes numerous references to the fact that PIAs are there to spot all types of privacy-related risks, including risks to individual privacy, compliance risks and related corporate or organisational risks. This gives us a very visible clue of the direction in which even EU regulators are looking, which is incredibly helpful in guiding the strategic thinking of privacy professionals.
This is what makes PIAs particularly relevant in the context of global compliance. A compliance audit is more likely to focus on the legal obligations of a given regime, but when trying to address privacy needs on a global scale, a PIA will be a much more useful and practical tool. To the extent that a PIA needs to follow a methodology, this can be based on globally recognised principles rather than narrowly prescribed legal obligations. A PIA is more than a mechanism for compliance. It is a mechanism for making organisations think about privacy at the time when the ideas are flowing and the level of enthusiasm is high, and it does that through a risk-based, globally applicable process. Welcome to privacy management for the 21st century!
This article was first published in Data Protection Law & Policy in August 2013.