
A systematic methodology for privacy impact assessments: a design science approach

Research Essay
European Journal of Information Systems

Abstract

For companies that develop and operate IT applications that process the personal data of customers and employees, a major problem is protecting these data and preventing privacy breaches. Failure to adequately address this problem can result in considerable damage to the company’s reputation and finances, as well as negative effects for customers or employees (data subjects). To address this problem, we propose a methodology that systematically considers privacy issues by using a step-by-step privacy impact assessment (PIA). Existing PIA approaches cannot be applied easily because they are poorly structured, imprecise and lengthy. We argue that companies that employ our PIA can achieve ‘privacy-by-design’, which is widely heralded by data protection authorities. In fact, the German Federal Office for Information Security (BSI) ratified the approach we present in this article for the technical field of RFID and published it as a guideline in November 2011. The contribution of the artefacts we created is twofold: first, we provide a formal problem representation structure for the analysis of privacy requirements; second, we reduce the complexity of the privacy regulation landscape for practitioners who need to make privacy management decisions for their IT applications.


References

  • Alhadeff J, Van Alsenoy B and Dumortier J (2012) The accountability principle in data protection regulation: origin, development and future directions. In Managing Accountability through Privacy, pp 49–82, Palgrave Macmillan, New York.

  • Avison D, Lau F, Myers MD and Nielsen PA (1999) Action research. Communications of the ACM 42 (1), 28–45.

  • Baskerville R (1999) Investigating information systems with action research. Communications of the AIS 2 (3), 1–32.

  • Baskerville RL and Wood-Harper AT (1996) A critical perspective on action research as a method for information system research. Journal of Information Technology 11 (3), 235–246.

  • BBBOnLine. (2011) BBBOnLine – BBB Accredited Business Seal. [WWW document] http://www.bbb.org/online/ (accessed 5 December 2011).

  • Bennett C and Bayley R (2007) Privacy Impact Assessments: International Study of their Application and Effects, Loughborough University, UK.

  • BSI (Bundesamt für Sicherheit in der Informationstechnik). (2008) Risk analysis on the basis of IT-Grundschutz, BSI Standard 100-3. [WWW document] https://www.bsi.bund.de/ContentBSI/Publikationen/BSI_Standard/it_grundschutzstandards.html#doc471418bodyText3 (accessed 20 March 2012).

  • BSI (Bundesamt für Sicherheit in der Informationstechnik). (2011a) IT-Grundschutz-Kataloge. [WWW document] https://www.bsi.bund.de/DE/Themen/ITGrundschutz/StartseiteITGrundschutz/startseiteitgrundschutz_node.html (accessed 29 February 2012).

  • BSI (Bundesamt für Sicherheit in der Informationstechnik). (2011b) Privacy impact assessment guideline for RFID applications. [WWW document] https://www.bsi.bund.de/SharedDocs/Downloads/DE/BSI/ElekAusweise/PIA/Privacy_Impact_Assessment_Guideline_Langfassung.pdf;jsessionid=4BE04C3871C6AEB0CD78E76F22F0153A.2_cid244?__blob=publicationFile (accessed 7 March 2012).

  • Cavoukian A (2009a) Privacy by design… Take the challenge. Information and Privacy Commissioner of Ontario (Canada). [WWW document] http://www.ipc.on.ca/images/Resources/PrivacybyDesignBook.pdf (accessed 10 October 2012).

  • Cavoukian A (2009b) Privacy by design: the 7 foundational principles. Information and Privacy Commissioner of Ontario (Canada). [WWW document] http://privacybydesign.ca/about/principles/ (accessed 10 October 2012).

  • Clarke R (2009) Privacy impact assessment: its origins and development. Computer Law & Security Review 25 (2), 123–135.

  • Clarke R (2011) An evaluation of privacy impact assessment guidance documents. International Data Privacy Law 1 (2), 111–120.

  • Cranor LF et al (2006) The platform for privacy preferences 1.1 (P3P1.1) Specification – W3C Working Group Note 13 November 2006. [WWW document] http://www.w3.org/TR/P3P11/ (accessed 1 March 2012).

  • Davison RM, Martinsons MG and Kock N (2004) Principles of canonical action research. Information Systems Journal 14 (1), 65–89.

  • De Hert P, Kloza D and Wright D (2012) A privacy impact assessment framework for data protection and privacy rights. Deliverable D3 of the EU PIAF Project – Recommendations for a privacy impact assessment framework for the European Union. Brussels.

  • Director of the Spanish Data Protection Agency (SDPA). (2009) Standards on the Protection of Personal Data and Privacy – The Madrid Resolution. 5 November 2009, Madrid.

  • ENDORSE. (2011) ENDORSE Project. [WWW document] http://ict-endorse.eu/ (accessed 1 March 2012).

  • EC (European Parliament and Council of the European Union). (1995) Directive 95/46/EC on the protection of individuals with regard to the processing of personal data and on the free movement of such data. Official Journal of the European Communities L 281/31: 31–50.

  • EC (Commission of the European Communities). (2009) Commission Recommendation on the Implementation of Privacy and Data Protection Principles in Applications Supported by Radio-Frequency Identification, EC (Commission of the European Communities), Brussels.

  • EC (European Commission). (2012) Proposal for a regulation of the European Parliament and of the Council on the protection of individuals with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation), COM(2012) 11 final. 25 January, Brussels.

  • EuroPriSe. (2011) EuroPriSe – European privacy seal. [WWW document] https://www.european-privacy-seal.eu/ (accessed 5 December 2011).

  • FTC (Federal Trade Commission). (1998) Fair Information Practice Principles.

  • Fujitsu (2010) Personal data in the cloud: a global survey of consumer attitudes. [WWW document] http://www.fujitsu.com/global/news/publications/dataprivacy.html (accessed 20 March 2012).

  • Gregor S (2006) The nature of theory in information systems. MIS Quarterly 30 (3), 611–642.

  • Hevner AR (2007) A three cycle view of design science research. Scandinavian Journal of Information Systems 19 (2), 87–92.

  • Hevner AR, March ST, Park J and Ram S (2004) Design science in information systems research. MIS Quarterly 28 (1), 75–105.

  • Iivari J (2007) A paradigmatic analysis of information systems as a design science. Scandinavian Journal of Information Systems 19 (2), 39–64.

  • Information & Privacy Commissioner of Ontario (IPCO). (2011) Privacy by design. [WWW document] http://privacybydesign.ca (accessed 7 February 2011).

  • intelligentPIA (iPIA). (2011) intelligentPIA – a privacy impact assessment tool. [WWW document] http://www.wu.ac.at/ec/research/ipia (accessed 1 March 2012).

  • INFSO (European Commission, Information Society and Media Directorate-General). (2011) Privacy and data protection impact assessment framework for RFID applications, 12 January 2011, Brussels.

  • ISO (International Organization for Standardization). (2002) ISO FDIS 22307 Financial Services – privacy impact assessment.

  • ISO (International Organization for Standardization). (2008) ISO/IEC 27005 Information technology – security techniques – information security risk management.

  • ISO (International Organization for Standardization). (2009) ISO/IEC 31000 risk management – principles and guidelines on implementation.

  • ISO (International Organization for Standardization). (2011) ISO/IEC CD 29101.4 Information technology – security techniques – privacy architecture framework.

  • Jeselon P and Fineberg A (2011) A Foundational Framework for a PbD-PIA, Information and Privacy Commission Canada, Toronto, ON.

  • Linstone HA and Turoff M (Eds) (1975) The Delphi Method: Techniques and Applications, Addison-Wesley, London.

  • Moor JH (1998) Reason, relativity, and responsibility in computer ethics. Computers and Society 28 (1), 14–21.

  • Naoe K (2008) Design culture and acceptable risk. In Philosophy and Design – From Engineering to Architecture (Vermaas PE, Kroes P, Light A and Moore SA, Eds) pp 119–130, Springer, Amsterdam.

  • NIST (National Institute of Standards and Technology). (2002) Risk management guide for information technology systems, NIST Special Publication 800-30.

  • OECD (Organisation for Economic Cooperation and Development). (1980) Guidelines on the protection of privacy and transborder flows of personal data.

  • Pries-Heje J, Baskerville R and Venable JR (2008) Strategies for design science research evaluation. ECIS 2008 Proceedings, paper 87.

  • Raab C and Wright D (2012) Surveillance: extending the limits of privacy impact assessments. In Privacy Impact Assessment (Wright D and De Hert P, Eds) pp 363–383, Springer, Dordrecht.

  • Rost M (2011) Datenschutz in 3D. Datenschutz und Datensicherheit – DuD 35 (5), 351–354.

  • Rost M and Bock K (2011) Privacy by design und die Neuen Schutzziele. Datenschutz und Datensicherheit – DuD 35 (1), 30–35.

  • Rost M and Pfitzmann A (2009) Datenschutz-Schutzziele – revisited. Datenschutz und Datensicherheit – DuD 33 (6), 353–358.

  • Scannapieco M, Missier P and Batini C (2005) Data quality at a glance. Datenbank-Spektrum 14/2005.

  • Seibold H (2006) IT-Risikomanagement, Oldenbourg Verlag, München, Wien.

  • Shroff M (2007) Privacy Impact Assessment Handbook, Report, Office of the Privacy Commissioner, Auckland, New Zealand.

  • Siponen M (2006) Information security standards – focus on the existence of process, not its content. Communications of the ACM 49 (8), 97–100.

  • Siponen M and Willison R (2009) Information security management standards: problems and solutions. Information & Management 46 (5), 267–270.

  • Solove DJ (2002) Conceptualizing privacy. California Law Review 90 (4), 1087–1156.

  • Solove DJ (2006) A taxonomy of privacy. University of Pennsylvania Law Review 154 (3), 477–560.

  • Spiekermann S (2008) User Control in Ubiquitous Computing: Design Alternatives and User Acceptance, Shaker Verlag, Aachen.

  • Spiekermann S (2012) The challenges of privacy by design. Communications of the ACM 55 (7), 38–40.

  • Stewart B (1996) Privacy impact assessments. Privacy Law and Policy Reporter 3 (4), 61–64.

  • Susman GI and Evered RD (1978) An assessment of the scientific merits of action research. Administrative Science Quarterly 23 (4), 582–603.

  • TRUSTe. (2011) TRUSTe privacy seal. [WWW document] http://www.truste.com/ (accessed 5 December 2011).

  • UK Information Commissioner’s Office (ICO). (2009) Privacy Impact Assessment Handbook (Version 2.0), UK Information Commissioner’s Office (ICO), London.

  • Van Gorp A and Van de Poel I (2008) Deciding on ethical issues in engineering design. In Philosophy and Design – From Engineering to Architecture (Vermaas PE, Kroes P, Light A and Moore SA, Eds) pp 77–89, Springer, Amsterdam.

  • Venable J (2006) A framework for design science research activities. In Proceedings of the Information Resource Management Association Conference, Washington, DC, USA, 21–24 May.

  • Walls JG, Widmeyer GR and El Sawy OA (1992) Building an information system design theory for vigilant EIS. Information Systems Research 3 (1), 36–59.

  • Warren SD and Brandeis LD (1890) The right to privacy. Harvard Law Review 4 (5), 193–220.

  • Westin AF (1967) Privacy and Freedom, Atheneum, New York.

  • Wright D (2011) Should privacy impact assessments be mandatory? Communications of the ACM 54 (8), 121–131.

  • Wright D and De Hert P (2012) Privacy Impact Assessment, Springer, Dordrecht.

  • Wright D, Wadhwa K, De Hert P and Kloza D (2011) A Privacy Impact Assessment Framework for data protection and privacy rights. Deliverable D1 of the EU PIAF Project – Prepared for the European Commission Directorate General Justice. Brussels.

Acknowledgements

We would like to thank the reviewers of an earlier ECIS conference article version as well as Dariusz Kloza and Gabriela Bodea for helpful comments on this article draft, the German Federal Office for Information Security (BSI) and the German Data Protection Authorities for their willingness to verify, discuss and publish the methodology presented in this article, Christian von Grone (CIO of Gerry Weber) and Pierre Blanc (IT Innovations Management at Carrefour) for helping us improve our constructs and apply them to the retail industry, Markus Sprafke for challenging our methodology for the automotive RFID industry, Wolf Rüdiger Hansen for integrating our methodology into PIA workshops for the RFID industry and finally Julian Cantella for editing the English version of this paper.

Appendices

Appendix A

Privacy targets and how they address activities that can create harm

To create a comprehensive overview of privacy targets, we incorporated all relevant regulatory frameworks, at least from a European point of view. The resulting list of privacy targets presented in Table A1 is based on:

Table A1 Privacy targets and how they address activities that can create harm
  • all elements of the current and proposed EU data protection regulation (EC, 1995; EC, 2012);

  • all data protection principles included in the OECD privacy guidelines (OECD, 1980) and Fair Information Practice Principles (FTC, 1998);

  • all elements of the ISO/IEC Privacy Architecture Framework (ISO, 2011);

  • data protection targets proposed by Rost & Bock (2011) that emphasise individuals’ information self-determination rights.

These mostly European legal privacy ‘targets’ are then evaluated for their bearing on the ‘harming activities’ identified in the American legal system. For each combination, we ask whether the privacy harm would still be likely to occur if the privacy target were effectively addressed. The harmful activities in the table originate from Solove’s taxonomy of privacy (Solove, 2006), which offers the most comprehensive and structured view on this matter. The activities in the table are restricted to the context of information privacy. For example, decisional interference (P5.4) is restricted to the consideration of decisions that are derived from collected data. In contrast, governmental interference, which normally incorporates bodily and territorial privacy, is excluded from this table.

Three independent privacy experts judged the relationship between privacy targets and harmful activities. Where such a relationship exists, the intersection is marked. PIA assessors can use the table’s judgements to determine whether to consider a privacy target in their context.

Because P4.3 is functionally only an extension of P4.1 and P4.2, it is the only target that is not explicitly assigned to any of the activities.

When conducting a PIA, it might be helpful to choose the activities that are relevant for the system or business case, then identify the privacy targets to consider.
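To illustrate the structure behind Table A1 and this activity-first lookup, the following minimal sketch represents the target-to-harm mapping as a matrix in code. This is our illustration only: the identifiers follow the paper’s numbering style (e.g., P5.4), but the specific assignments shown are hypothetical placeholders rather than the experts’ actual judgements.

```python
# Illustrative sketch of Table A1's structure: which privacy targets
# address which of Solove's harmful activities. The assignments below
# are hypothetical placeholders, NOT the judgements from the paper.

TARGET_TO_ACTIVITIES = {
    "P1.1": {"surveillance", "aggregation"},       # hypothetical
    "P4.1": {"secondary use", "disclosure"},       # hypothetical
    "P5.4": {"decisional interference"},           # data-derived decisions only
}

def relevant_targets(activities_in_scope):
    """Appendix A's suggested order: pick the harmful activities relevant
    to the system or business case first, then derive the privacy targets
    an assessor should consider."""
    chosen = set(activities_in_scope)
    return sorted(target for target, activities in TARGET_TO_ACTIVITIES.items()
                  if activities & chosen)

print(relevant_targets({"aggregation", "disclosure"}))  # -> ['P1.1', 'P4.1']
```

An assessor would populate the mapping from Table A1 itself and query it with the activities relevant to the business case at hand.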

Appendix B

Analysis of the relative utility of the proposed PIA methodology

After evaluating the methodology’s absolute utility using action research, we evaluate the methodology’s relative utility by comparing it with other risk assessment approaches.

Although PIAs are only beginning to be used in practice, there are several proposals on how to conduct them. An extensive review of the proposed methodologies can be found in Clarke (2011) as well as in the first deliverable of the PIAF project conducted for the European Commission (Wright et al, 2011). None of the existing PIA approaches has become a recognised standard to date. However, one of the most heralded PIAs in Europe is the UK PIA Handbook (ICO, 2009); we therefore compare this UK PIA to our approach. Second, we compare our proposed methodology with a recognised standard, the ISO 31000 risk management standard (ISO, 2009), which describes how to handle risks in organisations generally. We compare our PIA methodology with this privacy-independent standard because privacy is just one of several organisational risks: if an organisation is to address privacy regularly as an ethical risk, privacy must become part of its overall risk management processes. PIAs, and our approach in particular, should therefore fit into such processes.

Before we dive into the detailed comparison, we must discuss the differences in perspective between the three risk assessments. Both the UK PIA and ISO 31000, like other current methodologies, take a project as their subject of analysis. Our methodology, in contrast, focuses on systems. We consciously adopt a system-centric perspective for three reasons. First, our PIA methodology aims to lead a project team towards privacy-by-design for a new system; we therefore concentrate more on the concrete risks of a system’s design and less on the organisational framework of a new project. We take an engineering perspective that is supported by business processes where needed, whereas the other two approaches embrace organisational risk reflections and are thus detached from system design. Second, the project-centric perspective makes assessors operate within a project’s scope, so they can overlook the larger context of the systems. For example, personal data flows may go beyond the systems considered in the immediate project scope; such flows may be defined as out of project scope even though they are highly relevant from a privacy perspective. Finally, IT manufacturers develop systems that are initially independent of deployment. Since they should use PIAs in their system development lifecycle, it also makes sense for them to focus on a system.

To compare our PIA methodology to the UK PIA Handbook and the ISO 31000 standard, we use seven PIA quality criteria that were recently published by Wright & De Hert (2012):

  1. Early start: A PIA process should start as early as possible so that it can influence the design of a project.

  2. Project description: A project subjected to a PIA should be adequately described, including: (1) a general description of the project, (2) information flows and (3) requirements of legal data protection instruments and other types of privacy.

  3. Stakeholder consultation: An organisation should identify stakeholders, inform them about the project, seek their views and duly take them into consideration.

  4. Risk management: The assessor should identify, assess and mitigate all possible risks resulting from a project using (1) risk assessment and (2) risk mitigation approaches.

  5. Legal compliance check: The assessor should ensure that the project complies with any legislative or other regulatory requirements.

  6. Recommendation and report: The assessor should (1) provide recommendations and an action plan, (2) justify decisions about and implementation of recommendations and (3) provide a PIA report.

  7. Audit and review: PIA reports should be audited or reviewed externally.

All three methodological approaches (UK PIA, ISO 31000 and our PIA) agree that an assessment should start early. The UK PIA links the assessment to a project lifecycle and recommends that the assessment begin in the initiation phase of a project; it also requires a cyclical approach, meaning that the different phases of the PIA can be re-executed at any time. ISO 31000 sees risk management as an integral part of organisational processes and asks for a ‘systematic, timely and iterative approach that is responsive to change’ (ISO, 2009). Our PIA methodology is likewise linked to a process, namely a system’s development process, which ensures a systematic and iterative course of action. Timeliness is ensured because our PIA starts at the beginning of system development or when a system is upgraded.

Regarding a description of the overall project and system, both the UK PIA and ISO 31000 require a general description of the project. The UK PIA Handbook requires a project outline and project plan. ISO 31000 requires a description of the internal organisational context, such as organisational objectives and attitudes towards risk, as well as the context for the risk management process. Our proposed methodology is much more specific because it focuses on the aspects of a system that raise concrete privacy risks. It demands a more systematic system characterisation in Step 1, requiring the assessor to take four distinct views on the system and its IT infrastructure context (system, functional, data, physical). Because it is system-centred, our approach provides less information about the general project and organisational context. We believe that companies can achieve privacy-by-design in a more cost-effective way by focusing on the immediate risks inherent in a system. However, because privacy-by-design includes governance measures and strategic decisions on the handling of personal data assets, we acknowledge that our PIA approach should be complemented by reflections on organisational privacy risk attitudes and risk responsibilities. Such reflections can take place before our PIA process begins and can inform later judgements and reports.

Regarding the description of data flows, our system-centric perspective gives us an advantage over the two other approaches. We explicitly introduce a ‘data view’ that requires data categories and data flow diagrams of internal and external data flows, including actors and data types. In contrast, UK PIA’s phase 1 contains a background paper that can describe flows of personal information. ISO 31000’s internal context simply contains information flows without further detail.
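As a hedged illustration of what such a ‘data view’ could capture in machine-processable form, consider the sketch below. The schema and the retail example are our own assumptions; the methodology prescribes data categories and data flow diagrams (one of the four Step 1 views), not any particular data structure.

```python
from dataclasses import dataclass

@dataclass
class DataFlow:
    """One edge in a data flow diagram: which actor sends which category
    of personal data to which recipient (hypothetical schema)."""
    source: str            # sending actor, e.g. "POS terminal"
    recipient: str         # receiving actor, e.g. "loyalty backend"
    data_category: str     # e.g. "purchase history"
    data_types: tuple      # concrete fields flowing with this category
    external: bool         # True if the flow leaves the system boundary

# Hypothetical example for a retail loyalty system:
flows = [
    DataFlow("POS terminal", "loyalty backend", "purchase history",
             ("customer_id", "items", "timestamp"), external=False),
    DataFlow("loyalty backend", "marketing partner", "purchase history",
             ("customer_id", "items"), external=True),
]

# External flows deserve particular attention: they are exactly the flows
# a project-centric scope can miss even though they matter for privacy.
for f in flows:
    if f.external:
        print(f"external flow: {f.source} -> {f.recipient} ({f.data_category})")
```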

The recommended description of privacy requirements cannot be found in ISO 31000 because it is a general risk management standard and not privacy-specific. In contrast, the UK PIA Handbook extensively explains the concept of privacy and describes four aspects of privacy that could be considered in a PIA: privacy of personal information, privacy of the person, privacy of personal behaviour and privacy of personal communications. This categorisation of privacy into four spheres is very intuitive and helps readers understand the ‘chameleon-like’ privacy concept (Solove, 2006). We take a different approach, though. In our methodology’s ‘definition of privacy targets’, we list privacy targets that engineers should use as their privacy design goals. Our targets are both more concrete and more extensive than the UK PIA’s four categories. In contrast to the UK PIA, we also ensure that European legal requirements are covered. Furthermore, in our approach assessors must describe and analyse each privacy target against the background of their respective context. For these reasons, we consider our methodology to be more firmly anchored in the concept of privacy and more practical to apply.

The third quality criterion, stakeholder consultation, is part of all three assessment approaches. The UK PIA Handbook analyses stakeholders and establishes a consultative group during its preparation phase; it also involves stakeholders in its consultation and analysis phase. ISO 31000 contains an activity called ‘communication and consultation’ that involves communication with internal and external stakeholders and a consultative team approach. Neither approach specifies the precise input demanded from stakeholders. Although we again include a less detailed description of organisational structures, we do include stakeholders in our approach and give them a concrete role. In Step 3, for example, where the protection demand for the different privacy targets is evaluated, we explicitly recommend involving stakeholders.

For risk management, the UK PIA Handbook does not provide specific guidance. Its consultation and analysis phase contains only three generic cues: risk analysis, identification of problems and search for solutions. It does not specify how these activities should be concretely pursued. ISO 31000’s risk assessment includes three activities that offer detailed recommendations: risk identification, risk analysis and risk evaluation. All three activities are reflected in our methodology. First, ISO 31000 proposes that an organisation apply risk identification tools and techniques that are suited to its objectives and capabilities. In terms of techniques, we chose damage scenarios and consideration of both the operator and the data subject perspective (Step 3), as well as a systematic identification of threats for each target that uses a numbering scheme (Step 4). Second, ISO 31000 recommends that organisations consider the likelihood that threats will occur and analyse risk qualitatively, semi-quantitatively or quantitatively. Step 4 of our methodology requires that organisations determine the likelihood that each threat will occur. For both the likelihood of a threat and the data protection demand, we chose a qualitative approach. We accept qualitative judgements because human privacy risks are often harder to describe or quantify than the loss of an asset.
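The following is a minimal sketch of how Step 4’s threat catalogue could be recorded. The numbering scheme (threat IDs derived from target IDs), the four-point qualitative likelihood scale and all concrete values are our assumptions for illustration, not taken from the paper.

```python
# Step 4 sketch: threats enumerated per privacy target with a numbering
# scheme (here T<target>.<n>) and a qualitative likelihood rating.
# The scale, IDs, descriptions and ratings are hypothetical examples.

LIKELIHOOD_SCALE = ("negligible", "limited", "significant", "high")

threats = {
    "T1.1.1": {"target": "P1.1",
               "description": "data collected beyond the stated purpose",
               "likelihood": "significant"},
    "T1.1.2": {"target": "P1.1",
               "description": "retention period exceeded",
               "likelihood": "limited"},
}

# Threats rated at or above a chosen qualitative cut-off are carried
# forward into control selection (Step 5):
cutoff = LIKELIHOOD_SCALE.index("significant")
carried_forward = sorted(tid for tid, t in threats.items()
                         if LIKELIHOOD_SCALE.index(t["likelihood"]) >= cutoff)
print(carried_forward)  # -> ['T1.1.1']
```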

Risk mitigation is described in ISO 31000’s activity ‘risk treatment’, which involves selecting risk treatment options, balancing costs against benefits and considering stakeholders. Our methodology treats this aspect of mitigation in Steps 5 (identification and recommendation of controls suited to protect against threats) and 6 (assessment and documentation of residual risks). In contrast to ISO 31000, we treat risk mitigation more extensively. We do so not only by specifically addressing privacy concerns but also by explaining how the mitigation is applied; in our methodology, organisations systematically identify controls for all likely threats, use a numbering scheme, define three levels of rigour, match rigour and protection demand, and provide an implementation plan. We address what ISO 31000 calls for but do so in a more detailed manner.
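The matching of control rigour to protection demand can be illustrated with a small sketch. The numeric three-level scale and its example semantics below are named by us for illustration; the paper defines three levels of rigour in prose without fixing such labels.

```python
# Step 5 sketch: every recommended control is implemented at one of
# three levels of rigour, chosen to meet the protection demand from
# Step 3. Level semantics and the mapping are illustrative assumptions.

RIGOUR_FOR_DEMAND = {
    "low": 1,      # e.g. organisational policy only
    "medium": 2,   # e.g. policy plus technical enforcement
    "high": 3,     # e.g. technical enforcement plus independent audit
}

def sufficient(implemented_rigour: int, protection_demand: str) -> bool:
    """A control satisfies a privacy target when its rigour level meets
    or exceeds the level required for the target's protection demand."""
    return implemented_rigour >= RIGOUR_FOR_DEMAND[protection_demand]

# A control implemented at level 2 covers a 'medium' demand but must be
# strengthened (or the residual risk documented in Step 6) for 'high':
for demand in ("medium", "high"):
    print(demand, "->", "sufficient" if sufficient(2, demand) else "strengthen")
```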

Considering the fifth quality criterion, it is not entirely clear whether Wright & De Hert (2012) and De Hert et al (2012) recommend an actual legal compliance check, which is conducted later and in addition to a PIA (ICO, 2009), or whether they recommend ensuring legal compliance as part of the assessment. All three approaches aim to achieve compliance with legal and regulatory requirements. The UK PIA Handbook and our methodology emphasise that privacy implications are a fast moving target in the changing technical world and that legal privacy targets may often not address all ensuing challenges.

All three assessment approaches require organisations to provide recommendations and justify their implementation. In the UK PIA’s consultation and analysis phase, organisations create an issues register that lists the avoidance measures that were considered, explains why they were rejected or adopted, and identifies any that are not addressed. For ISO 31000’s activity ‘risk treatment’, organisations select risk treatment options that balance costs against benefits, recognise stakeholder views, and prepare and implement a risk treatment plan. Similar to these approaches, Step 5 of our methodology requires that organisations recommend controls; however, organisations must also choose a justified level of rigour for each control to meet the level of protection demand identified in Step 3. Furthermore, we offer an extensive list of exemplary technical and non-technical controls for each potential privacy risk. Again, our methodology provides greater specificity, practical applicability and step-by-step guidance. A cost-benefit analysis, a control implementation plan and the documentation of residual risks are then required in our Step 6. All three approaches require documentation of the assessment process. The UK PIA and our methodology call it a PIA report and recommend publishing it. Our PIA goes a step further and offers a set of concrete content elements that a PIA report can include for different target audiences (Step 7).

Finally, only our proposed methodology meets the last quality criterion, which recommends that a PIA report should be externally audited and reviewed. To facilitate external audit and review of the PIA report, we recommend creating a machine-readable PIA report.
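As an illustration of what a machine-readable PIA report might look like, the sketch below serialises a few report elements to JSON. The field names and structure are our assumptions; the methodology recommends machine readability to ease external audit and review without fixing a format.

```python
import json

# Step 7 sketch of a machine-readable PIA report. The structure and
# field names are hypothetical; the methodology only recommends that
# the report be machine readable to ease external audit and review.

report = {
    "system": "retail loyalty programme",                # Step 1 characterisation
    "privacy_targets": ["P1.1", "P4.1"],                 # Step 2
    "protection_demand": {"P1.1": "high"},               # Step 3
    "threats": {"T1.1.1": "collection beyond purpose"},  # Step 4
    "controls": {"C1.1.1": {"threat": "T1.1.1", "rigour": 3}},  # Step 5
    "residual_risks": ["re-identification via partner data"],   # Step 6
}

# A structured serialisation lets external auditors parse, diff and
# review the report programmatically:
print(json.dumps(report, indent=2))
```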

Table B1 shows that our PIA methodology is the most advanced of all three approaches with respect to the given quality criteria for a best practice PIA process.

Table B1 Overview of the analysis of the relative utility

Cite this article

Oetzel, M., Spiekermann, S. A systematic methodology for privacy impact assessments: a design science approach. Eur J Inf Syst 23, 126–150 (2014). https://doi.org/10.1057/ejis.2013.18
