Privacy domains is 1 of the 4 sections of the Privacy Maturity Assessment Framework (PMAF). There are 6 elements to assess.
Before you start
It’s helpful to read:
To complete your agency’s self-assessment, download and use the 2 forms.
1. Require a clear understanding of the purpose
Require a clear understanding of the purpose and necessity of the collection, use or sharing of personal information.
The Data Protection and Use Policy’s (DPUP's) Principles and Guidelines align strongly with good privacy practices. Agencies can adapt them for their context and the amount and type of personal information they collect and use.
If your agency’s privacy policies and practices align with DPUP Principles and Guidelines, then you do not need to update them at this time to achieve ‘managed’.
However, if your agency plans to:
- rewrite its privacy policies, it should reference DPUP’s Principles and Guidelines
- develop or review its policies, services or programmes, it should consider using the DPUP toolkit to help guide this work.
Criteria 1: Defining the purpose
Clarity of purpose is vital to determining whether an agency needs to collect personal information and, if so, what personal information is needed to meet the purpose.
It’s also vital to determine whether the ways in which an agency intends to use and share the information are lawful, appropriate and support public service values.
Clarity of purpose is the anchor for many other things, such as consent forms, privacy statements or notices, privacy impact assessments and more.
The DPUP Purpose Matters Guideline provides useful guidance to help agencies define their purposes for the collection, use or sharing of personal information for projects and business processes.
The agency’s advice on defining the purpose for the collection, use or sharing of personal information is ad hoc or reactive.
The agency’s guidance on defining the purpose for the collection, use or sharing of personal information is focused on compliance and risk.
The agency’s guidance is appropriately aligned with DPUP’s good-practice advice to accurately define purposes for collection, use or sharing of personal information for projects and business processes.
Criteria 2: Identifying choices
Context can affect whether people have a choice about providing personal information.
DPUP’s Transparency and Choice Guideline provides useful guidance to help agencies consider whether they can offer choices to people when collecting their personal information.
People may have:
- no choice — for example, a person may be required by a specific statutory provision to provide personal information
- limited choice — for example, a person may need to accept that providing some level of personal information is an essential part of using the service in question
- some choices — for example, a person may choose to enrol in a support programme where they will have choices about what experiences they do or do not share, and with whom.
Sometimes there may be good reasons for not offering service users choices, for example, if it would undermine the purpose of the collection, or it’s just not possible to do so.
If your agency offers choices when appropriate, then ‘managed’ would be a suitable maturity level.
It’s unusual for service users to be offered any choices, and when they are, it’s done on an ad hoc or reactive basis.
Individual initiatives take steps to identify practical choices that service users may be given regarding the collection or use of their personal information.
Additional processes are explicitly applied to identify when and how choices may be offered or accommodated when appropriate, aligning with DPUP’s good-practice advice.
Criteria 3: Reducing personal information
Agencies should assess the purpose of collecting personal information to ensure they are collecting only what's needed. The over-collection of personal information can negatively affect people’s trust and confidence in the agency collecting it. Reducing the collection of personal information can also reduce the impact of privacy breaches.
To assess their purpose, agencies should:
- be clear about the outcomes to be achieved
- be clear about the method that will be used to achieve the outcomes
- consider in what context the information will be collected and used, and what this might imply for decisions about that collection or use.
‘Just in case’ or ‘we have it so let’s use it’ are not sufficient reasons to collect or use personal information.
DPUP’s Purpose Matters Guideline provides useful guidance to help agencies assess their purpose for collection and the kinds of information to be collected.
Any steps to reduce or eliminate the need for collection or use of personal information are applied on an ad hoc or reactive basis.
Steps to reduce or eliminate the need for the collection or use of personal information are taken by individual initiatives. Existing practice is rarely re-examined. It’s generally assumed that if information is being collected, it’s still reasonable to collect it.
When creating or updating a service or process, consideration is given to eliminating or reducing the need for personal information by ensuring that its collection, use and sharing are needed to accomplish the stated outcomes. Existing practice is not used as a justification for continued collection and use.
2. Ensure the use and storage of personal information
Ensure the use and storage of personal information protects against inappropriate access, use and modification, while also ensuring effective and efficient support for its intended use.
Privacy by Design is a design methodology that includes privacy as an essential priority of any product, service, system or process. Privacy is embedded throughout the product or service life cycle from design to disposal.
To implement and embed Privacy by Design, an agency’s privacy officer or team needs to work closely with the agency’s teams that develop and implement technology, whether hardware, software or web, that interacts with personal information.
‘ICT and digital teams’ is the term used here to cover the variety of teams that could be involved. Building privacy into the agency’s systems in this way is why the practice is called ‘privacy engineering’.
Criteria 1: Implementing Privacy by Design
Privacy, ICT, information management and other responsible teams work in silos when building and updating processes, products and services.
Privacy, ICT, information management and other responsible teams have limited engagement when building and updating processes, products and services.
Privacy, ICT, information management and other responsible teams work together to incorporate Privacy by Design methodology and principles when building and updating processes, products and services.
Criteria 2: Implementing privacy engineering
Privacy, ICT and digital teams have no knowledge or understanding of using privacy engineering to address privacy considerations.
Privacy, ICT and digital teams may have some knowledge and understanding of using privacy engineering to address privacy considerations. When building and updating processes, products and services, individual initiatives work with privacy, ICT and digital teams to incorporate privacy engineering and privacy design strategies.
Privacy, ICT and digital teams have sufficient knowledge and understanding of using privacy engineering to address privacy considerations. When building and updating processes, products and services, privacy, ICT and digital teams work together to incorporate privacy engineering.
Criteria 3: Responding to high public interest
Apply suitable and relevant policies and practices for new or novel ways of using personal information that may attract high public interest.
Things to consider
When considering how they use and store personal information, agencies need to keep in mind how privacy connects and intersects with a growing range of significant and sensitive topics, such as facial recognition technology and automatic decision-making, that can affect public trust and confidence about how people’s information is used.
Agencies need to take care in relation to these topics and may require a multidisciplinary or cross-agency approach and consideration of other policies. Privacy officers or teams have a key role to play in identifying, understanding and promoting good awareness of these topics across their organisations.
A growing body of advice, guidance and knowledge can help inform agencies about considerations related to these topics:
- DPUP is about respectful, trusted and transparent use of people’s data and information.
Data Protection and Use Policy (DPUP)
- This algorithm charter demonstrates a commitment to ensuring New Zealanders have confidence in how government agencies use algorithms.
Data.govt.nz — Algorithm charter for Aotearoa New Zealand
- This report has recommendations on how to make sure automated decision-making is used in ways that grow trust, increase equity and give effect to the Treaty.
Digital Council for Aotearoa — Towards trustworthy and trusted automated decision-making in Aotearoa
- This paper sets out the position of the Office of the Privacy Commissioner on how the Privacy Act regulates biometrics.
Office of the Privacy Commissioner — Biometrics and privacy
- Developing a Māori data governance model is a high-priority initiative for the Government Chief Data Steward and Stats NZ. It will provide the New Zealand Government with a unique opportunity to develop an approach to data governance that reflects Māori needs and interests in data.
Data.govt.nz — Co-designing Māori data governance
- Te Mana Raraunga, the Māori Data Sovereignty Network, has principles to consider for trusted use of Māori data, including a Māori Data Audit Tool.
Te Mana Raraunga — Principles of Māori Data Sovereignty (PDF 92KB)
Te Mana Raraunga — Māori Data Audit Tool (PDF 149KB)
When considering or piloting uses of personal information that would attract high public interest, such as biometrics or automated decision-making, policies and practices are reactive and non-specific.
When considering or piloting uses of personal information that would attract high public interest, such as biometrics or automated decision-making, existing policies and practices have been adapted by individual initiatives to consider such forms of use.
When considering or piloting uses of personal information that would attract high public interest, such as biometrics or automated decision-making, specific policies and practices have been developed or identified to address concerns and consideration of such forms of use.
3. Make it easy for people to access
Make it easy for people to access and request correction to their information.
People may not understand what rights they have to see the personal information that has been collected about them, to ask for that information to be corrected or to express a preference as to how they’d like to access their information.
Ensuring that people understand these rights helps build public trust and confidence. Lack of this understanding may deter people from providing their personal information and, as a result, from receiving a service they need.
For people to act on these rights, the process to do so needs to be easy to understand and use. For an agency to respond to these requests, their systems and processes need to be able to support responding within the legislative timeframe.
When considering this element, remember that people requesting access to their information can include customers, clients, employees and anyone else whose personal information your agency holds, uses and manages.
Criteria 1: Having a process
The approach to responding to access requests is ad hoc or reactive, and it’s not easy for clients to find or understand how to do this.
Customers and clients can find a process to make an access request, but it’s not clear if they find it easy to use.
Customers and clients can easily find and understand the process to make an access request.
Criteria 2: Monitoring the process
Access request responses are done on an ad hoc basis with no systematic monitoring.
The agency has an access request process. The requesters and the agency have little visibility of whether access request responses are meeting the legislative requirements.
The agency has a people-centred access request process that aligns with DPUP’s good-practice advice on access to personal information.
The agency monitors access request responses to ensure they meet the legislative requirements, supporting the agency’s reputation as an effective and trusted custodian of people’s personal information.
Criteria 3: Reviewing the process
Actions to improve the process for responding to access requests are ad hoc or reactive.
The agency relies on individual initiatives to enable timely responses to access requests by considering easy access to and collation of personal information.
Information management and ICT system reviews explicitly include consideration of easy access to and collation of personal information to enable timely responses to access requests.
4. Understand and assess privacy risks
Understand and assess privacy risks and manage them commensurately.
An agency’s work to develop, implement and improve its privacy practices is best informed by a suitable understanding of its risk position, which in turn is dependent on a suitable understanding of the types of personal information it holds, why it’s collected, and how it’s used and shared.
This understanding needs to be based on a holistic picture of the agency’s holdings and activities, not only about specific projects and programmes of work.
Completing a data inventory or stocktake can be an important component of an effective privacy risk assessment. A data inventory or stocktake provides an agency with a comprehensive view of the personal information that the agency handles.
As privacy objectives are delivered and/or as the agency’s personal information holdings and activities change, updating and maintaining their privacy risk profile will help them consider what further actions need to be taken to improve privacy practices.
While an agency privacy risk assessment provides a snapshot of its current privacy risks as an organisation, a project privacy risk assessment — frequently known as a Privacy Impact Assessment (PIA) — considers the risks associated with a specific process, product or service.
Criteria 1: Knowing agency risks
Privacy risks are not assessed or are assessed for specific events and incidents.
Privacy risks are assessed based on little understanding and knowledge of personal information holdings and of the collection, use, sharing and storage of personal information.
Privacy risks are assessed based on an understanding and knowledge of personal information holdings (for example, a data inventory or stocktake), focusing on collection, use, sharing and storage.
Criteria 2: Managing agency risks
Agency privacy risk assessments, which provide a snapshot of an agency’s current privacy risks, are not done.
Agency privacy risk assessments, which provide a snapshot of an agency’s current privacy risks, are siloed within the privacy team and are not part of the agency’s overall risk assessment.
Agency privacy risk assessments, which provide a snapshot of an agency’s current privacy risks and how it will manage them as an organisation, are part of the agency’s overall risk assessment, and are conducted and reviewed periodically.
Criteria 3: Managing project risks
Project risk assessments, which are done to assess the privacy risk of new or updated processes, products or services, are done occasionally or not at all. The privacy team has little or no visibility of project privacy risks.
Project risk assessments are done to assess the privacy risk of new or updated processes, products or services. Oversight by the privacy team and associated lines of ownership and accountability are not clear.
Project risk assessments are done to assess the privacy risk of new or updated processes, products or services with the support of and oversight by the privacy team. They cover the whole information life cycle and have clear lines of ownership and accountability.
5. Reduce the impact of privacy breaches
Reduce the impact of privacy breaches and incidents through good privacy practices.
Managing privacy breaches begins with the 4 key steps of contain, assess, notify and prevent.
The effectiveness of these steps can be improved by:
- having clear roles and responsibilities in the incident management plan
- regularly testing the plan
- integrating the plan into business continuity plans.
Conducting tabletop exercises (a simulated privacy breach) to test and validate the plan’s activities helps ensure that the plan will work as intended and familiarises the team with their roles and responsibilities.
The impact of breaches can be reduced by having practices that reduce the collection and retention of personal information.
- Privacy incidents and breaches
- Purpose Matters: Assess purpose and only collect what is needed
- Archives New Zealand — Manage information
Criteria 1: Having a privacy incident register
The agency may have a privacy incident register and/or a privacy incident response plan. Neither is reviewed regularly.
The agency has a privacy incident register and a privacy incident response plan. Learning from privacy incidents and breaches is done by individual initiatives.
The agency has:
- a privacy incident register that is used by staff and/or the privacy team
- a tested privacy incident response plan (including partners and third parties) that is integrated into its business continuity planning
- a process for learning from privacy incidents and breaches.
Criteria 2: Minimising collection of personal information
Consideration of whether personal information needs to be collected is based solely on compliance and risk assessments.
Consideration of whether personal information needs to be collected and whether there are alternative ways to accomplish the desired outcome may be done by individual initiatives. Little or no review of the personal information already being collected is done when updating a process, product or service.
The agency collects only personal information that is clearly linked to the desired outcome and investigates alternative ways to accomplish the desired outcome that eliminate or reduce the need for personal information.
Criteria 3: Retaining personal information
The retention and destruction of personal information are handled on an ad hoc basis.
The agency has information policy and practices that include the retention and destruction of personal information.
The agency has, maintains and promotes information policy and practices that include the retention and destruction of personal information, and the destruction of personal information is authorised by the Chief Archivist.
6. Enable personal information use, reuse and sharing
Enable personal information use, reuse and sharing to support a unified public service that provides the public with effective services.
The Privacy Act details when and how personal information can be shared with others. While this does not apply to non-personal information, it is good privacy practice to be respectful, trusted and transparent when using or sharing non-personal information.
Criteria 1: Having policies for sharing personal information
The Privacy Act and other related legislation have provisions that enable the sharing of personal information to ensure that agencies and people with a legitimate purpose can access information they need.
When building privacy awareness, culture, capability and practices, it's important that a range of teams understand the Privacy Act’s information sharing requirements and other legislation specific to their agency that may mandate how they can collect, use and/or disclose personal information.
To support teams’ understanding, agencies should have information sharing policies that are easy to understand and access. It’s also important that teams know who to contact within their agency for expert advice.
- Sharing personal information
- Office of the Privacy Commissioner — Information sharing
- Office of the Privacy Commissioner — Information privacy principle 11: Disclosure of personal information
- Office of the Privacy Commissioner — Disclosing personal information outside New Zealand
Decisions to reuse or share personal information are made operationally and on an ad hoc or reactive basis.
Individual initiatives decide whether and how to reuse or share personal information, and this is primarily seen as a risk-based decision.
Information management and privacy policies include enabling advice on how to appropriately use and share personal information when individuals can be identified.
These policies also refer to relevant external sources (for example, information to support tamariki wellbeing, information sharing under the Family Violence Act 2018).
Criteria 2: Understanding the use of non-personal information
People and communities often think of information they have supplied, or that is about them, as personal, even when it has been de-identified or anonymised and is being used in a non-personal form.
It can be particularly important to remember that, while the Privacy Act is concerned with the privacy of individuals, we live in a society where broader groups have legitimate privacy interests. The Privacy Act’s controls may fall away once personal information has been fully de-identified, but the remaining information could still be sensitive to, for example, whānau, hapū, iwi, other cultural groups or other societal groups.
The Privacy Act says people do not need to be told when their data or information will be used for ‘research and statistical purposes’ that will not or cannot identify them. However, it’s good privacy practice to be transparent about any purpose or use, even when people cannot be identified.
Things to consider
Agencies need to consider the connection between personal information and de-identified or anonymised information, and its uses.
- When collecting personal information from people or communities, it’s important that agencies are transparent about how it will be used for research and insights, and how it will benefit service users, whānau or communities.
- When developing policies, services or programmes that are using de-identified or anonymised information, it’s important to consider the people behind the data when it’s used or analysed. Policy teams and service designers need to recognise this data can still be considered personal by the people and communities it came from.
- When it’s not practical or possible to share personal information with the group or community it came from, it remains important to share the value of information and insights that were developed using their personal information in some non-identifiable form. This may include data and data sets, analyses, qualitative or quantitative information, statistics, research, reports or studies.
- DPUP and the data analysis, research and evaluation cycle
- DPUP summary for developing policies, services or programmes
- Sharing Value Guideline
- Data.govt.nz — Manage data
Sharing of non-personal information is ad hoc or reactive.
Individual initiatives take steps for the respectful, trusted and transparent use, reuse and sharing of non-personal information that does not identify individuals (for example, data and data sets, analysis, qualitative or quantitative information, statistics, research, reports or studies). Privacy and other relevant policies may contain little or no guidance on this topic.
Privacy and other relevant policies incorporate advice for the respectful, trusted and transparent use, reuse and sharing of non-personal information that does not identify individuals (for example, data and data sets, analysis, qualitative or quantitative information, statistics, research, reports or studies).