ISO/IEC 27557:2022
INTERNATIONAL STANDARD 27557
First edition
2022-11
Reference number
ISO/IEC 27557:2022(E)
© ISO/IEC 2022
All rights reserved. Unless otherwise specified, or required in the context of its implementation, no part of this publication may
be reproduced or utilized otherwise in any form or by any means, electronic or mechanical, including photocopying, or posting on
the internet or an intranet, without prior written permission. Permission can be requested from either ISO at the address below
or ISO's member body in the country of the requester.
ISO copyright office
CP 401 • Ch. de Blandonnet 8
CH-1214 Vernier, Geneva
Phone: +41 22 749 01 11
Email: [email protected]
Website: www.iso.org
Published in Switzerland
Contents Page
Foreword ......... iv
Introduction ......... v
1 Scope ......... 1
2 Normative references ......... 1
5 Framework ......... 2
5.1 General ......... 2
5.3 Integration ......... 3
5.4 Design ......... 3
6.3.1 General ......... 5
6.4.1 General ......... 6
6.5.1 General ......... 10
Annex D (informative) Template showing the severity scale for privacy impacts on individuals ......... 18
Bibliography ......... 19
Foreword
ISO (the International Organization for Standardization) and IEC (the International Electrotechnical
Commission) form the specialized system for worldwide standardization. National bodies that are
members of ISO or IEC participate in the development of International Standards through technical
committees established by the respective organization to deal with particular fields of technical
activity. ISO and IEC technical committees collaborate in fields of mutual interest. Other international
organizations, governmental and non-governmental, in liaison with ISO and IEC, also take part in the
work.
The procedures used to develop this document and those intended for its further maintenance
are described in the ISO/IEC Directives, Part 1. In particular, the different approval criteria
needed for the different types of document should be noted. This document was drafted in
accordance with the editorial rules of the ISO/IEC Directives, Part 2 (see www.iso.org/directives or
www.iec.ch/members_experts/refdocs).
Attention is drawn to the possibility that some of the elements of this document may be the subject
of patent rights. ISO and IEC shall not be held responsible for identifying any or all such patent
rights. Details of any patent rights identified during the development of the document will be in the
Introduction and/or on the ISO list of patent declarations received (see www.iso.org/patents) or the IEC
list of patent declarations received (see https://patents.iec.ch).
Any trade name used in this document is information given for the convenience of users and does not
constitute an endorsement.
For an explanation of the voluntary nature of standards, the meaning of ISO specific terms and
expressions related to conformity assessment, as well as information about ISO's adherence to
the World Trade Organization (WTO) principles in the Technical Barriers to Trade (TBT), see
www.iso.org/iso/foreword.html. In the IEC, see www.iec.ch/understanding-standards.
This document was prepared by Joint Technical Committee JTC 1, Information technology, Subcommittee
SC 27, Information security, cybersecurity and privacy protection.
Any feedback or questions on this document should be directed to the user's national standards
body. A complete listing of these bodies can be found at www.iso.org/members.html and
www.iec.ch/national-committees.
Introduction
There is a growing interest in and need to address the differences between information security
risk management and privacy risk management for organizations processing personally identifiable
information (PII). Information security risk management and related risk assessments have traditionally
focused on risk to an organization, often using the widely accepted formula risk = impact × likelihood.
Organizations can use various methods to assess and rank impacts and likelihood, and then determine
a value (qualitative or quantitative) for organizational risk that can be used to prioritize risk mitigation.
Conversely, privacy assessments have primarily focused on impacts on individuals, such as those
identified through a privacy impact assessment. Although privacy assessments may prioritize the
impacts on an individual's privacy, it is nonetheless necessary to consider how such privacy impacts on
an individual can contribute to overall organizational risk. Doing so can help organizations build trust,
implement technical and organizational measures, improve communication and support compliance
with legal obligations, while avoiding negative impacts to reputation, bottom lines and future prospects
for growth. Privacy events may have consequences for the organization, even in the absence of adverse
impacts on PII principals.
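As a minimal sketch of how the formula above can be used to prioritize risk mitigation (the events, 1-5 scales and scores below are hypothetical assumptions, not examples taken from this document):

```python
# Illustrative sketch of the widely used formula risk = impact x likelihood,
# here computed on simple 1-5 ordinal scales.

def risk_score(impact: int, likelihood: int) -> int:
    """Return an organizational risk value (1-25) for one privacy event."""
    assert 1 <= impact <= 5 and 1 <= likelihood <= 5
    return impact * likelihood

# Hypothetical privacy events with assessed (impact, likelihood) values.
events = {
    "unauthorized disclosure of customer PII": (5, 2),
    "excessive retention of employee records": (3, 4),
    "misleading privacy notice": (2, 3),
}

# Rank events so mitigation effort is prioritized toward the highest risk.
ranked = sorted(events, key=lambda e: risk_score(*events[e]), reverse=True)
```

A purely qualitative variant (e.g. low/medium/high levels instead of numeric scores) would follow the same pattern.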
This document offers a framework for assessing organizational privacy risk, with consideration of the
privacy impact on individuals as a component of overall organizational risk. It extends the guidelines
of ISO 31000:2018 to include specific considerations for organizational privacy risk and supports the
requirement for risk management as required by privacy information management systems (such as
ISO/IEC 27701).
This document is intended to be used in connection with ISO 31000:2018. Whenever this document
extends the guidance given in ISO 31000:2018, an appropriate reference to the clauses of ISO 31000:2018
is made, followed by privacy-specific guidance. The clause structure of ISO 31000:2018 is mirrored in
this document and amended by subclauses if needed.
1 Scope
This document provides guidelines for organizational privacy risk management, extended from
ISO 31000:2018.
This document provides guidance to organizations for integrating risks related to the processing
of personally identifiable information (PII) as part of an organizational privacy risk management
programme. It distinguishes between the impact that processing PII can have on an individual with
consequences for organizations (e.g. reputational damage). It also provides guidance for incorporating
— organizational consequences of privacy events that damage the organization (e.g. by harming its reputation);
This document assists in the implementation of a risk-based privacy program which can be integrated
This document is applicable to all types and sizes of organizations processing PII or developing products
and services that can be used to process PII, including public and private companies, government
2 Normative references
The following documents are referred to in the text in such a way that some or all of their content
constitutes requirements of this document. For dated references, only the edition cited applies. For
undated references, the latest edition of the referenced document (including any amendments) applies.
ISO and IEC maintain terminology databases for use in standardization at the following addresses:
3.1
privacy information management system
PIMS
information security management system which addresses the protection of privacy as potentially
affected by the processing of personally identifiable information (PII)
[SOURCE: ISO/IEC 27701:2019, 3.2, modified — the abbreviated term "PII" is expanded as "personally
identifiable information".]
3.2
privacy event
occurrence or change of a particular set of circumstances related to personally identifiable information
(PII) processing that can cause a privacy impact (3.3) or consequence
3.3
privacy impact
element that has an effect on the privacy of a personally identifiable information (PII) principal and/or
group of PII principals
Note 1 to entry: The privacy impact could result from the processing of PII in conformance or in violation of
privacy safeguarding requirements.
[SOURCE: ISO/IEC 29134:2017, 3.6, modified — "anything" replaced by "element".]
3.4
consequence
outcome of an event affecting organizational objectives
[SOURCE: ISO 31000:2018, 3.6, modified — "organizational" added and notes to entry removed.]
The guidance in ISO 31000:2018, Clause 4 and the following additional guidance applies.
For organizational privacy risk management, PII principals should be included as stakeholders, and the
actual or potential adverse impact on them should be included when considering risks. Additionally,
the organization should consider the potential negative effect on the opinions and attitudes of these
stakeholders towards the organization should these adverse impacts occur.
Organizations should identify the norms, societal values and legal expectations related to individuals'
privacy given their cultural context(s). Privacy is a broad and shifting concept that can be filtered
through cultural diversity and individual differences. These cultural factors can inform the
identification, evaluation and treatment of privacy risks.
5 Framework
5.1 General
The guidance in ISO 31000:2018, 5.2 and the following additional guidance applies.
Top management should be aware of privacy issues in order to successfully incorporate privacy
considerations into an overall organizational risk management process. This should include awareness
of such topics as:
— privacy regulations and laws applicable to the organization;
— unique concerns, risks, vulnerabilities, impacts and organizational consequences related to privacy
and processing PII.
Where an organization implements a privacy information management system (PIMS) as specified in
ISO/IEC 27701, the organization should be aware of and committed to integrating the organizational
5.3 Integration
Top management and oversight bodies should ensure that organizational privacy risk management
is integrated into the organization's structure, including people, processes and technology. The
integration depends on the operating processes of the organization. Where an organization implements
a PIMS, the organizational privacy risk management process should be integrated into the relevant
aspects of the PIMS.
5.4 Design
The guidance in ISO 31000:2018, 5.4.1 and the following additional guidance applies.
When an organization processes PII, or is developing products or services that process PII, the
organization should assess its role related to processing PII (e.g. controller, processor, joint controller,
manufacturer, software developer, provider of products that process PII).
Understanding the organization's role relative to processing PII is critical for the effective design of
a risk management framework, including accurately identifying and treating privacy risks to the
organization.
Where an organization implements a PIMS, this context should align with the context of the management
system.
The guidance in ISO 31000:2018, 5.4.3 and the following additional guidance applies.
— identify individuals who have the accountability and authority to manage risks related to PII
processing;
— identify individuals who have the accountability and authority to manage risks related to privacy
events that have direct consequences for the organization, even when there are no impacts on PII
principals.
The guidance in ISO 31000:2018, 5.4.4 and the following additional guidance applies.
When allocating resources for organizational privacy risk management, top management and oversight
5.5 Implementation
5.6 Evaluation
5.7 Improvement
5.7.1 Adapting
6.1 General
The guidance in ISO 31000:2018, 6.2 and the following additional guidance applies.
In the context of organizational privacy risk management processes, the following are examples of
— product and system designers and developers, for goods and services that handle PII;
— supervisory authorities;
Some jurisdictions mandate particular types of consultations for some instances of PII processing, such
as consultation of supervisory authorities. In such cases, the organization should identify its obligations
for consultations and demonstrate that it complies with them in a timely manner.
6.3.1 General
The guidance in ISO 31000:2018, 6.3.3 and the following additional guidance applies.
Organizational factors can be a source of risk and can have consequences for the organization without
adversely affecting individuals (e.g. a public statement about privacy from top management that may
affect perceptions of the organization).
6.3.4 Defining risk criteria
The guidance in ISO 31000:2018, 6.3.4 and the following additional guidance applies.
The organization should define the risk criteria that guide the outcomes of the risk assessment results.
This may include what types of measures are used (qualitative vs. quantitative), the formula or methods
used to determine the risk, and the management actions for levels of risk.
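The three elements named above (measure, method, management action per risk level) can be sketched as follows; the level names, combining rule and actions are hypothetical assumptions, not criteria defined by this document:

```python
# Illustrative sketch of qualitative risk criteria: an ordered measure,
# a simple combining method, and a management action for each risk level.

LEVELS = ["low", "medium", "high"]

def combine(impact: str, likelihood: str) -> str:
    """A simple qualitative 'formula': the risk level is the higher factor."""
    return max(impact, likelihood, key=LEVELS.index)

# A management action defined in advance for each level of risk.
ACTIONS = {
    "low": "accept and monitor",
    "medium": "treat within the next planning cycle",
    "high": "escalate to top management for immediate treatment",
}

level = combine("medium", "high")   # -> "high"
action = ACTIONS[level]
```

An organization using a quantitative measure would instead define numeric thresholds that map scores onto these management actions.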
In relation to organizational privacy risk, these criteria should include how privacy impact on
individuals will be defined and measured, as well as how the privacy impact on individuals factors into
the organization's overall risk calculation. Furthermore, risk-based assessments of factors influencing
the organization directly, due to adverse privacy events that do not have impacts on PII principals,
should also be considered (Annex B provides some examples of privacy events in Table B.1 and causes
of privacy events in Table B.2). Potential criteria to be defined for organizational privacy risk should
include:
6.4.1 General
6.4.2.1 General
The guidance in ISO 31000:2018, 6.4.2 and the following additional guidance applies.
There are a number of approaches for identifying risks in an organization. Two common ones are
the event-based approach and the asset-based approach. Organizations can utilize elements of both
approaches to fit their operational context. 6.4.2 outlines the elements necessary for risk identification
related to the processing of PII, as well as details on the event-based and asset-based approaches.
6.4.2.2 Identification of PII processing
The organization should identify PII processing that falls within the scope of the risk management
process as defined in 6.3.2, including potential PII processing by products and services. Understanding
PII processing and its relative criticality and value is integral to identifying risks and assessing the
privacy impact on individuals and the consequences for the organization.
Organizational privacy risk management has the following considerations when identifying PII
processing:
a) PII processing activities (or types of PII processing activities) and PII processed (or types of PII);
b) the assets on which they rely;
c) categories of individuals whose PII is being processed (e.g. customers, employees);
d) purposes of the PII processing;
e) individuals or personnel processing PII;
f) role of internal and external entities engaged in the PII processing.
For evaluation of PII processing and the relation to privacy impact and consequences, see 6.4.2.7 and
6.4.3.3. For examples of considerations related to the identification of PII processing, see Annex A.
6.4.2.3 Event-based approach
6.4.2.3.1 General
An event-based approach to identifying risks is a high-level examination of risk sources and the potential
scenarios that can play out based on those risk sources. Scenarios should be built by identifying the
different paths, within the system, that risk sources can use to reach the PII processing, the PII and
their target objectives.
An asset-based approach to risk identification involves looking closely at organizational assets and
identifying threats and vulnerabilities that can affect those assets.
Examining assets, along with potential threats and vulnerabilities, can help the organization with a
more detailed and targeted risk identification process.
Scenarios can be built by identifying the different paths within the systems and data flows (on which
the PII and PII processing activities rely) that risk sources can use to reach the PII processing, the PII
and their target objectives.
6.4.2.5 Identification of existing controls
The organization should identify controls relevant to processing PII. Controls may be previously
documented (in internal systems, procedures, audit reports, etc.) or identified during the risk
management activities. Those organizations implementing a PIMS can also refer to the controls selected
as part of the required statement of applicability.
6.4.2.6 Identification of biases, assumptions and beliefs
The organization should identify and evaluate the biases, assumptions and beliefs of those involved
with the PII processing or design of products and services that process PII, as well as how these can
affect the data analytic inputs and outputs used in PII processing.
6.4.2.7 Identification of privacy impacts and consequences
As part of organizational privacy risk management, the organization should identify privacy impacts
on individuals whose PII is being processed, as well as consequences to the organization itself. Impacts
or consequences are the results of a privacy event.
Consequences for the organization necessarily differ from those of privacy impacts to individuals.
Consequences to organizations can be, for example:
— investigation and repair time;
— customer abandonment of products or services;
— harm to internal organizational culture;
— (work) time lost;
— opportunity lost;
— financial cost of specific skills to repair the damage;
— impairment of image, reputation and goodwill;
— regulatory fines;
— loss of physical assets.
Privacy impacts on individuals are an externality for the organization. Therefore, it can be difficult for
the organization to estimate exactly the impact on each individual. Rather than specifying each type
of impact, this can be considered generally as an impact on an individual's privacy, with the degree of
impact being the critical element (see 6.4.3.3 and Annex D). Organizations that perform privacy impact
assessments as part of a PIMS or to comply with regulatory or legal requirements can utilize these to
help identify the privacy impacts.
Annex C provides examples of privacy impacts in Table C.1 and consequences in Table C.2.
6.4.3.1 General
The guidance in ISO 31000:2018, 6.4.3 and the following additional guidance applies.
6.4.3.2 Risk analysis approaches
Risk analysis approaches typically include a risk analysis process, a risk model and an analytic
approach.
A risk analysis process describes the steps for carrying out a risk analysis (e.g. prepare for the analysis,
conduct the analysis, communicate analysis and results, and maintain the analysis).
A risk model defines the risk factors (e.g. threats, impacts, consequences, vulnerabilities, controls)
to be assessed and the relationships among those factors. If not using a pre-defined risk model, the
organization defines which risk factors will be assessed and the relationships among them.
An analytic approach includes the analysis type (i.e. quantitative, qualitative, semi-quantitative) and
analysis approach (i.e. threat-oriented, asset/impact-oriented, vulnerability-oriented), which help the
organization determine how, with what level of detail, and in which form risk factors are to be analysed.
When assessing the privacy impacts and consequences identified in the risk assessment, the
organization should distinguish between an assessment of consequences to the organization and a
privacy impact assessment for individuals.
Business analyses should determine the degree to which the organization can be affected by an adverse
effect of PII processing or privacy events, and consider elements including but not limited to:
— value (qualitative or quantitative) or criticality of the product or service;
— threats, vulnerabilities and privacy events applicable to PII processing activities;
— tangible consequences, such as immediate and medium-term costs (e.g. costs for the immediate
workaround and for the final solution, fines);
— intangible consequences (e.g. to the brand) that can have long-term costs, such as loss of customers;
— criteria used to establish the overall consequence (as determined in 6.3.4);
— existing protections and mitigating controls around PII (e.g. de-identification).
Privacy consequences reflect the extent to which there can be a direct adverse effect, or cost, to the
organization as a result of a privacy event.
Privacy impact reflects the degree to which PII principals can be affected by privacy events. Privacy
impact analyses may be performed as part of the privacy risk assessment, or they can be standalone
processes (e.g. as part of a PIMS or as required by regulation or law). The organization should consider
elements such as:
— types of PII processed;
— amount of PII processed;
— use of the PII that is processed (purpose for the processing);
— protections and mitigating controls around PII (e.g. de-identification);
— jurisdictional and cultural environment of the PII principal (which can affect how the relative impact
is determined).
After considering the potential impacts on PII principals and consequences for the organization, the
criteria for performing the risk assessment (6.3.4) can be used to estimate the level of impact and
consequence of each privacy event, in order to rank them.
An example of a scale for privacy impacts is provided in Annex D and a template for impact level in
Table D.1.
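A severity scale of the kind templated in Annex D can be sketched as a simple lookup; the levels, labels and descriptions below are illustrative assumptions only, not the actual Annex D content:

```python
# A hypothetical severity scale for privacy impacts on individuals,
# in the spirit of the Annex D template: numeric level -> (label, description).

SEVERITY_SCALE = {
    1: ("negligible", "individuals face minor inconvenience, easily overcome"),
    2: ("limited", "individuals face significant inconvenience they can overcome"),
    3: ("significant", "individuals face serious consequences, overcome with difficulty"),
    4: ("maximum", "individuals face significant or irreversible consequences"),
}

def impact_label(level: int) -> str:
    """Return the qualitative label for a numeric privacy impact level."""
    try:
        return SEVERITY_SCALE[level][0]
    except KeyError:
        raise ValueError(f"level {level} is not on the scale") from None
```

Each privacy event identified in 6.4.2.7 can then be assigned a level on this scale and ranked alongside the consequence estimates for the organization.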
6.4.3.4 Assessment of likelihood
The organization should assess the likelihood of privacy events. Likelihood can be determined on a
qualitative or quantitative scale and should align with the criteria established as part of 6.3.4. Likelihood
can be affected by (but is not limited to):
— organizational factors (e.g. geographic location, the public perception about participating
organizations with respect to privacy);
— system factors [e.g. the nature and history of individuals' interactions with the system; visibility
of data processing to individuals and third parties; types, severity and number of vulnerabilities;
frequency, severity and pervasiveness of threats; success (mitigation) or failure of controls (6.4.2.4)];
or
The guidance in ISO 31000:2018, 6.4.4 and the following additional guidance applies.
When evaluating the results of the privacy risk analysis against the established risk criteria, the
organization should consider the following when determining the response to risk:
— the degree of acceptable risk, which can vary depending on the context, including the applicable
legislation;
— whether findings in the risk analysis affect the obligations to PII principals, including those
mandated by applicable legislation.
6.5.1 General
6.5.2.1 General
The guidance in ISO 31000:2018, 6.5.2 and the following additional guidance applies.
When selecting risk treatment options, the organization should consider options not only in the context
of its own organization but also in relation to the privacy of individuals. For example, if a risk is
identified based on a high impact on the individual and consequence to the organization, that risk may
be treated by enhanced privacy-by-design controls (e.g. encryption, pseudonymization), thus enhancing
privacy protections for individuals as well as protecting the organization from reputational damage or
regulatory fines.
If implementing a privacy information management system (PIMS), such as in ISO/IEC 27701, the
organization should use the risk treatment process to help guide the selection of controls to be included
in the statement of applicability.
6.5.2.2 Risk modification
Controls can be added, removed or modified to reduce the likelihood of risk to what is acceptable to the
organization. Controls applied to privacy risk can be informed by local laws and regulations, as well as
best practices. ISO/IEC 27701 provides controls that may be selected by an organization to implement a
PIMS based on a risk approach.
The organization can choose to avoid an identified risk by modifying or cancelling activities (or planned
activities) that create the risk. In the context of processing PII, this can include, for example, choosing
Risk can be shared between the organization and an external party, which can include a business
partner, subcontractor, or even a customer. In the context of PII processors, for example, the processor
may offer a service for which the customer has control over certain aspects of risk (e.g. whether to
enable encryption or require two-factor authentication). When sharing risk, the organization should
consider new or additional risks created by the external relationship.
For example, contracts are a means of sharing or transferring risk to other organizations (such as
through an insurance policy).
Privacy risk treatment plans and the responsibilities for their implementation should be reviewed and
approved by management.
T he organ i z ation s hou ld mon itor for cha nge s th at may a ffe c t privac y ri sk. T he s e ch ange s may
ne ce s s itate the review o f organ i z ationa l privac y ri s k management pro ce s s e s and ri s k tre atment pla n s
for change s or i mprovements . T he organ i z ation c an mon itor, for exa mple:
— the orga n i z ation’s bu s i ne s s envi ron ment (e . g. the i ntro duc tion o f new te ch nolo gie s) ;
— change s to ri sk tolerance;
— new privac y events , th re ats , a nd vu l nerabi l itie s d i s clo s e d to the orga n i z ation from i nterna l or
e xterna l s ou rce s;
T he gu ida nce i n I S O 3 10 0 0 : 2 01 8 , 6 .7 and the fol lowi ng add itiona l gu idance appl ie s .
Both organizations and individuals may need to know how PII is processed in order to manage privacy risk effectively. Through internal and external reporting on PII processing, associated privacy risks, and risk treatments, the organization can increase transparency and establish a more reliable understanding of its privacy risks.
Organizational privacy risks should be reported to top management, including a summary risk report.
Annex A
(informative)
PII processing identification
Identifying the specific PII and processing enables an organization to more accurately assess the
privacy impact on individuals and consequences to the organization itself should privacy events or
threats materialize. Several key elements can be inventoried to illustrate or map out the organization’s
PII processing and its context to support organizational privacy risk management:
— PII processing activities (e.g. collection, storage, alteration, retrieval, consultation, disclosure,
anonymization, pseudonymization, dissemination or otherwise making available, deletion, or
destruction);
— types of PII (see Table A.1);
— purposes of the PII processing (see Table A.2);
— PII processing environment (e.g. geographic location, internal cloud, third parties);
— assets on which PII processing activities rely (e.g. systems/products/services that process PII);
— categories of individuals whose PII is being processed (e.g. customers, employees, prospective
employees, or consumers);
— individuals or personnel processing PII (e.g. the organization or third parties such as service
providers, partners, customers, and developers);
— role of entities engaged in PII processing (e.g. internal or external).
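The inventory elements listed above can be captured as structured records. The following is a minimal sketch; the field names and example values are illustrative assumptions, as this document does not prescribe a schema:

```python
from dataclasses import dataclass, field

@dataclass
class PIIProcessingRecord:
    """One entry in an organization's PII processing inventory.

    Field names mirror the inventory elements above; they are illustrative,
    not prescribed by the standard.
    """
    activities: list        # e.g. ["collection", "storage", "disclosure"]
    pii_types: list         # types of PII (see Table A.1)
    purposes: list          # purposes of the processing (see Table A.2)
    environment: str        # e.g. "internal cloud", "third party"
    assets: list = field(default_factory=list)                    # systems/products/services
    data_subject_categories: list = field(default_factory=list)   # e.g. ["customers"]
    processors: list = field(default_factory=list)                # who processes the PII
    role: str = "internal"  # role of the entity: internal or external

# Hypothetical entry for an order-fulfilment service
record = PIIProcessingRecord(
    activities=["collection", "storage", "disclosure"],
    pii_types=["contact data", "economic and financial information"],
    purposes=["provide a service or product"],
    environment="internal cloud",
    assets=["order management system"],
    data_subject_categories=["customers"],
    processors=["the organization", "payment service provider"],
)
print(record.purposes[0])
```

Maintaining the inventory in a machine-readable form like this can make it easier to keep the mapping of processing activities current as systems change.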
Distinguishing types and uses of PII can help organizations further refine privacy impacts and consequences. Different data types and uses can have higher or lower relative values or sensitivity depending on the context, and organizations can choose to assign sensitivity “ratings” depending on this context. Organizations can utilize published taxonomies to assist in identifying types and uses of PII (for example, a taxonomy for cloud computing and distributed platforms is found in ISO/IEC 19944-1) or use existing classification systems internal to the organization. Table A.1 provides examples of PII data types, while Table A.2 provides example uses of PII data.
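One way to operationalize context-dependent sensitivity ratings is a simple lookup over the inventoried PII types. The scale (1 = low, 3 = high) and the specific assignments below are illustrative assumptions, not values prescribed by this document:

```python
# Hypothetical sensitivity ratings for PII data types such as those in Table A.1.
# The ordinal scale and the assignments are assumptions an organization would
# tailor to its own context (cultural norms, regulatory environment, etc.).
SENSITIVITY = {
    "contact data": 1,
    "professional life": 1,
    "civil status and identification data": 2,
    "economic and financial information": 2,
    "personal life": 3,
}

def max_sensitivity(pii_types):
    """Return the highest rating among the PII types in a processing activity.

    Unknown types default to the lowest rating here; a cautious organization
    might instead default to the highest.
    """
    return max(SENSITIVITY.get(t, 1) for t in pii_types)

print(max_sensitivity(["contact data", "economic and financial information"]))  # → 2
```

Taking the maximum over the types involved reflects the common convention that a processing activity is as sensitive as the most sensitive data it touches.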
Table A.1 — Examples of PII data types
Civil status, identity, identification data (social security or other identification number)
Contact data (phone number, address, e-mail, etc.)
Personal life (living habits, marital status, philosophical, political, religious and trade-union views, sex life, health data, racial or ethnic origin, offences/convictions, etc. – excluding sensitive or dangerous data)
Professional life (résumé, education and professional training, awards, etc.)
Economic and financial information (bank data, income, financial situation, tax situation, etc.)
Table A.2 — Examples of uses of PII data
Provide a service or product (e.g. fulfilling an order, processing a bank deposit, business
process execution)
Improve a service or product (e.g. to increase the quality)
Personalize a service or product (e.g. to tailor to a particular user)
Offer upgrades/upsell a service or product (e.g. for additional products or services, or additional capabilities of existing products or services)
Market/advertise/promote a service or product
Share (e.g. share or transfer PII to one or more third parties)
Design or develop a service or product
Develop data models or train AI systems
Prevent fraud
Annex B
(informative)
Example privacy events and causes
Table B.1 provides a non-exhaustive list of privacy events that may create impacts or consequences for an organization, regardless of whether these items have an impact on the privacy of PII principals in a given situation.
Appropriation  PII is used in ways that exceed an individual's expectation or authorization (e.g. implicit or explicit). Appropriation includes scenarios in which the individual would have expected additional value for the use given more complete information or negotiating power. Privacy impacts due to appropriation can lead to loss of trust, loss of autonomy, and economic loss.
Distortion  Inaccurate or misleadingly incomplete PII is used or disseminated. Distortion can present individuals in an inaccurate, unflattering, or disparaging manner, opening the door for stigmatization, discrimination, or loss of liberty.
Induced disclosure  Induced disclosure can occur when individuals feel compelled to provide information disproportionate to the purpose or outcome of the transaction. Induced disclosure can include leveraging access or rights to an essential (or perceived essential) service. It can lead to impacts such as discrimination, loss of trust, or loss of autonomy.
Insecurity  Lapses in PII security can result in various privacy impacts, including loss of trust, exposure to economic loss and other identity theft-related harms, and dignity losses.
Re-identification  De-identified PII, or PII otherwise disassociated from specific individuals, becomes identifiable or associated with specific individuals again. It can lead to impacts such as discrimination, loss of trust, or dignity losses.
Stigmatization  PII is linked to an actual identity in such a way as to create a stigma that can cause dignity losses or discrimination. For example, transactional or behavioural PII such as the accessing of certain services (e.g. food stamps or unemployment benefits) or locations (e.g. health care providers) may create inferences about individuals that can cause dignity losses or discrimination.
Surveillance  PII, devices, or individuals are tracked or monitored in a manner disproportionate to the purpose. The difference between benign PII processing and the privacy event of surveillance can be narrow. Tracking or monitoring may be conducted for operational purposes such as cybersecurity or to provide better services, but it can become surveillance when it leads to privacy impacts such as discrimination; loss of trust, autonomy, or liberty; or physical harm.
Unanticipated revelation  PII reveals or exposes an individual or facets of an individual in unexpected ways. Unanticipated revelation can arise from aggregation and analysis of large or diverse data sets. Unanticipated revelation can give rise to dignity losses, discrimination, and loss of trust and autonomy.
Unwarranted restriction  Unwarranted restriction includes not only blocking access to PII or services, but also limiting awareness of the existence of PII or its uses in ways that are disproportionate to operational purposes. Operational purposes may include fraud detection or other compliance processes. When individuals do not know what PII an entity has or can make use of, they do not have the opportunity to participate in decision-making. Unwarranted restriction also diminishes accountability as to whether the data are appropriate for the entity to possess, or whether it will be used in a fair or equitable manner. Lack of access to PII or services can lead to privacy impacts in the loss of self-determination category, loss of trust, and economic harm.
Table B.2 provides a non-exhaustive list of potential causes of privacy events that may create impacts or consequences for an organization, regardless of whether these items have an impact on the privacy of PII principals in a given situation. The indicated potential causes have been derived from privacy principles listed in ISO/IEC 29100 and privacy threats listed in ISO/IEC TR 27550.
Potential causes of privacy events  Description
Lack of lawfulness, fairness and transparency  PII is not processed lawfully or fairly, or is processed in a hidden manner in relation to the data subject.
Lack of purpose limitations  PII is not collected for specified, explicit or legitimate purposes, or is further processed in a manner that is incompatible with those purposes.
Lack of data minimization  The collection of PII is not adequate, relevant or limited to what is necessary in relation to the purposes for which it is processed.
Lack of accuracy PII is not accurate or, where necessary, kept up to date.
Lack of storage limitations  PII is kept in a form which permits identification of data subjects for longer than is necessary for the purposes for which the PII is processed.
Identification  The link between an identity and a piece of information is not sufficiently hidden, so that identities can be revealed when some effort is invested. A similar scenario is related to linkability: several items of anonymized or pseudonymized data can be linked together to reveal the identity of a person. While anonymous data does not match any particular person within a set of people, this set can be narrowed down to a single person when several data sets are combined.
Non-repudiation Data can be used to prove that a particular person has committed a particular
action.
Detection of existence  Uninvolved parties can determine if a particular item of interest (such as a data record, e.g. a criminal record, an action, an event) exists or not. The knowledge of the existence of such an item can often damage the PII principal and be used to infer more information about a person.
Unawareness  An internal person owns or discloses PII of PII principals, without being aware of the nature of the information, or of the legal implications.
Annex C
(informative)
Table C.1 provides non-exhaustive examples of privacy impacts on individuals that can arise from privacy events.
Impact Description
Loss of self-determination  Loss of autonomy: Includes losing control over determinations about information processing or interactions with systems/products/services, as well as needless civic engagement.
Table C.2 provides non-exhaustive examples of privacy consequences to the organization that can arise from privacy events.
Impact Description
Direct business costs  Revenue or performance loss from customer abandonment or avoidance.
Harm to internal organizational culture  Impact on capability of organization/unit to achieve vision/mission, impact on productivity/employee morale stemming from conflicts with internal cultural values or ethics.
Annex D
(informative)
Organizations can use the template shown in Table D.1 as a guide for estimating the severity of privacy impacts on individuals. Severity of privacy impacts can vary greatly depending on the context, including cultural values and norms. The specific context surrounding the processing of PII can be helpful in this estimation; organizations can use the following considerations, and others as desired, to complete the context column and estimate the severity.
Some contextual considerations for estimating the severity of privacy impacts to individuals include (see also 6.4.3.3):
result in a greater impact);
— number of PII principals;
— quantity of PII being processed.
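The considerations above can be combined into a rough ordinal estimate. The following sketch assumes a 1–4 scale and arbitrary thresholds for population and data volume; the document deliberately leaves the scale, thresholds, and weighting of context to the organization:

```python
# Illustrative severity estimate combining the contextual considerations above.
# The ordinal scale (1 = negligible .. 4 = severe) and the numeric thresholds
# are assumptions for demonstration only, not values from the standard.
def estimate_severity(impact_level, num_principals, pii_quantity):
    """Estimate severity of a privacy impact on individuals.

    impact_level:   1..4 base rating of the impact in context
    num_principals: number of PII principals affected
    pii_quantity:   number of PII records being processed
    """
    severity = impact_level
    if num_principals > 10_000:    # large affected populations can raise severity
        severity += 1
    if pii_quantity > 1_000_000:   # high volumes of PII can raise severity
        severity += 1
    return min(severity, 4)        # cap at the top of the assumed scale

print(estimate_severity(impact_level=2, num_principals=50_000, pii_quantity=10_000))  # → 3
```

A real implementation would draw its thresholds and scale from the completed context column of Table D.1 rather than hard-coded constants.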
Bibliography
[1] ISO/IEC 27701, Security techniques — Extension to ISO/IEC 27001 and ISO/IEC 27002 for privacy information management — Requirements and guidelines
ICS 35.030
Price based on 19 pages