
INTERNATIONAL STANDARD ISO/IEC 27557

First edition
2022-11

Information security, cybersecurity and privacy protection — Application of ISO 31000:2018 for organizational privacy risk management

Sécurité de l'information, cybersécurité et protection de la vie privée — Application de l'ISO 31000:2018 au management des risques organisationnels liés à la vie privée

Reference number
ISO/IEC 27557:2022(E)

© ISO/IEC 2022

COPYRIGHT PROTECTED DOCUMENT

© ISO/IEC 2022

All rights reserved. Unless otherwise specified, or required in the context of its implementation, no part of this publication may be reproduced or utilized otherwise in any form or by any means, electronic or mechanical, including photocopying, or posting on the internet or an intranet, without prior written permission. Permission can be requested from either ISO at the address below or ISO's member body in the country of the requester.

ISO copyright office
CP 401 • Ch. de Blandonnet 8
CH-1214 Vernier, Geneva
Phone: +41 22 749 01 11
Email: [email protected]
Website: www.iso.org

Published in Switzerland


Contents

Foreword
Introduction
1 Scope
2 Normative references
3 Terms and definitions
4 Principles of organizational privacy risk management
5 Framework
5.1 General
5.2 Leadership and commitment
5.3 Integration
5.4 Design
5.4.1 Understanding the organization and its context
5.4.2 Articulating risk management commitment
5.4.3 Assigning organizational roles, authorities, responsibilities and accountabilities
5.4.4 Allocating resources
5.4.5 Establishing communication and consultation
5.5 Implementation
5.6 Evaluation
5.7 Improvement
5.7.1 Adapting
5.7.2 Continually improving
6 Risk management process
6.1 General
6.2 Communication and consultation
6.3 Scope, context and criteria
6.3.1 General
6.3.2 Defining the scope
6.3.3 External and internal context
6.3.4 Defining risk criteria
6.4 Risk assessment
6.4.1 General
6.4.2 Risk identification
6.4.3 Risk analysis
6.4.4 Risk evaluation
6.5 Risk treatment
6.5.1 General
6.5.2 Selection of risk treatment options
6.5.3 Preparing and implementing risk treatment plans
6.6 Monitoring and review
6.7 Recording and reporting
Annex A (informative) PII processing identification
Annex B (informative) Example privacy events and causes
Annex C (informative) Privacy impact and consequence examples
Annex D (informative) Template showing the severity scale for privacy impacts on individuals
Bibliography


Foreword

ISO (the International Organization for Standardization) and IEC (the International Electrotechnical Commission) form the specialized system for worldwide standardization. National bodies that are members of ISO or IEC participate in the development of International Standards through technical committees established by the respective organization to deal with particular fields of technical activity. ISO and IEC technical committees collaborate in fields of mutual interest. Other international organizations, governmental and non-governmental, in liaison with ISO and IEC, also take part in the work.

The procedures used to develop this document and those intended for its further maintenance are described in the ISO/IEC Directives, Part 1. In particular, the different approval criteria needed for the different types of document should be noted. This document was drafted in accordance with the editorial rules of the ISO/IEC Directives, Part 2 (see www.iso.org/directives or www.iec.ch/members_experts/refdocs).

Attention is drawn to the possibility that some of the elements of this document may be the subject of patent rights. ISO and IEC shall not be held responsible for identifying any or all such patent rights. Details of any patent rights identified during the development of the document will be in the Introduction and/or on the ISO list of patent declarations received (see www.iso.org/patents) or the IEC list of patent declarations received (see https://patents.iec.ch).

Any trade name used in this document is information given for the convenience of users and does not constitute an endorsement.

For an explanation of the voluntary nature of standards, the meaning of ISO specific terms and expressions related to conformity assessment, as well as information about ISO's adherence to the World Trade Organization (WTO) principles in the Technical Barriers to Trade (TBT), see www.iso.org/iso/foreword.html. In the IEC, see www.iec.ch/understanding-standards.

This document was prepared by Joint Technical Committee JTC 1, Information technology, Subcommittee SC 27, Information security, cybersecurity and privacy protection.

Any feedback or questions on this document should be directed to the user's national standards body. A complete listing of these bodies can be found at www.iso.org/members.html and www.iec.ch/national-committees.


Introduction

There is a growing interest in and need to address the differences between information security risk management and privacy risk management for organizations processing personally identifiable information (PII). Information security risk management and related risk assessments have traditionally focused on risk to an organization, often using the widely accepted formula of risk = impact × likelihood. Organizations can use various methods to assess and rank impacts and likelihood, and then determine a value (qualitative or quantitative) for organizational risk that can be used to prioritize risk mitigation. Conversely, privacy assessments have primarily been focused on impacts on individuals, such as those identified through a privacy impact assessment. Although privacy assessments may prioritize the impacts on an individual's privacy, it is nonetheless necessary to consider how such privacy impacts on an individual can contribute to overall organizational risk. Doing so can help organizations build trust, implement technical and organizational measures, improve communication and support compliance with legal obligations, while avoiding negative impacts to reputation, bottom lines, and future prospects for growth. Privacy events may have consequences for the organization, even in the absence of adverse impacts on PII principals.
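The impact × likelihood approach described above can be sketched as a small qualitative scoring helper. This is an illustrative sketch only, not part of the standard; the 1-5 rating scales, the level thresholds, and the example risks are all assumptions chosen for demonstration.

```python
# Illustrative sketch only (not part of ISO/IEC 27557): a minimal qualitative
# scoring helper using the risk = impact x likelihood formula. The 1-5 rating
# scales, level thresholds and example risks are assumptions for illustration.

def risk_level(impact: int, likelihood: int) -> str:
    """Combine 1-5 impact and likelihood ratings into a coarse risk level."""
    if not (1 <= impact <= 5 and 1 <= likelihood <= 5):
        raise ValueError("impact and likelihood must be rated 1-5")
    score = impact * likelihood  # risk = impact x likelihood
    if score >= 15:
        return "high"
    if score >= 8:
        return "medium"
    return "low"

# Rank identified risks so that mitigation can be prioritized.
risks = {
    "unauthorized PII disclosure": (5, 2),  # severe impact, unlikely
    "excessive data retention": (3, 4),     # moderate impact, likely
    "broken opt-out link": (2, 2),
}
ranked = sorted(risks.items(), key=lambda kv: kv[1][0] * kv[1][1], reverse=True)
for name, (imp, lik) in ranked:
    print(f"{name}: {risk_level(imp, lik)} (score {imp * lik})")
```

A quantitative method would replace the ordinal scales with monetary loss estimates and probabilities, but the prioritization step works the same way.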
This document offers a framework for assessing organizational privacy risk, with consideration of the privacy impact on individuals as a component of overall organizational risk. It extends the guidelines of ISO 31000:2018 to include specific considerations for organizational privacy risk and supports the requirement for risk management as required by privacy information management systems (such as ISO/IEC 27701).

This document is intended to be used in connection with ISO 31000:2018. Whenever this document extends the guidance given in ISO 31000:2018, an appropriate reference to the clauses of ISO 31000:2018 is made, followed by privacy-specific guidance. The clause structure of ISO 31000:2018 is mirrored in this document and amended by sub-clauses if needed.



INTERNATIONAL STANDARD ISO/IEC 27557:2022(E)

Information security, cybersecurity and privacy protection — Application of ISO 31000:2018 for organizational privacy risk management

1 Scope

This document provides guidelines for organizational privacy risk management, extended from ISO 31000:2018.

This document provides guidance to organizations for integrating risks related to the processing of personally identifiable information (PII) as part of an organizational privacy risk management programme. It distinguishes between the impact that processing PII can have on an individual with consequences for organizations (e.g. reputational damage). It also provides guidance for incorporating the following into the overall organizational risk assessment:

— organizational consequences of adverse privacy impacts on individuals; and

— organizational consequences of privacy events that damage the organization (e.g. by harming its reputation) without causing any adverse privacy impacts to individuals.

This document assists in the implementation of a risk-based privacy program which can be integrated in the overall risk management of the organization.

This document is applicable to all types and sizes of organizations processing PII or developing products and services that can be used to process PII, including public and private companies, government entities, and non-profit organizations.

2 Normative references

The following documents are referred to in the text in such a way that some or all of their content constitutes requirements of this document. For dated references, only the edition cited applies. For undated references, the latest edition of the referenced document (including any amendments) applies.

ISO 31000:2018, Risk management — Guidelines

ISO/IEC 29100, Information technology — Security techniques — Privacy framework

ISO/IEC 27000, Information technology — Security techniques — Information security management systems — Overview and vocabulary

3 Terms and definitions

For the purposes of this document, the terms and definitions given in ISO 31000, ISO/IEC 29100 and ISO/IEC 27000 and the following apply.

ISO and IEC maintain terminology databases for use in standardization at the following addresses:

— ISO Online browsing platform: available at https://www.iso.org/obp

— IEC Electropedia: available at https://www.electropedia.org/


3.1
privacy information management system
PIMS
information security management system which addresses the protection of privacy as potentially affected by the processing of personally identifiable information (PII)

[SOURCE: ISO/IEC 27701:2019, 3.2, modified — the abbreviated term "PII" is expanded as "personally identifiable information".]

3.2
privacy event
occurrence or change of a particular set of circumstances related to personally identifiable information (PII) processing that can cause a privacy impact (3.3) or consequence

3.3
privacy impact
element that has an effect on the privacy of a personally identifiable information (PII) principal and/or group of PII principals

Note 1 to entry: The privacy impact could result from the processing of PII in conformance or in violation of privacy safeguarding requirements.

[SOURCE: ISO/IEC 29134:2017, 3.6, modified — "anything" replaced by "element".]

3.4
consequence
outcome of an event affecting organizational objectives

[SOURCE: ISO 31000:2018, 3.6, modified — "organizational" added and notes to entry removed.]

4 Principles of organizational privacy risk management

The guidance in ISO 31000:2018, Clause 4 and the following additional guidance applies.

For organizational privacy risk management, PII principals should be included as stakeholders, and the actual or potential adverse impact on them should be included when considering risks. Additionally, the organization should consider the potential negative effect on the opinions and attitudes of these stakeholders related to the organization should these adverse impacts occur.

Organizations should identify the norms, societal values, and legal expectations related to individuals' privacy given their cultural context(s). Privacy is a broad and shifting concept that can be filtered through cultural diversity and individual differences. These cultural factors can inform the identification, evaluation, and treatment of privacy risks.

5 Framework

5.1 General

The guidance in ISO 31000:2018, 5.1 applies.

5.2 Leadership and commitment

The guidance in ISO 31000:2018, 5.2 and the following additional guidance applies.

Top management should be aware of privacy issues in order to successfully incorporate privacy considerations into an overall organizational risk management process. This should include awareness of such topics as:

— privacy regulations and laws applicable to the organization;


— privacy obligations the organization has to individuals;

— how processing PII can impact individuals;

— unique concerns, risks, vulnerabilities, impacts, and organizational consequences related to privacy and processing PII.

Where an organization implements a privacy information management system (PIMS) as specified in ISO/IEC 27701, the organization should be aware of and committed to integrating the organizational privacy risk management activities related to the relevant aspects of the PIMS.

5.3 Integration

The guidance in ISO 31000:2018, 5.3 and the following additional guidance applies.

Top management and oversight bodies should ensure that organizational privacy risk management is integrated into the organization's structure, including people, processes, and technology. The integration depends on the operating processes of the organization. Where an organization implements a PIMS, the organizational privacy risk management process should be integrated into the relevant aspects of the PIMS.

5.4 Design

5.4.1 Understanding the organization and its context

The guidance in ISO 31000:2018, 5.4.1 and the following additional guidance applies.

When an organization processes PII, or is developing products or services that process PII, the organization should assess its role related to processing PII (e.g. controller, processor, joint controller, manufacturer, software developer, provider of products that process PII).

Understanding the organization's role relative to processing PII is critical for the effective design of a risk management framework, including accurately identifying and treating privacy risks to the organization.

Where an organization implements a PIMS, this context should align with the context of the management system (ISO/IEC 27701:2019, 5.2.1).

5.4.2 Articulating risk management commitment

The guidance in ISO 31000:2018, 5.4.2 applies.

5.4.3 Assigning organizational roles, authorities, responsibilities and accountabilities

The guidance in ISO 31000:2018, 5.4.3 and the following additional guidance applies.

Top management and oversight bodies should:

— emphasize that risk management related to PII processing is a core responsibility;

— identify individuals who have the accountability and authority to manage risks related to PII processing;

— identify individuals who have the accountability and authority to manage risks related to privacy events that have direct consequences for the organization, even when there are no impacts on PII principals, employees or other stakeholders.

5.4.4 Allocating resources

The guidance in ISO 31000:2018, 5.4.4 and the following additional guidance applies.


When allocating resources for organizational privacy risk management, top management and oversight bodies should consider needs specific to privacy (e.g. internal or external resources with specialized knowledge, skills, abilities and training on privacy issues).

5.4.5 Establishing communication and consultation

The guidance in ISO 31000:2018, 5.4.5 applies.

5.5 Implementation

The guidance in ISO 31000:2018, 5.5 applies.

5.6 Evaluation

The guidance in ISO 31000:2018, 5.6 applies.

5.7 Improvement

5.7.1 Adapting

The guidance in ISO 31000:2018, 5.7.1 applies.

5.7.2 Continually improving

The guidance in ISO 31000:2018, 5.7.2 applies.

6 Risk management process

6.1 General

The guidance in ISO 31000:2018, 6.1 applies.

6.2 Communication and consultation

The guidance in ISO 31000:2018, 6.2 and the following additional guidance applies.

In the context of organizational privacy risk management processes, the following are examples of groups or individuals that can be consulted/communicated with:

— privacy experts;

— persons in charge of privacy matters;

— product and system designers and developers, for goods and services that handle PII;

— PII processing system owners;

— officers or management responsible for PII processing activities and decisions;

— supervisory authorities;

— PII principals or groups of PII principals (e.g. organizations or associations).

Some jurisdictions mandate particular types of consultations for some instances of PII processing, such as consultation of supervisory authorities. In such cases, the organization should identify its obligations for consultations and demonstrate that it complies with them in a timely manner.


6.3 Scope, context and criteria

6.3.1 General

The guidance in ISO 31000:2018, 6.3.1 applies.

6.3.2 Defining the scope

The guidance in ISO 31000:2018, 6.3.2 and the following additional guidance applies.

The scope of the organizational privacy risk management process should include:

— PII processing;
— products and services that can be used to process PII.

Where an organization implements a PIMS, the scope of the risk assessment should reflect that of the defined scope of the management system (ISO/IEC 27701:2019, 5.2.3).

6.3.3 External and internal context

The guidance in ISO 31000:2018, 6.3.3 and the following additional guidance applies.

Organizational factors can be a source of risk and can have consequences for the organization without adversely affecting individuals (e.g. a public statement about privacy from top management that may affect perceptions of the organization).
6.3.4 Defining risk criteria

The guidance in ISO 31000:2018, 6.3.4 and the following additional guidance applies.

The organization should define the risk criteria that guide the outcomes of the risk assessment results. This may include what types of measures are used (qualitative vs. quantitative), the formula or methods used to determine the risk, and the management actions for levels of risk.

In relation to organizational privacy risk, these criteria should include how privacy impact on individuals will be defined and measured, as well as how the privacy impact on individuals factors into the organization's overall risk calculation. Furthermore, risk-based assessments of factors influencing the organization directly, due to adverse privacy events that do not have impacts on PII principals, should also be considered (Annex B provides some examples of privacy events in Table B.1 and causes of privacy events in Table B.2). Potential criteria to be defined for organizational privacy risk should include:

— how organizational consequences will be defined and measured;
— how privacy impact on individuals will be defined and measured;
— positive or negative consequences for the organization;
— positive and negative privacy impacts to PII principals.

In order to help with the decision process, the risk evaluation criteria should consider the necessary balance between:

— opportunities for the organization;
— risks to the organization (consequences regarding reputation, fines, trials);
— risks to PII principals (privacy impacts on physical, material and non-material aspects).

For example, there can be a business opportunity that leads the organization to process new PII, with a very low risk for PII principals, but a very high reputational risk to the organization.
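Criteria of this kind can be made concrete as, for example, a simple qualitative lookup matrix. The following Python sketch is illustrative only and not part of the standard; the level names, risk labels and management actions are assumptions chosen for the example.

```python
# Illustrative sketch: qualitative risk criteria as a lookup matrix.
# Level names, risk labels and actions are assumptions, not defined by the standard.
RISK_MATRIX = {
    # (organizational consequence, privacy impact on individuals) -> risk level
    ("low", "low"): "acceptable",
    ("low", "high"): "treat",
    ("high", "low"): "treat",
    ("high", "high"): "unacceptable",
}

ACTIONS = {
    "acceptable": "retain and monitor",
    "treat": "select treatment options",
    "unacceptable": "avoid or modify the processing",
}

def evaluate(consequence: str, impact: str) -> tuple[str, str]:
    """Map a (consequence, impact) pair to a risk level and the
    management action defined for that level."""
    level = RISK_MATRIX[(consequence, impact)]
    return level, ACTIONS[level]
```

Under these assumed criteria, the example above (very low risk for PII principals but very high reputational risk) maps to the pair ("high", "low") and still calls for treatment.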

Table C.1 provides examples of privacy impacts to individuals.

NOTE ISO 31000 notes that risk is usually expressed in terms of risk sources, potential events, their consequences and their likelihood. In this document, "consequences" refers to consequences to the organization, while "privacy impact" refers to the impact on PII principals.

6.4 Risk assessment

6.4.1 General

The guidance in ISO 31000:2018, 6.4.1 applies.

6.4.2 Risk identification

6.4.2.1 General

The guidance in ISO 31000:2018, 6.4.2 and the following additional guidance applies.

There are a number of approaches for identifying risks in an organization. Two common ones are the event-based approach and the asset-based approach. Organizations can utilize elements of both approaches to fit their operational context. 6.4.2 outlines the elements necessary for risk identification related to the processing of PII, as well as details on the event-based and asset-based approaches.

6.4.2.2 Identification of PII processing

The organization should identify PII processing that falls within the scope of the risk management process as defined in 6.3.2, including potential PII processing by products and services. Understanding PII processing and its relative criticality and value is integral to identifying risks and assessing the privacy impact on individuals and the consequences for the organization.

Organizational privacy risk management has the following considerations when identifying PII processing:

a) PII processing activities (or types of PII processing activities) and PII processed (or types of PII);
b) the assets on which they rely;
c) categories of individuals whose PII is being processed (e.g. customers, employees);
d) purposes of the PII processing;
e) individuals or personnel processing PII;
f) role of internal and external entities engaged in the PII processing.

For evaluation of PII processing and the relation to privacy impact and consequences, see 6.4.2.7 and 6.4.3.3. For examples of considerations related to the identification of PII processing, see Annex A.
6.4.2.3 Event-based approach

6.4.2.3.1 General

An event-based approach to identifying risks is a high-level examination of risk sources and the potential scenarios that can play out based on those risk sources. Scenarios should be built by identifying the different paths, within the system, that risk sources can use to reach the PII processing, the PII, and their target objectives.

6.4.2.3.2 Identification of privacy events

As part of an event-based approach, the organization should identify (potential) privacy events in order to construct scenarios that help to identify risks. Privacy events may originate from the context within which the PII processing takes place, and involve people, processes, and technology.

Common privacy events can often be found in catalogues or can be identified through interviews with internal stakeholders, such as asset owners and employees with privacy responsibilities. Privacy events that adversely affect the organization should be described by identifying the different stakeholders and how the event affects them or is perceived by them.

Scenarios that help to identify privacy risks can be built by deduction by asking the following question:

— Given the PII processing identified in 6.4.2.2, what contextual considerations can exist that can affect a privacy event?

Examples of privacy events and their causes are described in Annex B, and contextual considerations that can affect a privacy event are described in Annex D.

NOTE Privacy events can have consequences for the organization even in the absence of adverse impacts on PII principals.

6.4.2.4 Asset-based approach

6.4.2.4.1 General

An asset-based approach to risk identification involves looking closely at organizational assets and identifying threats and vulnerabilities that can affect those assets.

Examining assets, along with potential threats and vulnerabilities, can help the organization with a more detailed and targeted risk identification process.

Scenarios can be built by identifying the different paths within the systems and data flows (on which the PII and PII processing activities rely), that risk sources can use to reach the PII processing, the PII, and their target objectives.

6.4.2.4.2 Identification of PII processing assets

To accurately identify risks via asset-based scenarios, the organization should identify assets related to PII processing. Assets can include, but are not limited to:

— the PII itself;
— systems, software applications, databases and firmware that process PII;
— hardware used by systems that process PII (client, servers, mobile computing devices, IoT devices, external and portable memory devices, etc.);
— organizational attributes, such as reputation, that can be damaged by an adverse privacy event.

Once assets are identified, the organization can consider threats and vulnerabilities, both of which can be related to or originate from:

— the organization itself;
— processes and procedures;
— management routines;
— personnel;
— system configuration;
— technical or organizational measures or lack thereof;
— hardware, software, or services (e.g. websites, email);
— external parties;
— physical environment (e.g. physical archives, physical documents).

NOTE Common information security-related vulnerabilities are described in ISO/IEC 27005.

6.4.2.5 Identification of existing controls

The organization should identify controls relevant to processing PII. Controls may be previously documented (in internal systems, procedures, audit reports, etc.) or identified during the risk management activities. Those organizations implementing a PIMS can also refer to the controls selected as part of the required statement of applicability.

6.4.2.6 Identification of biases, assumptions and beliefs

The organization should identify and evaluate the biases, assumptions, and beliefs of those involved with the PII processing or design of products and services that process PII, as well as how these can affect the data analytic inputs and outputs used in PII processing.
6.4.2.7 Identification of privacy impacts and consequences

As part of organizational privacy risk management, the organization should identify privacy impacts on individuals whose PII is being processed, as well as consequences to the organization itself. Impacts or consequences are the results of a privacy event.

Consequences for the organization necessarily differ from those of privacy impacts to individuals. Consequences to organizations can be, for example:

— investigation and repair time;
— customer abandonment of products or services;
— harm to internal organizational culture;
— (work)time lost;
— opportunity lost;
— financial cost of specific skills to repair the damage;
— impairment of image, reputation, and goodwill;
— regulatory fines;
— loss of physical assets.

Privacy impacts on individuals are an externality for the organization. Therefore, it can be difficult for the organization to exactly estimate the impact on each individual. Rather than specifying each type of impact, this can be considered generally as an impact on an individual's privacy, with the degree of impact being the critical element (see 6.4.3.3 and Annex D). Organizations that perform privacy impact assessments as part of a PIMS or to comply with regulatory or legal requirements can utilize these for help in identifying the privacy impacts.

Annex C provides examples of privacy impacts in Table C.1 and consequences in Table C.2.


6.4.3 Risk analysis

6.4.3.1 General

The guidance in ISO 31000:2018, 6.4.3 and the following additional guidance applies.

6.4.3.2 Risk analysis approaches

Risk analysis approaches typically include a risk analysis process, a risk model, and an analytic approach.

A risk analysis process describes the steps for carrying out a risk analysis (e.g. prepare for the analysis, conduct the analysis, communicate analysis and results, and maintain the analysis).

A risk model defines the risk factors (e.g. threats, impacts, consequences, vulnerabilities, controls) to be assessed and the relationships among those factors. If not using a pre-defined risk model, the organization defines which risk factors will be assessed and the relationships among them.

An analytic approach includes the analysis type (i.e. quantitative, qualitative, semiquantitative) and analysis approach (i.e. threat-oriented, asset/impact-oriented, vulnerability-oriented), which help the organization determine how, with what level of detail, and in which form risk factors are to be analysed.

6.4.3.3 Assessment of privacy impact and consequence

When assessing the privacy impacts and consequences identified in the risk assessment, the organization should distinguish between an assessment of consequences to the organization and a privacy impact assessment for individuals.

Business analyses should determine the degree to which the organization can be affected by an adverse effect of PII processing or privacy events and consider elements including but not limited to:

— value (qualitative or quantitative) or criticality of product or service;
— threats, vulnerabilities, and privacy events applicable to PII processing activities;
— tangible consequences such as immediate and medium-term costs (e.g. costs for the immediate workaround and for the final solution, fines);
— intangible consequences (e.g. to the brand), that can have long-term costs, loss of customers;
— criteria used to establish the overall consequence (as determined in 6.3.4);
— existing protections and mitigating controls around PII (e.g. de-identification).

Privacy consequences reflect the extent to which there can be a direct adverse effect, or cost, to the organization as a result of a privacy event.

Privacy impact reflects the degree to which PII principals can be affected by privacy events. Privacy impact analyses may be performed as part of the privacy risk assessment, or they can be standalone processes (e.g. as part of a PIMS or as required by regulation or law). The organization should consider elements such as:

— types of PII processed;
— amount of PII processed;
— use of the PII that is processed (purpose for the processing);
— protections and mitigating controls around PII (e.g. de-identification);
— jurisdictional and cultural environment of the PII principal (which can affect how the relative impact is determined).


After considering the potential impacts on PII principals and consequences for the organization, the criteria for performing the risk assessment (6.3.4) can be used to estimate the level of impact and consequence of each privacy event, in order to rank them.

An example of a scale for privacy impacts is provided in Annex D and a template for impact level in Table D.1.
6.4.3.4 Assessment of likelihood

The organization should assess the likelihood of privacy events. Likelihood can be determined on a qualitative or quantitative scale and should align to the criteria established as part of 6.3.4. Likelihood can be affected by (but not limited to):

— organizational factors (e.g. geographic location, the public perception about participating organizations with respect to privacy);
— system factors [e.g. the nature and history of individuals' interactions with the system; visibility of data processing to individuals and third parties; types, severity, and number of vulnerabilities; frequency, severity, and pervasiveness of threats; success (mitigation) or failure of controls (6.4.2.4)]; or
— individuals' factors (e.g. individuals' demographics, privacy interests or perceptions, data sensitivity).
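Estimating impact, consequence and likelihood per privacy event, and then ranking events, can be sketched as follows. This Python fragment is illustrative only and not part of the standard; the 1 to 4 scales and the multiplicative scoring formula are assumptions, since each organization defines its own criteria in 6.3.4.

```python
# Illustrative sketch: semiquantitative ranking of privacy events.
# Scales and formula are assumptions, not defined by the standard.
from dataclasses import dataclass

@dataclass
class PrivacyEvent:
    name: str
    impact: int        # privacy impact on PII principals, 1 (low) .. 4 (severe)
    consequence: int   # consequence to the organization, 1 .. 4
    likelihood: int    # 1 (rare) .. 4 (frequent)

    @property
    def risk_score(self) -> int:
        # Combine the higher of impact/consequence with likelihood.
        return max(self.impact, self.consequence) * self.likelihood

def rank(events: list[PrivacyEvent]) -> list[PrivacyEvent]:
    """Order privacy events from highest to lowest risk score."""
    return sorted(events, key=lambda e: e.risk_score, reverse=True)
```

Taking the maximum of impact and consequence reflects one possible policy choice: a severe outcome for either the PII principals or the organization dominates the score even when the other is low.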

6.4.4 Risk evaluation

The guidance in ISO 31000:2018, 6.4.4 and the following additional guidance applies.

When evaluating the results of the privacy risk analysis against the established risk criteria, the organization should consider the following when determining the response to risk:

— the degree of acceptable risk, which can vary depending on the context, including the applicable legislation;
— whether findings in the risk analysis affect the obligations to PII principals, including those mandated by applicable legislation.

6.5 Risk treatment

6.5.1 General

The guidance in ISO 31000:2018, 6.5.1 applies.

6.5.2 Selection of risk treatment options

6.5.2.1 General

The guidance in ISO 31000:2018, 6.5.2 and the following additional guidance applies.

When selecting risk treatment options, the organization should consider options not only in the context of their own organization but also in relation to the privacy of individuals. For example, if a risk is identified based on a high impact on the individual and consequence to the organization, that risk may be treated by enhanced privacy by design controls (e.g. encryption, pseudonymization), thus enhancing privacy protections for individuals, as well as protecting the organization from reputational damage or regulatory fines.

If implementing a privacy information management system (PIMS), such as in ISO/IEC 27701, the organization should use the risk treatment process to help guide the selection of controls to be included in the statement of applicability.


6.5.2.2 Risk modification

Controls can be added, removed, or modified to reduce the likelihood of risk to what is acceptable to the organization. Controls applied to privacy risk can be informed by local laws and regulations, as well as best practices. ISO/IEC 27701 provides controls that may be selected by an organization to implement a PIMS based on a risk approach.

6.5.2.3 Risk retention

Risk should only be retained if it meets the baseline criteria for risk acceptance. Those criteria should be determined by the organization based on the context.

6.5.2.4 Risk avoidance

The organization can choose to avoid an identified risk by modifying or cancelling activities (or planned activities) that create the risk. In the context of processing PII, this can include, for example, choosing not to process a particular type of PII.

6.5.2.5 Risk sharing

Risk can be shared between the organization and an external party, which can include a business partner, subcontractor, or even a customer. In the context of PII processors, for example, the processor may offer a service for which the customer has control over certain aspects of risk (e.g. whether to enable encryption or require two-factor authentication). When sharing risk, the organization should consider new or additional risks created by the external relationship.

For example, contracts are a means of sharing or transferring risk to other organizations (such as through an insurance policy).

6.5.3 Preparing and implementing risk treatment plans

The guidance in ISO 31000:2018, 6.5.3 and the following additional guidance applies.

Privacy risk treatment plans and the responsibilities for their implementation should be reviewed and approved by management.

6.6 Monitoring and review

The guidance in ISO 31000:2018, 6.6 and the following additional guidance applies.

The organization should monitor for changes that may affect privacy risk. These changes may necessitate the review of organizational privacy risk management processes and risk treatment plans for changes or improvements. The organization can monitor, for example:

— the organization's business environment (e.g. the introduction of new technologies);
— new or altered legal obligations;
— changes to risk tolerance;
— new or altered PII processing or IT systems;
— new privacy events, threats, and vulnerabilities disclosed to the organization from internal or external sources;
— continued effectiveness of planned mitigation activities;
— the decisions of the relevant supervisory authorities.

6.7 Recording and reporting

The guidance in ISO 31000:2018, 6.7 and the following additional guidance applies.

Both organizations and individuals may need to know how PII is processed in order to manage privacy risk effectively. Through internal and external reporting on PII processing, associated privacy risks, and risk treatments, the organization can increase transparency, establish a more reliable understanding of PII processing activities, and build trust.

Organizational privacy risks should be reported to top management, including a summary risk assessment and risk management report.

NOTE External reporting is usually different from internal (e.g. a summary report instead of a detailed report).
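A summary report to top management of the kind described above could, for instance, be generated from a risk register. The following Python sketch is illustrative only and not part of the standard; the record fields and report wording are assumptions.

```python
# Illustrative sketch: rendering a one-line-per-risk internal summary
# for top management, ordered by risk level (highest first).
# Field names ("event", "level", "treatment") are assumptions.
def summary_report(risks: list[dict]) -> str:
    """Render a short summary of assessed privacy risks and their
    planned treatments, highest risk level first."""
    ordered = sorted(risks, key=lambda r: r["level"], reverse=True)
    lines = ["Organizational privacy risk summary:"]
    for r in ordered:
        lines.append(f"- {r['event']} (level {r['level']}): {r['treatment']}")
    return "\n".join(lines)
```

An external report would typically carry less detail than this internal one, consistent with the NOTE above.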


Annex A
(informative)

PII processing identification

Identifying the specific PII and processing enables an organization to more accurately assess the privacy impact on individuals and consequences to the organization itself should privacy events or threats materialize. Several key elements can be inventoried to illustrate or map out the organization's PII processing and its context to support organizational privacy risk management:

— PII processing activities (e.g. collection, storage, alteration, retrieval, consultation, disclosure, anonymization, pseudonymization, dissemination or otherwise making available, deletion, or destruction);
— types of PII (see Table A.1);
— purposes of the PII processing (see Table A.2);
— PII processing environment (e.g. geographic location, internal cloud, third parties);
— assets on which PII processing activities rely (e.g. systems/products/services that process PII);
— categories of individuals whose PII is being processed (e.g. customers, employees, prospective employees, or consumers);
— individuals or personnel processing PII (e.g. the organization or third parties such as service providers, partners, customers, and developers);
— role of entities engaged in PII processing (e.g. internal or external).

Distinguishing types and uses of PII can help organizations further refine privacy impacts and consequences. Different data types and uses can have higher or lower relative values or sensitivity depending on the context, and organizations can choose to assign sensitivity "ratings" depending on this context. Organizations can utilize published taxonomies to assist in identifying types and uses of PII (for example, a taxonomy for cloud computing and distributed platforms is found in ISO/IEC 19944-1) or use existing classification systems internal to the organization. Table A.1 provides examples of PII data types, while Table A.2 provides example uses of PII data.
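An inventory of the elements listed above could be held as structured records. The following Python sketch is illustrative only and not part of the standard; the field names and the sensitivity rating scale are assumptions.

```python
# Illustrative sketch: a minimal record for inventorying one PII
# processing activity. Field names and the 1..4 sensitivity scale
# are assumptions, not defined by the standard.
from dataclasses import dataclass

@dataclass
class PIIProcessingRecord:
    activity: str                    # e.g. "collection", "storage"
    pii_types: list[str]             # e.g. ["contact data", "location data"]
    purposes: list[str]              # e.g. ["provide a service"]
    environment: str                 # e.g. "internal cloud"
    assets: list[str]                # systems/products/services relied on
    principal_categories: list[str]  # e.g. ["customers", "employees"]
    processors: list[str]            # personnel or third parties
    entity_role: str                 # "internal" or "external"
    sensitivity: int = 1             # assumed rating, 1 (low) .. 4 (high)

def high_sensitivity(records: list[PIIProcessingRecord],
                     threshold: int = 3) -> list[PIIProcessingRecord]:
    """Filter the inventory to records at or above a sensitivity threshold."""
    return [r for r in records if r.sensitivity >= threshold]
```

Such records can then feed the risk identification step, for example by surfacing the highest-sensitivity processing activities first.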


Table A.1 — Example PII data types

— Civil status, identity, identification data (social security or other identification number)
— Contact data (phone number, address, e-mail, etc.)
— Personal life (living habits, marital status, philosophical, political, religious and trade-union views, sex life, health data, racial or ethnic origin, offences/convictions, etc. – excluding sensitive or dangerous data)
— Professional life (résumé, education and professional training, awards, etc.)
— Economic and financial information (bank data, income, financial situation, tax situation, etc.)
— Connection data (IP addresses, event logs, etc.)
— Location data (travels, GPS data, GSM data, etc.)
— Credentials (PIN, passwords, knowledge-based authentication data, etc.)
— Biometric data

Table A.2 — Example uses of PII data

— Provide a service or product (e.g. fulfilling an order, processing a bank deposit, business process execution)
— Improve a service or product (e.g. to increase the quality)
— Personalize a service or product (e.g. to tailor to a particular user)
— Offer upgrades/upsell a service or product (e.g. for additional products or services, or additional capabilities of existing products or services)
— Market/advertise/promote a service or product
— Share (e.g. share or transfer PII to one or more third-parties)
— Design or develop a service or product
— Develop data models or train AI systems
— Prevent fraud


Annex B
(informative)

Example privacy events and causes

Table B.1 provides a non-exhaustive list of privacy events that may create impacts or consequences for an organization, regardless of whether these items have an impact on the privacy of PII principals in a given situation.

Table B.1 — Privacy event examples

Appropriation: PII is used in ways that exceed an individual's expectation or authorization (e.g. implicit or explicit). Appropriation includes scenarios in which the individual would have expected additional value for the use given more complete information or negotiating power. Privacy impacts due to appropriation can lead to loss of trust, loss of autonomy, and economic loss.

Distortion: Inaccurate or misleadingly incomplete PII is used or disseminated. Distortion can present individuals in an inaccurate, unflattering, or disparaging manner, opening the door for stigmatization, discrimination, or loss of liberty.

Induced disclosure: Induced disclosure can occur when individuals feel compelled to provide information disproportionate to the purpose or outcome of the transaction. Induced disclosure can include leveraging access or rights to an essential (or perceived essential) service. It can lead to impacts such as discrimination, loss of trust, or loss of autonomy.

Insecurity: Lapses in PII security can result in various privacy impacts, including loss of trust, exposure to economic loss and other identity theft-related harms, and dignity losses.

Re-identification: De-identified PII, or PII otherwise disassociated from specific individuals, becomes identifiable or associated with specific individuals again. It can lead to impacts such as discrimination, loss of trust, or dignity losses.

Stigmatization: PII is linked to an actual identity in such a way as to create a stigma that can cause dignity losses or discrimination. For example, transactional or behavioural PII such as the accessing of certain services (e.g. food stamps or unemployment benefits) or locations (e.g. health care providers) may create inferences about individuals that can cause dignity losses or discrimination.

Surveillance: PII, devices, or individuals are tracked or monitored in a manner disproportionate to the purpose. The difference between benign PII processing and the privacy event of surveillance can be narrow. Tracking or monitoring may be conducted for operational purposes such as cybersecurity or to provide better services, but it can become surveillance when it leads to privacy impacts such as discrimination; loss of trust, autonomy, or liberty; or physical harm.

Unanticipated revelation: PII reveals or exposes an individual or facets of an individual in unexpected ways. Unanticipated revelation can arise from aggregation and analysis of large or diverse data sets. Unanticipated revelation can give rise to dignity losses, discrimination, and loss of trust and autonomy.

Unwarranted restriction: Unwarranted restriction includes not only blocking access to PII or services, but also limiting awareness of the existence of PII or its uses in ways that are disproportionate to operational purposes. Operational purposes may include fraud detection or other compliance processes. When individuals do not know what PII an entity has or can make use of, they do not have the opportunity to participate in decision-making. Unwarranted restriction also diminishes accountability as to whether the data are appropriate for the entity to possess, or whether it will be used in a fair or equitable manner. Lack of access to PII or services can lead to privacy impacts in the loss of self-determination category, loss of trust, and economic harm.
Table B.2 provides a non-exhaustive list of potential causes of privacy events that may create impacts or consequences for an organization, regardless of whether these items have an impact on the privacy of PII principals in a given situation. The indicated potential causes have been derived from privacy principles listed in ISO/IEC 29100 and privacy threats listed in ISO/IEC TR 27550.

Table B.2 — Potential causes of privacy events

Lack of lawfulness, fairness and transparency: PII is not processed lawfully or fairly, or is processed in a hidden manner in relation to the data subject.

Lack of purpose limitations: PII is not collected for specified, explicit or legitimate purposes, or is further processed in a manner that is incompatible with those purposes.

Lack of data minimization: The collection of PII is not adequate, relevant or limited to what is necessary in relation to the purposes for which they are processed.

Lack of accuracy: PII is not accurate or, where necessary, kept up to date.

Lack of storage limitations: PII is kept in a form which permits identification of data subjects for longer than is necessary for the purposes for which the PII is processed.

Identification: The link between an identity and a piece of information is not sufficiently hidden, so that identities can be revealed when some effort is invested. A similar scenario is related to linkability: several items of anonymized or pseudonymised data can be linked together to reveal the identity of a person. While anonymous data does not match any particular person within a set of people, this set can be narrowed down to a single person when several data sets are combined.

Non-repudiation: Data can be used to prove that a particular person has committed a particular action.

Detection of existence: Uninvolved parties can determine if a particular item of interest (such as a data record, e.g. a criminal record, an action, an event) exists or not. The knowledge of the existence of such an item can often damage the PII principal and be used to infer more information about a person.

Unawareness: An internal person owns or discloses PII of PII principals, without being aware of the nature of the information, or of the legal implications.


Annex C
(in fo rmative)

Privacy impact and consequence examples

Table C .1 provide s non- exh au s tive exa mple s o f privac y i mp ac ts on i nd ividua l s th at c an ari s e from
privac y events .

Table C .1 — Privacy impacts on individuals

Impact                     Description

Dignity loss               Includes embarrassment and emotional distress.

Discrimination             Unfair or unethical differential treatment of individuals, whether singly or as a group, arising from the processing of data.

Economic loss              Can include direct financial losses as the result of identity theft or the failure to receive fair value in a transaction.

Loss of self-determination
                           Loss of autonomy: Includes losing control over determinations about information processing or interactions with systems/products/services, as well as needless changes in ordinary behaviour, including self-imposed restrictions on expression or civic engagement.
                           Loss of liberty: Incomplete or inaccurate data can lead to improper exposure to arrest or detainment. Improper exposure or use of information can contribute to abuses of governmental power.
                           Physical harm: Physical harm or death.

Loss of trust              The breach of implicit or explicit expectations or agreements about the processing of data. These breaches can diminish morale or leave individuals reluctant to engage in further transactions, potentially creating larger economic or civic consequences.

Table C.2 provides non-exhaustive examples of privacy consequences to the organization that can arise from privacy events.

Table C.2 — Consequences for organizations

Impact                     Description

Noncompliance costs        Regulatory fines, litigation costs, remediation costs.

Direct business costs      Revenue or performance loss from customer abandonment or avoidance.

Damage to reputation       Brand damage, loss of customer trust.

Harm to internal organizational culture
                           Impact on capability of organization/unit to achieve vision/mission, impact on productivity/employee morale stemming from conflicts with internal cultural values or ethics.


Annex D
(informative)

Template showing the severity scale for privacy impacts on individuals

Organizations can use the template shown in Table D.1 as a guide for estimating the severity of privacy impacts on individuals. Severity of privacy impacts can vary greatly depending on the context, including cultural values and norms. The specific context surrounding the processing of PII can be helpful in guiding the estimation of privacy impacts as relatively high or low severity. Organizations can use the following considerations, and others as desired, to complete the context column and estimate the severity of privacy impacts to individuals.

Some contextual considerations for estimating the severity of privacy impacts to individuals include (see also 6.4.3.3):

— degree of PII sensitivity, including specific elements or collectively;

— types of PII being processed, and number of different types (e.g. more types being processed can result in a greater impact);

— uses of the PII being processed (e.g. more uses of PII can result in a greater impact);

— whether data are de-identified (e.g. anonymized, pseudonymized, etc.);

— demographics and privacy interests or perceptions of individuals;

— duration or frequency of processing activities;

— visibility of processing activities to individuals and third parties;

— number of PII principals;

— quantity of PII being processed.

Table D.1 — Impact level template

Impact Level     Description                                                                   Context

1. Negligible    PII principals either are not affected or may encounter a few inconveniences, which they can overcome without any problem.

2. Limited       PII principals may encounter significant inconveniences, which they are able to overcome despite a few difficulties.

3. Significant   PII principals may encounter significant impacts, which they should be able to overcome albeit with real and serious difficulties.

4. Maximum       PII principals may encounter significant, or even irreversible, impacts, which they may not overcome.
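As an illustration of how the contextual considerations above can feed the context column, the following sketch combines a few factors into an estimate on the four-level scale of Table D.1. The factor names, weights and thresholds are assumptions made for this example only; they are not defined by this document, and organizations are expected to choose their own criteria.

```python
# Illustrative aid for estimating the severity of privacy impacts on
# individuals, loosely following the contextual considerations of Annex D.
# All factor names, weights and thresholds are assumptions for this sketch.

def estimate_severity(sensitivity: int, num_pii_types: int, num_uses: int,
                      de_identified: bool, num_principals: int) -> int:
    """Return an impact level from 1 (negligible) to 4 (maximum)."""
    score = sensitivity                 # e.g. 0 (low) to 3 (highly sensitive)
    score += min(num_pii_types, 3)      # more types can mean a greater impact
    score += min(num_uses, 3)           # more uses can mean a greater impact
    if num_principals > 1000:           # scale of processing
        score += 2
    if de_identified:                   # de-identification reduces severity
        score -= 2
    # Map the raw score onto the four levels of Table D.1.
    if score <= 2:
        return 1    # negligible
    if score <= 5:
        return 2    # limited
    if score <= 8:
        return 3    # significant
    return 4        # maximum

print(estimate_severity(sensitivity=3, num_pii_types=4, num_uses=2,
                        de_identified=False, num_principals=50000))  # prints 4
```

The additive weighting is only one possible design; a worst-case ("maximum of factors") rule can be preferable where a single factor, such as high PII sensitivity, should dominate the estimate.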


Bibliography

[1] ISO/IEC 27701, Security techniques — Extension to ISO/IEC 27001 and ISO/IEC 27002 for privacy information management — Requirements and guidelines

[2] ISO/IEC 29134, Information technology — Security techniques — Guidelines for privacy impact assessment

[3] ISO/IEC 19944-1, Cloud computing and distributed platforms — Data flow, data categories and data use — Part 1: Fundamentals

[4] ISO/IEC 27005, Information security, cybersecurity and privacy protection — Guidance on managing information security risks

[5] ISO/IEC TR 27550, Information technology — Security techniques — Privacy engineering for system life cycle processes


ICS 35.030
Price based on 19 pages
