Coassurance KULeuven Distrinet


IMPACT OF

EU SECURITY REGULATION
AND STANDARDS
ON THE DEVELOPMENT OF
DIGITAL SYSTEMS,
AND THE INTERPLAY WITH
SAFETY CONCERNS.
Basic concepts and guidelines
Index

Introduction

1. EU Cybersecurity Regulation
   The Road Through History Towards a Cybersecure Europe
   The Cybersecurity Act (CSA)
   NIS Directive
   RED Delegated Act (RED-DA)
   Cyber Resilience Act (CRA)
   Summary

2. Representative Cybersecurity Standards
   Guidelines versus Standards
   IEC 62443: Overview
   IEC 62443-3: System Concerns
   IEC 62443-4: Component Concerns
   ETSI EN 303 645
   Common Criteria
   NIST

3. Co-engineering Cybersecurity and Safety
   Brief Introduction to Functional Safety and its Standards
   Co-engineering Cybersecurity and Safety
   Security & Safety Standards and Guidelines
   Challenges
   Practical Implementation Notes
   Conclusion

Closing Notes
Disclaimer
Sources and References

Introduction
The increasing adoption of digital technology has transformed the way we live, work,
and interact with the world. This digital revolution brings innovation and convenience.
However, it comes at a cost. The security of digital systems, encompassing both
consumer IoT and Industrial IoT (IIoT), has proven to be inadequate. In fact, security
has become the most important concern for many manufacturers in the digital age.
The ever-evolving landscape of cyber threats together with the increased dependency
on digital systems have created an urgent need for improving the security of digital
systems.

In this context, manufacturers find themselves at a crossroads. On the one hand, they
lack cybersecurity knowledge and experience. On the other hand, they face the
challenge of attracting and retaining skilled professionals. The risks are quickly
increasing, and the consequences are far-reaching. Both the impact and the probability
of cyber incidents increase due to multiple factors.
• The impact of cyber incidents increases for several reasons. First, modern
society heavily relies on digital systems, from smart homes to critical infrastructure.
This digital dependence amplifies the consequences of security breaches. Second,
the financial cost of security breaches can be substantial, not just in terms of
immediate monetary loss but also in terms of long-term damage to reputation and
customer trust. Finally, the Internet of Things (IoT) has seen an explosive
proliferation of interconnected devices. While this connectivity brings convenience,
it also expands the attack surface for cybercriminals.
• The probability increases for two major reasons. First, as digitalization spreads
across all industries, the likelihood of cyberattacks increases proportionally.
Second, the financial incentives for hacking are growing, making cybercrime a
lucrative endeavor.

Manufacturers face severe risks due to the higher likelihood and magnitude of
breaches. As a testament to these concerns, the current status of IoT in the European
Union (EU) reveals a pressing need for enhanced security measures. Let us give a few
examples:
• The average EU household now boasts 25 connected devices.
• On average, eight IoT-related attacks are recorded per home network every 24
hours.
• Vulnerable IoT devices span a range of household items, including TVs, smart plugs,
routers, DVRs, extenders, and IP cameras.

The need for robust security in digital systems is clear, yet implementing
comprehensive security measures in manufacturing is a costly and intricate endeavor.
As a result, new systems still lack proper security measures.

In the meantime, over the past decade, the European Union has witnessed a clear
acceleration in regulatory activity in the field of cybersecurity to tackle these issues.
The regulations push organizations to raise the bar when it comes to digital security. In
fact, from August 2025, manufacturers will have to consider legal requirements for
cybersecurity measures in design and production to gain approval for placing their
products on the EU market.

This, however, presents huge challenges. Understanding the relevant security standards
and regulations, and their impact on the development process and the organization, are
some of the hurdles that manufacturers must overcome.

Furthermore, the intricate relationship between cybersecurity and safety is
characterized by complexity and frequent conflicts. For instance, cybersecurity expects
fast response handling: implement and deploy a patch as soon as possible, to reduce the
time for potential attackers to exploit a vulnerability. On the contrary, guaranteeing
a threshold safety level can be a labor-intensive process. Only when all necessary
steps have been taken and documented, can certification occur (if necessary) and
the product be put on the market. When a change is required -- either a bug fix or
applying a security patch -- its impact must be analyzed with respect to the product
safety. Sometimes, a re-certification is even necessary. The outcome of this analysis
can be to disregard the change because its benefits do not outweigh the cost, or to
delay it so it can be implemented in an already planned major update. To tackle the
above-mentioned concerns, increased attention is given to the co-engineering of safety
and cybersecurity.

This document delves into the multifaceted realm of the EU's safety and security
regulations and standards. It explores the foundational concepts and offers practical
insights that help manufacturers navigate this evolving landscape successfully. As
digital transformation continues to reshape our world, the safety and security of digital
systems stand as paramount concerns, and this document aims to serve as a valuable
resource in addressing these challenges.

1.
EU Cybersecurity
Regulation

The Road Through History Towards a Cybersecure Europe

In recent years, the European Union has addressed the growing cybersecurity
challenges. Particularly in the past decade, the activities have clearly accelerated. This
section provides an overview of the evolution of the EU’s approach to cybersecurity
as it impacts the manufacturing of digital devices.

Before 2001: A scattered landscape. Prior to 2001, the landscape was a maze of
non-binding and legally binding instruments. The EU lacked a strong and unified
approach to cybersecurity, and there was no responsible agency.

2001: Network and Information Security: Proposal for an EU Policy Approach. In
2001, the European Union presented the ‘Network and Information Security
Proposal’. Due to the Maastricht Treaty of 1992, also known as the Treaty on European
Union, the scope of the European Commission was limited to issues of an economic
character. As a result, the proposal was highly prescriptive, with a predominantly
economic perspective.
2004: Founding ENISA. The European Network and Information Security Agency
(ENISA) was established in 2004. It aimed at developing an elevated level of network
and information security and a cybersecurity culture. It serves as a center of expertise
for sharing experiences, guidelines, and best practices.

2007: The Estonian Cyberattack and the Emergence of Cyberwar. In 2007, a pivotal
event occurred when cyberattacks were launched against Estonia. While their material
impact was limited, they had a major impact on politics. These attacks showed the
potential of state actors to disrupt the market and the potential to use cyberspace for
war and terrorism. They thrust cybersecurity into the political domain.

2007-2013: Increased Efforts to Secure Cyber Infrastructure. In subsequent years,
efforts were increased to secure critical cyber infrastructure, with an emphasis on
Critical Information Infrastructure Protection (CIIP). Political cooperation and
information exchange were improved through the new European Forum for Member
States (EFMS).

2013-2020: A European Cybersecurity Strategy. In 2013, the European Cybersecurity
Strategy was introduced, focusing on raising awareness, enhancing cyber resilience,
and developing cyber defense capabilities. It led, among others, to the creation of the
NIS Directive in 2016.

2020: Cybersecurity Becomes a Top Priority for the EU’s Future. An updated
Cybersecurity Strategy (2020) delved even deeper into addressing cybersecurity
challenges. It further boosted the security of essential services and connected devices,
enhanced support for SMEs, secured 5G networks, and established ambitious
standards for the Internet of Things (IoT).

In summary, the road to a cybersecure Europe has been marked by a series of
milestones and pitstops in response to the evolving cybersecurity challenges. The EU
has transitioned from a scattered approach to a more cooperative and proactive stance,
acknowledging the importance of cybersecurity in an interconnected and digitalized
world.
The Cybersecurity Act (CSA)

The Cybersecurity Act, adopted in 2019, consists of two key components:

A EUROPEAN CYBERSECURITY CERTIFICATION FRAMEWORK

This part aims at developing EU cybersecurity schemes for ICT products and services
that define a set of rules, technical requirements, standards, and procedures for
their certification. The primary goal is to gain recognition across all member states.
It includes cybersecurity requirements, the type of evaluation (e.g., self-assessment
or third-party evaluation) and three levels of assurance (basic, substantial, and high)
signifying the degree of protection offered by those products and services.

While the EU Common Criteria scheme is currently the most advanced, several others
are in development, each tailored to specific product and service categories.

ENISA: THE EUROPEAN UNION AGENCY FOR CYBERSECURITY

The Cybersecurity Act has also rebranded ENISA as the “European Union Agency for
Cybersecurity”. It now holds a permanent mandate and takes up new responsibilities,
including policy development and implementation. Furthermore, it acts as a hub for
bundling and sharing knowledge and information, consolidating cybersecurity
information from EU institutions and bodies. ENISA has a major role in the
cooperation among Computer Security Incident Response Teams (CSIRTs) at EU level
and assists Member States in managing cybersecurity incidents.

NIS Directive

The Network and Information Security Directive (NIS Directive, 2016) is an
important initiative within the European Union, striving to achieve a high common
level of security of network and information systems across the EU. Belgium
implemented the NIS Directive in 2019. This directive comprises several key aspects
to bolster the EU’s cybersecurity landscape:

1. Improved Cybersecurity Capabilities. Under the NIS Directive, member states
must adopt a National Cybersecurity Strategy on the security of NIS. Moreover,
they must appoint a single point of contact for cross-border cooperation, set up a
National Competent Authority for the application of the directive, and establish
CSIRTs that provide incident support 24/7. While there are sector-specific CSIRTs,
the Centre for Cybersecurity Belgium (CCB) fulfils these roles for Belgium.
2. Increased EU-level Cooperation. To combat cybersecurity threats and manage
incidents, the directive emphasizes the importance of EU-level cooperation. A
cooperation group supports and facilitates cooperation, exchanges information
and best practices among member states, and establishes a network of national
CSIRTs. This network helps develop trust and confidence among Member States
and allows effective operational cooperation.

3. Risk Management and Incident Reporting. One of the NIS Directive's central
pillars is risk management and incident reporting to safeguard the continuity
and public security of critical social or economic services. It applies to Operators
of Essential Services (OES) and Digital Service Providers (DSP), even to non-EU
service providers active in the EU. They must:
• take technical and organizational measures to avoid incidents or limit their
impact;
• develop a security policy (cf. ISO/IEC 27001);
• report incidents to the CCB as the national CSIRT of Belgium;
• perform an annual internal audit and an external audit every three years (at
their own expense);
• designate a point of contact for competent authorities.

When such service providers fail to comply with these obligations, they can be
sanctioned with administrative and criminal fines or even a prison sentence of up to
two years.

The NIS Directive, adopted in 2016, had several inefficiencies which rendered certain
actions ineffective. For instance, national implementations of different Member States
varied too much, and monitoring and enforcement were often ineffective.

In 2020, the European Commission presented a new Cybersecurity Strategy, including
a revised NIS Directive (NIS2). It has increased the scope of the organizations that
need to comply, required stricter risk management measures, and called for better
cooperation.

While the NIS Directive and its revision NIS2 are not specifically targeted at
manufacturers, they may have a substantial impact, as manufacturers often provide
products and services to organizations falling within the scope of OES and DSP.
RED Delegated Act (RED-DA)

The Radio Equipment Directive (RED), originally enacted in 2014, plays a crucial role
in ensuring the safety, efficiency, and compliance of radio equipment placed on the
European Union market.

It defines specific requirements related to safety and health, electromagnetic
compatibility (EMC), and efficient use of the radio spectrum. To demonstrate
compliance, harmonized standards are established for which certification holds across
the EU.

For certain critical aspects, such as interoperability, cybersecurity, access to emergency
services, and the combination of radio equipment and software, the directive only
provides a basis for further regulation, requiring additional regulatory measures in the
form of delegated acts.

To fortify security in radio equipment, a RED Delegated Act regarding cybersecurity
was finally adopted in 2022. Hence, security conformity for radio equipment will
become part of the CE marking process.

CE Label

CE means that the manufacturer or importer affirms the goods' conformity with
European health, safety, and environmental protection standards. It indicates
that the product may be traded freely in any part of the European Economic
Area, regardless of its country of origin.

The delegated act focuses on three essential requirements to improve the security and
privacy of users, incorporated as articles 3.(d), 3.(e) and 3.(f):

3.(d). Radio equipment does not harm the network or its functioning, nor misuses
network resources, thereby causing an unacceptable degradation of service.

3.(e). Radio equipment incorporates safeguards to ensure that the personal data
and privacy of the user and of the subscriber are protected.

3.(f). Radio equipment supports certain features ensuring protection from fraud.

Compliance with these requirements by the affected manufacturers and their products
will become mandatory in August 2025. Failing to comply will prevent the
manufacturer from placing its products on the EU market.
What will happen with old devices?

The Delegated Act will apply to all devices placed on the market once it becomes
applicable. Old devices, which have already been placed on the EU market, can
continue to be used without the need for specific adaptations until the end of
their life cycle. (EU Press Corner)

REQUIREMENTS
In August 2022, the European Commission issued a request for three harmonized
standards to the European standards organization CEN/CENELEC, to be published
by June 2024.

While definite technical specifications for radio equipment in scope are not yet
available, this request provides insights into the topics that will be covered. The topics
include measures for network monitoring, the mitigation of denial-of-service attacks,
authentication and access control, the absence of known vulnerabilities, support for
secure updates, and controlling the attack surface. Furthermore, effective data
security and privacy measures, such as the erasure of personal data when
decommissioning a device, are required.
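As an illustration of the secure-update topic above, the sketch below shows the core of update verification on the device side: an update image is only accepted if its authentication tag checks out. This is a minimal, assumption-laden Python sketch, not part of any standard: the key, image contents and function names are invented, and an HMAC tag with a pre-provisioned symmetric key stands in for the asymmetric signature (e.g., Ed25519 with only a public key on the device) that real products typically use.

```python
import hashlib
import hmac

# Hypothetical key provisioned at manufacture time (illustrative only).
DEVICE_KEY = b"provisioned-at-manufacture"

def sign_update(image: bytes, key: bytes = DEVICE_KEY) -> bytes:
    """Vendor side: compute an authentication tag over the update image."""
    return hmac.new(key, image, hashlib.sha256).digest()

def verify_and_stage(image: bytes, tag: bytes, key: bytes = DEVICE_KEY) -> bool:
    """Device side: accept the update only if the tag verifies."""
    expected = hmac.new(key, image, hashlib.sha256).digest()
    # Constant-time comparison avoids leaking tag bytes via timing.
    return hmac.compare_digest(expected, tag)

image = b"firmware-v2.1.0"
tag = sign_update(image)
assert verify_and_stage(image, tag)            # authentic update accepted
assert not verify_and_stage(b"tampered", tag)  # modified image rejected
```

The same structure carries over to asymmetric schemes: only the tag computation and verification primitives change, while the "verify before staging" control flow stays the same.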

COMPLIANCE
To become compliant with the RED Delegated Act on cybersecurity, as required by
August 2025, manufacturers have three options:
1. When harmonized standards are available (due June 2024 from CEN/CENELEC),
manufacturers can perform a self-assessment based on these standards.

The other two options require the involvement of a Notified Body (NB):
2. A notified body performs an EU-type examination, and the manufacturer must
guarantee and declare internal production control.
3. A notified body assesses the quality system of the manufacturer, who must
operate an approved quality system for design, manufacturing, inspection, and
testing.
Cyber Resilience Act (CRA)

The upcoming Cyber Resilience Act encompasses the security of a broad range of
products performing digital operations. In contrast to the RED Delegated Act, it is
a horizontal act, covering both radio and non-radio equipment across various
industries. It aligns with societal developments by shifting from cybersecurity to
cyber resilience.

Annex 1 of the CRA plays a vital role in this framework, as it defines essential
cybersecurity requirements. These requirements address both the product and the
process, with a large emphasis on vulnerability handling.
When the Cyber Resilience Act becomes effective, it should replace the specifications
of the RED Delegated Act. Table 1 compares the requirements of the RED-DA and the
CRA at a high level. Since no further information is available yet, the comparison is
based on the request for harmonized standards in the context of the RED-DA and on
Annex 1 of the CRA.

In summary, the CRA further bolsters the security of devices by requiring secure
default configurations and measures for integrity protection. Furthermore, in contrast
to the RED Delegated Act, we can see that the CRA emphasizes the importance of
the process aspect of cybersecurity. It puts a stronger emphasis on security-by-design
principles and risk assessment ensuring that security considerations are integrated
throughout the entire development lifecycle of digital products.

Additionally, the CRA introduces a comprehensive set of new requirements
specifically related to vulnerability handling. These include the documentation and
remediation of vulnerabilities, security testing, public disclosure practices, the
implementation of a coordinated vulnerability disclosure policy, and establishing a
robust reporting mechanism.
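To make the vulnerability-handling idea concrete, the following Python sketch cross-checks an SBOM-style component inventory against a list of published advisories, yielding the components that need documentation and remediation. All component names, versions and advisory identifiers are invented for illustration; a real process would query maintained feeds (e.g., CVE databases) and use proper version-range matching.

```python
# Hypothetical SBOM-style component inventory for one product.
sbom = [
    {"name": "libtls", "version": "1.2.0"},
    {"name": "httpd",  "version": "2.4.1"},
]

# Hypothetical published advisories with affected versions.
advisories = [
    {"id": "ADV-0001", "name": "libtls", "affected": {"1.1.0", "1.2.0"}},
    {"id": "ADV-0002", "name": "kernel", "affected": {"5.10"}},
]

def known_vulnerabilities(sbom, advisories):
    """Return (component, advisory id) pairs that need remediation."""
    findings = []
    for comp in sbom:
        for adv in advisories:
            if comp["name"] == adv["name"] and comp["version"] in adv["affected"]:
                findings.append((comp["name"], adv["id"]))
    return findings

print(known_vulnerabilities(sbom, advisories))  # [('libtls', 'ADV-0001')]
```

The point of the exercise: without a machine-readable inventory of what is inside the product, the "no known exploitable vulnerabilities" obligation cannot be checked at all, which is why the SBOM appears among the CRA requirements.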

From RED-DA to CRA

• Dealing with vulnerabilities becomes more important.
  Rationale: from cybersecurity to cyber resilience.
• Integrity protection and secure defaults introduced.
  Rationale: increasing security requirements.
• Risk assessment and security-by-design embraced in the CRA.
  Rationale: the whole development cycle becomes important (shift left).
• No longer a focus on intervenience/transparency in the CRA.
  Rationale: these refer to sensitive/personal data (the CRA aims to be horizontal);
  the GDPR complements the CRA.

While the RED-DA embraces privacy-related provisions, the Cyber Resilience Act no
longer mentions these measures, presumably as they are already handled by the GDPR.
The CRA clearly raises the bar by means of a wider range of measures, striving for a
stronger cybersecurity posture in line with the evolving challenges posed by the digital
age.
Table 1: Security requirements in the RED-DA and the CRA

Category               | Requirement                          | RED-DA | CRA
-----------------------|--------------------------------------|--------|----
Product Security       | Monitoring                           |   ✓    |  ✓
                       | Mitigation of DoS                    |   ✓    |  ✓
                       | Authentication and Access Control    |   ✓    |  ✓
                       | Secure Updates                       |   ✓    |  ✓
                       | Data Protection                      |   ✓    |  ✓
                       | No Known Vulnerabilities             |   ✓    |  ✓
                       | Minimize the Attack Surface          |   ✓    |  ✓
                       | Secure Defaults                      |        |  ✓
                       | Integrity Protection                 |        |  ✓
Product Privacy        | Intervenience                        |   ✓    |
                       | Transparency                         |   ✓    |
Process Security       | Risk Assessment                      |   ✓    |  ✓
                       | Security-By-Design                   |        |  ✓
Vulnerability Handling | Document & Remediate Vulnerabilities |        |  ✓
                       | Security Testing                     |        |  ✓
                       | Public Disclosure                    |        |  ✓
                       | Coordinated Vulnerability Disclosure |        |  ✓
                       | Vulnerability Reporting              |        |  ✓
                       | Properly Disseminate Updates         |        |  ✓
                       | SBOM                                 |        |  ✓

Summary
Governments worldwide, with the European Union leading, have been actively
introducing regulations aimed at enhancing the security of digital systems. A process
that initially began with voluntary guidelines has now transitioned into mandatory
requirements. Non-compliance with these regulations carries significant consequences,
including penalties, restrictions on product sales, and substantial loss of revenue for
businesses. Manufacturers are witnessing a noticeable uptick in cybersecurity
demands, both in technology and in process, as underscored by the shift from
cybersecurity to cyber resilience (e.g., the focus on vulnerability management and
security-by-design).

2.
Representative
Cybersecurity Standards

Guidelines versus Standards

This part takes a deeper dive into multiple standards in the domain of cybersecurity.
We especially focus on cybersecurity standards for connected digital systems and shine
our spotlight on the IEC 62443, ETSI EN 303 645, NIST and Common Criteria
standards. The major focus of those standards is embedded systems that are connected
to the Internet, rather than traditional IT systems; other standards apply to such
systems and services. Before the deep dive into those standards, we summarize major
lessons learnt from applying various standards to multiple case studies in the domain
of industrial control systems.

Guidelines versus Standards. First, it is important to point out that there is a
difference between guidelines and best practices on the one hand and standards on the
other hand. Although both can support the development, integration and maintenance
of qualitative digital systems, the former are typically postulated by working groups
and applied without strong commitments. Examples are the ENISA and OWASP
guidelines and best practices. Standards typically undergo a stronger review process
and are actively supported by a broad set of organizations (which can be delegated by
governments worldwide). However, it is important to mention that many standards
rely on guidelines and best practices, and often synthesize the valuable information
included in guidelines. Hence, guidelines often precede standards. Consequently,
standards are more powerful than guidelines and therefore often applied to
demonstrate compliance with upcoming regulations. For instance, harmonized
standards are assigned to delegated acts. Compliance with standards (and regulation)
is typically demonstrated via a certification process performed by the organization
itself or a third party, and can result in a certification label.

Scope of standards. Second, the scope of standards can vary. Some standards mainly
focus on the product that is sold on the market. The ETSI standard for consumer IoT
devices is a prototypical example. Note that the RED Delegated Act mainly targets
the end product. Other standards are broader and embrace both the end product
and the development process, or even the overall lifecycle. Both aspects will be of
major importance in the upcoming Cyber Resilience Act. The latter means that
requirements are incorporated that relate to the product, to processes spanning the
whole design and development process, and to people interacting with the product
during various stages of its lifecycle (from developers over integrators to operators).
The IEC 62443 standard is an example that covers a broad scope, including
requirements with respect to product development, integration, and operation.

Horizontal versus vertical standards. Third, there are other criteria on which
standards may vary. On the one hand, there are standards that focus on one or a
limited number of sectors. On the other hand, some standards are sector-agnostic.
There is a tendency to favor sector-agnostic standards, as this allows manufacturers to
build connected components and devices that can be applied across multiple sectors.
Similarly, the level of depth of standards may vary. For example, the ETSI standard is
by far less detailed than the IEC standard. Finally, some standards are freely available;
for others, a substantial fee is required to get access.

Standards are dynamic. Fourth, standards evolve over time. This is logical, as
cyberattacks and insights evolve over time. For instance, in earlier versions of the IEC
62443-4-2 standard, multi-factor authentication to the device was only required to
obtain security level 4 -- the highest security level -- whereas it must currently be built
in to achieve security level 3. Similarly, until some years ago, it was good practice to
change passwords at regular intervals, whereas this is no longer recommended, as
regular password resets have more recently been shown to be ineffective and to make
devices less secure.
Four standards are discussed in more detail in the remainder of this part. All of them
target the security of connected digital systems. The IEC 62443 standard is frequently
used in the context of industrial control systems, whereas the ETSI EN 303 645
standard is a feasible alternative for consumer IoT devices. NISTIR 8259A and NISTIR
8425 are documents developed by the National Institute of Standards and Technology
(NIST) and hence mainly applied in the United States; the former defines a core
baseline of device cybersecurity capabilities, whereas the latter profiles this baseline
for consumer IoT products. Finally, the Common Criteria is a framework that makes
it possible to specify protection profiles for certain device types, defining the
requirements that a device of that type should meet.

IEC 62443: Overview

The IEC 62443 standard is frequently applied to Industrial Automation and Control
Systems (IACS) and consists of 13 documents classified into four parts. Each document
can be purchased separately and costs a couple of hundred euros.

General. The first part consists of four documents covering general information that
might be useful to fully grasp the other parts. More specifically, terminology and
general concepts, and a master glossary, are included in documents 1-1 and 1-2
respectively. System security compliance metrics are proposed in document 1-3.
Finally, document 1-4 presents a typical IACS security lifecycle and use-cases.

Policy and procedure. The second part consists of requirements for IACS operators.
These stakeholders are the organizations that run an IACS system and are hence
accountable for its correct functioning. Operators are split into more fine-grained roles
in documents 2-1, 2-2 and 2-4. The first deals with requirements for entities that
interact directly with the IACS system; the second with requirements for the
underlying management systems and the people interacting with them; and the third
with service providers exposing functions to third parties (either consumers or other
organizations). Document 2-3 provides guidelines for patch management. Note that,
besides operators, multiple other entities can be involved in patch management (i.e.,
the integrator and/or component manufacturer).
System. The third part consists of three documents that provide requirements for
integrators. Integrators are the entities that design, deploy, commission, and/or
maintain an IACS ecosystem. The system part is discussed in more detail below.

Component. The final part consists of two documents that contain requirements for
entities that develop components that can later be integrated into IACS systems. The
two documents discuss process-related requirements regarding the development cycle
(4-1) and cybersecurity technical requirements for components (4-2). Components
are split into four categories, namely embedded devices, network components, host
components and software. The component part is discussed in more detail below.

Table 2: Overview of the IEC 62443 Standard

General:
  IEC 62443-1-1  Terminology, concepts and models
  IEC 62443-1-2  Master glossary of terms and abbreviations
  IEC 62443-1-3  System security compliance metrics
  IEC 62443-1-4  IACS security lifecycle and use-cases

Policy and Procedure:
  IEC 62443-2-1  Establishing an IACS security program
  IEC 62443-2-2  Implementation guidance for an IACS security management system
  IEC 62443-2-3  Patch management in the IACS environment
  IEC 62443-2-4  Security program requirements for IACS service providers

System:
  IEC 62443-3-1  Security technologies for IACS
  IEC 62443-3-2  Security risk assessment and system design
  IEC 62443-3-3  System security requirements and security levels

Component:
  IEC 62443-4-1  Product development requirements
  IEC 62443-4-2  Technical security requirements for IACS components
IEC 62443-3: System Concerns

IEC 62443-3-1: Security technologies. This document introduces multiple
technologies that are frequently used or deployed in IACS environments. Each
technology is assessed based on multiple criteria. The security vulnerabilities that are
addressed by the technology, together with typical deployments, are documented.
Likewise, known issues and weaknesses, together with an assessment of the use of the
technology in IACS environments, are discussed. Finally, the document incorporates
recommendations and guidance for using each technology, and points to information
resources and reference material. Twenty-five technologies are divided across six
categories:
• Authentication and authorization technologies. This category contains the largest
subset of technologies: nine are assessed. Most of them are authentication
technologies (password and location based; biometric, smart card and physical
token authentication; challenge/response protocols and device-to-device
authentication). Furthermore, password management technologies and role-based
authorization are discussed.
• Filtering/blocking/access control technologies. Three technologies belong to this
category. The infrastructural elements are network and host-based firewalls, and
virtual networks.
• Encryption technologies and data validation. This category assesses symmetric
encryption technologies, asymmetric encryption and PKI, and Virtual Private
Networks (VPN).
• Management, audit, measurement, monitoring and detection tools. This category
assesses all kinds of infrastructure to prevent and detect viruses and manage
security services. Six technologies are discussed: log auditing utilities, virus
detection systems, intrusion detection technologies, forensic tools, automated
software tools and host configuration management tools.
• IACS computer software. This category focuses on the security features of
real-time and embedded operating systems, web technologies and server and
workstation operating systems.
• Physical security controls. Two types of physical security controls are discussed and
assessed, namely physical protection mechanisms and personnel security tactics.
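As a small illustration of one of the technologies listed above, the following Python sketch shows an HMAC-based challenge/response exchange, one way to realize device-to-device authentication between parties that share a pre-provisioned secret. The key and nonce sizes are arbitrary choices for the sketch, not values prescribed by the standard.

```python
import hashlib
import hmac
import os

# Illustrative pre-provisioned secret shared by the two devices.
SHARED_SECRET = os.urandom(32)

def issue_challenge() -> bytes:
    """Verifier: send a fresh random nonce so responses cannot be replayed."""
    return os.urandom(16)

def respond(challenge: bytes, secret: bytes) -> bytes:
    """Prover: prove knowledge of the secret without transmitting it."""
    return hmac.new(secret, challenge, hashlib.sha256).digest()

def check(challenge: bytes, response: bytes, secret: bytes) -> bool:
    """Verifier: recompute the expected response and compare in constant time."""
    expected = hmac.new(secret, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

challenge = issue_challenge()
assert check(challenge, respond(challenge, SHARED_SECRET), SHARED_SECRET)
assert not check(challenge, respond(challenge, os.urandom(32)), SHARED_SECRET)
```

Because each challenge is a fresh nonce, a recorded response is useless for a later session, which is the property that distinguishes challenge/response schemes from static password transmission.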
IEC 62443-3-2: Security risk assessment for systems. This part proposes a
structured approach to perform a risk assessment for an Industrial Automation and
Control System (IACS) under design. It consists of six steps and ultimately leads to
documentation of security requirements imposed on each part within the System
under Consideration (SuC). As such, it offers a structured aid to elicit the security
challenges for an IACS system. The steps are listed below:
• Defining the System under Consideration (SuC). This step determines the
boundaries of the system on which the security risk assessment is performed.
For instance, it must be clear to what extent Bring Your Own Device (BYOD)
devices or cloud services that are relied upon are part of the risk assessment.
• Performing an initial risk assessment. The goal of this step is to gain an initial
understanding of the worst-case risk the SuC presents to the organization in case
it should be compromised. It is typically evaluated in terms of negative impact to
health, safety, environment, business interruption, production loss, financial, legal,
reputation, etc. This step can rely on a matrix in which the severity and likelihood
of undesired scenarios are weighted. Both criteria result in an estimated risk for
that scenario.
• Partitioning the System under Consideration
into zones and conduits. During this step,
assets are grouped based on the results of
the initial risk assessment and other asset
parameters. The intention of this step is to
identify and group assets that share com-
mon security requirements and permit the
identification of common security measures
to mitigate the risks. Models such as the
Purdue reference model can be used as a
basis for this division. Separated zones can be
defined for the enterprise network (including
the Enterprise Resource Planning systems),
Demilitarized Zone (DMZ) (including servers
that are exposed to the outside world via the
Internet), operations and control (including
Manufacturing Execution Systems), and
process related tasks (like Remote Terminal
Units, sensors and actuators). Note that a zone defines a grouped set of assets, and
that a conduit defines a coupling between two zones.
• Assessing risk for each zone and conduit. A fine-grained risk analysis is performed
for each zone. It consists of identifying the threats and vulnerabilities for each
zone, evaluating the consequences and impact of malfunctioning, and calculating
the cybersecurity risk based on the aforementioned criteria. This is required to
determine the targeted security level for each zone and conduit, together with
proposed countermeasures.
• Documenting the security requirements. The security requirements for each zone
and conduit are finally documented and provide guidance during system design.
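The initial risk assessment step above typically relies on a severity/likelihood matrix. The sketch below illustrates that idea; the scale values and thresholds are illustrative assumptions, not values prescribed by IEC 62443-3-2.

```python
# Illustrative severity x likelihood matrix for an initial IACS risk
# assessment. The scales and thresholds are example assumptions only.

SEVERITY = {"negligible": 1, "marginal": 2, "critical": 3, "catastrophic": 4}
LIKELIHOOD = {"rare": 1, "possible": 2, "likely": 3, "frequent": 4}

def risk_score(severity: str, likelihood: str) -> int:
    """Weight an undesired scenario by its severity and likelihood."""
    return SEVERITY[severity] * LIKELIHOOD[likelihood]

def risk_class(score: int) -> str:
    """Map a score onto a coarse risk class used to prioritize scenarios."""
    if score >= 9:
        return "high"
    if score >= 4:
        return "medium"
    return "low"

# Example: production loss that is likely and critical in impact.
score = risk_score("critical", "likely")  # 3 * 3 = 9
print(risk_class(score))  # high
```

The matrix form makes it easy to rank undesired scenarios before zones and conduits are defined.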
IEC 62443-3-3: Security requirements and security levels. This document discusses
the semantics of multiple security levels and defines the security requirements that
must be fulfilled to achieve a security level. Note that security requirements are
split into seven different categories or classes. Hence, the security level assigned
to a zone or conduit consists of a vector of seven values that reflect the protection
level with respect to the cybersecurity requirements of that class. Those vectors
(i.e., the target security levels) make it possible to compare the security of zones
within an organization or across different organizations, and can guide the selection
of countermeasures and IACS devices that will be rolled out in the IACS system at a
later stage.
Five security levels are defined:
• Security level 0 (SL 0) defines that the zone or conduit is not protected at all
against incidents.
• Security level 1 (SL 1) determines that the zone or conduit should be protected
against unintended accidents. An example of an unintended cybersecurity accident
is a password that is sent in clear text over a conduit between two zones allowing a
network engineer to view it while troubleshooting the system.
• Security levels 2/3/4 (SL 2/3/4) define that the zone or conduit is protected against
intentional attacks with simple means (SL 2), advanced means (SL 3) and very
advanced means (SL 4) respectively. An attacker relying on a virus or exploit obtained
from an illegal website uses simple means, whereas an attacker relying on supercomputers
or computer clusters uses very advanced means.

Note that an action plan must be defined if the actual security level of a zone or
conduit is lower than the targeted security level. This means that additional measures
must be taken. Examples are the modification of the design of a component, the
selection of another component or the addition of an additional component. The
system should be reevaluated after system modifications are made. Note further that
security measures may not adversely affect essential functions.

IEC 62443-4: Component Concerns

IEC 62443-4-1: Secure product development lifecycle requirements. This
document defines requirements for increasing the security level of the Software
Development Lifecycle (SDLC) for product suppliers. These are entities that build
components that will later typically be integrated in an IACS setting. The require-
ments are split into eight categories and provide concrete guidelines to improve the
design, implementation, testing and validation, maintenance, and overall security
management of IACS components. It is recognized that improving the quality of the
SDLC is an evolving process that requires continuous efforts.

Four maturity levels are defined, namely Initial, Managed, Defined/Practiced, and
Improving. The lowest level means that products are developed in an ad hoc and
undocumented manner, which means that consistency across projects is lacking;
the Managed level requires, among other things, that evidence can be shown that
personnel have expertise and are trained; the highest level means that continuous
improvement in these areas can be demonstrated. The documentation further points
to metrics to assess the process. For instance, one metric assigns an assessment
score to the pool of in-house engineers based on the number of completed
assessments and the total number of software engineers. This metric can be used
to assess the security management level. Another metric counts the number of
cleaned compiler flags and the total number of software components and assesses the
security testing and validation quality based on a function of those parameters.
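The first metric mentioned above can be read as a simple coverage ratio over the engineering pool. The function below is an illustrative interpretation of that idea, not the exact formula from the standard.

```python
def training_coverage(completed_assessments: int, total_engineers: int) -> float:
    """Illustrative maturity metric: the share of the in-house engineering
    pool that has completed a security training assessment."""
    if total_engineers == 0:
        raise ValueError("no engineers in the pool")
    return completed_assessments / total_engineers

# Example: 18 of 24 engineers have completed their assessment.
print(training_coverage(18, 24))  # 0.75
```

Tracking such ratios over time is one way to demonstrate the continuous improvement the maturity levels require.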
IEC 62443-4-2: Technical security requirements for IACS components. This doc-
ument describes a list of 57 Component Requirements (CRs) and 30 Requirement
Enhancements (REs) that can be imposed to IACS components. The component
requirements can be applied to all types of IACS components and are grouped in
seven classes which are called Foundational Requirements (FRs). The requirements
enhancements only apply to certain types of IACS components. These components
are split into four categories, namely software applications, embedded devices (like
PLCs and intelligent electronic devices), host devices (like operator workstations and
data historians) and network devices (like switches and routers).

Like a zone or conduit, the security level (SL) of an IACS component is a vector that
consists of seven values (each ranging from 0 to 4). A predefined set of component
requirements within a Foundational Requirement class must be fulfilled to achieve
a certain security level SL for that Foundational Requirement. The actual security
level (SL-A) defines the current security level of a component whereas the targeted
security level (SL-T) defines the desired security level for that component. If
the targeted security level (SL-T) is higher than the actual security level (SL-A),
additional measures must be taken.
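The comparison between targeted and actual security levels can be sketched as a per-class vector comparison. The FR labels below follow the standard; the code itself is an illustrative assumption, not part of IEC 62443.

```python
# Illustrative SL-T vs SL-A comparison per Foundational Requirement (FR1-FR7).
# Additional measures are needed for every FR where the targeted level
# exceeds the actual level.

FRS = ["FR1", "FR2", "FR3", "FR4", "FR5", "FR6", "FR7"]

def sl_gaps(sl_t: list[int], sl_a: list[int]) -> dict[str, int]:
    """Return the FRs (and gap size) where the target is not yet reached."""
    assert len(sl_t) == len(sl_a) == 7
    return {fr: t - a for fr, t, a in zip(FRS, sl_t, sl_a) if t > a}

# Example: a component that misses its target for FR3 and FR5.
target = [2, 2, 3, 2, 3, 1, 2]
actual = [2, 2, 2, 2, 1, 1, 2]
print(sl_gaps(target, actual))  # {'FR3': 1, 'FR5': 2}
```

An empty result means the component (or zone) already meets its target for every Foundational Requirement.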

Achieving a certain security level protects the IACS component against cybersecurity
attacks by malicious individuals or organizations with simple, moderate, or high means
(depending on the specific level that is reached). Security levels -- i.e., the vectors
returning values for foundational requirements -- are important from the perspective
of system integrators as well as product suppliers. The former can impose a minimal
security level for components of a particular type that will be integrated into the
IACS system. This can depend on the security risk assessment that is performed in an
earlier stage (by means of the IEC procedures defined in part 3 of the standard). The
latter can use the vector and underlying documentation to demonstrate that a certain
security level is reached.

Seven classes of Foundational Requirements exist. The straightforward ones are
System Integrity (FR3), Data Confidentiality (FR4) and Resource Availability
(FR7). These refer to the CIA (Confidentiality – Integrity – Availability) triad.
Furthermore, Identification and Authentication (FR1) and Use Control (FR2) are
required to support controlled access to the component under study. The remaining
foundational requirements are Restricted Data Flow (FR5) and Timely Response to
events (FR6). The former (i.e., FR5) is essential to control the impact of potential
data breaches; the latter (i.e., FR6) is required to support the correct functioning
even in the presence of cyber-attacks.
Each Foundational Requirement (FR) consists of a set of Component Requirements
(CRs). For instance, 14 CRs are assigned to identification and authentication control
(i.e., Foundational Requirement 1). Human User Identification and Authentication
(CR1.1) and Strength of password-based authentication (CR1.7) are two examples.
The former (CR1.1) defines that all human users on all interfaces capable of human
user access must be identified and authenticated. The latter (CR1.7) defines that for
components that utilize password-based authentication, the capability to enforce
configurable password strength based on minimum length or variety of character
types should be provided. For each Component Requirement (CR), the following
information is provided in the standard:
• A description of the requirement.
• Rationale and guidance to fulfill the requirement, possibly by pointing to other
reference material or well-recognized best practices.
• A set of requirement enhancements.
Achieving some of them can increase
the achieved security level for that com-
ponent requirement. For instance, (a)
unique identification/authentication and
(b) multifactor authentication are two
requirement enhancements belonging
to the component requirement Human
User Identification (CR1.1). The former is
required to achieve Security Level 2; the
latter to achieve Security Level 3.
• The achieved security level that is reached for that requirement if the requirement
and certain requirement enhancements are met.
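As an illustration of what the CR1.7 capability amounts to, a component could enforce configurable password strength as sketched below. The default thresholds are configuration assumptions, not values mandated by the standard.

```python
import string

def password_meets_policy(password: str, min_length: int = 12,
                          min_char_classes: int = 3) -> bool:
    """Configurable password-strength check: minimum length plus the number
    of character classes (lower, upper, digit, punctuation) that are used."""
    if len(password) < min_length:
        return False
    classes = [
        any(c in string.ascii_lowercase for c in password),
        any(c in string.ascii_uppercase for c in password),
        any(c in string.digits for c in password),
        any(c in string.punctuation for c in password),
    ]
    return sum(classes) >= min_char_classes

print(password_meets_policy("correct-Horse-7"))  # True
print(password_meets_policy("short"))            # False
```

Keeping the thresholds configurable, rather than hard-coded, is exactly what distinguishes this capability from a fixed password rule.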

Note that the IEC 62443 standard is still evolving. The target audience is quite broad.
It provides meaningful information for operators, integrators and manufacturers
of Industrial Automation and Control Systems (IACS). Both process and product
related requirements are incorporated at a high level of detail (including an extensive
overview of relevant technologies, meaningful pointers, good practices and potential
constraints).
ETSI EN 303 645

The ETSI standard defines baseline requirements for the cybersecurity of consumer
Internet-of-Things devices. The target audience of the ETSI standard are developers
and manufacturers of such devices. Examples are thermostats, baby phones, door locks,
speakers, cameras, connected home automation, and fridges. In contrast to the IEC
standard, the ETSI baseline offers little or no protection against sophisticated attacks
or attacks involving physical access. The ETSI provisions follow and synthesize the recommendations
of many working groups like the ENISA Baseline Security Recommendations, the IoT
Security Foundation Compliance Framework, and the OWASP Internet-of-Things
recommendations. Over 50 security provisions are classified into 13 categories in the
ETSI standard. The categories are listed below:

1. Do not use universal passwords
2. Implement a means to manage reports of vulnerabilities
3. Keep software updated
4. Store sensitive security parameters securely
5. Communicate securely
6. Minimize exposed attack surfaces
7. Ensure software integrity
8. Ensure that personal data is secure
9. Make systems resilient to outages
10. Examine system telemetry data
11. Make it easy for consumers to delete personal data
12. Make installation and maintenance of devices easy
13. Validate input data
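As an illustrative sketch of categories 3 and 7 (keep software updated, ensure software integrity), a device can refuse to install a firmware image whose digest does not match a value obtained over a trusted channel. The digest-only scheme below is a simplifying assumption; a real device would typically verify a cryptographic signature against a vendor public key.

```python
import hashlib

def firmware_is_authentic(image: bytes, expected_sha256_hex: str) -> bool:
    """Illustrative integrity check: only accept an update whose SHA-256
    digest matches the value distributed over a trusted channel."""
    return hashlib.sha256(image).hexdigest() == expected_sha256_hex

# Example: the vendor publishes the digest of the genuine image.
image = b"firmware-v1.2.3"
digest = hashlib.sha256(image).hexdigest()
print(firmware_is_authentic(image, digest))        # True
print(firmware_is_authentic(b"tampered", digest))  # False
```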

The level of detail is far lower than that of the foundational requirements in the IEC
standard, and it provides less guidance on realizing each provision compared to IEC
62443. However, the ETSI standard is typically applied in settings with lower risk and
impact. Besides the security provisions, a list of privacy recommendations is included
for consumer devices that deal with personal data.

The standard provides a template that can be used to document a device under study.
An entry is included for each provision. IoT developers can mark the provision status
and details, thereby explaining how the provision is realized or why it is not fulfilled.
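Such a template entry can be sketched as a simple record per provision. The field names below are an illustrative assumption based on the template's purpose, not the exact fields of the ETSI document.

```python
from dataclasses import dataclass

@dataclass
class ProvisionEntry:
    """One row of an ETSI EN 303 645 style implementation statement."""
    provision: str  # e.g. "5.1-1" (category 5.1: no universal default passwords)
    status: str     # "implemented" | "not implemented" | "not applicable"
    detail: str     # how the provision is realized, or why it is not fulfilled

entry = ProvisionEntry(
    provision="5.1-1",
    status="implemented",
    detail="Each device ships with a unique, per-device random password.",
)
print(entry.status)  # implemented
```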
Common Criteria

The Common Criteria standard -- also referred to as ISO/IEC 15408 -- is a framework in
which computer system users (i.e., buyers or integrators) can specify their Security
Functional and Assurance Requirements (SFRs and SARs respectively) in a Security
Target (ST). A Security Target describes the component (i.e., device or software
service) -- the Target of Evaluation -- that a user (or buyer) wants to integrate in its
environment.

At the same time, vendors (or sellers or manufacturers) can implement or make claims
about their products' security attributes, and testing laboratories can evaluate the
products to determine if they meet the claims.
As specifying the security and assurance requirements from scratch is hard for a
concrete device, they may be taken from a catalogue of Protection Profiles (PPs).
At the time of writing, around 690 different protection profiles exist. Examples are
protection profiles for water meters, routers, smartcards, rail components … They are
typically established by a set of stakeholders with substantial expertise in the domain
and are representative blueprints or templates that are used by both manufacturers
(i.e., vendors) and integrators (i.e., potential buyers). Integrators can require that at
least the security functional and assurance requirements in the blueprint are fulfilled;
similarly, manufacturers will rely on those templates as compliance enables them to
sell their products on a large scale.
NIST

Besides the international standards, US-specific standards also exist. The most
well-known with respect to connected digital systems are the NIST publications. They
define recommended cybersecurity capabilities for connected embedded devices and are
often presented as the US counterpart of the European ETSI standard. NIST IR 8425
and NIST SP 800-213 define profiles for consumer IoT
devices and governmental IoT devices respectively and consist of six provisions:

1. Provide a unique identity per device
2. Only authorized entities can change device configuration
3. Stored and transmitted data must be protected from unauthorized access and
modifications
4. Access to local and network interfaces, protocols and services must be restricted
5. Software and firmware update mechanisms must be provided
6. The cybersecurity state of devices must be reported to authorized parties

The US government announced a national label for consumer IoT cybersecurity. The
label rewards products that meet the NIST standard, and the US government keeps a
registry of products whose security has been tested and certified as compliant.
Compliance is currently optional but may provide competitive advantages.

EU member states typically rely on ETSI and IEC standards. Compliance is currently
optional, but from August 2025 demonstrating compliance with one of those standards
will potentially be required to obtain the CE label (cf. RED-DA).

Other countries like Japan support both EU and US standards.



3.
Co-engineering
Cybersecurity and Safety

Brief Introduction to Functional Safety and its Standards

Safety of systems deals with protecting humans, both operators and passers-by, and
the environment around those systems. Where that protection depends on the cor-
rect functioning of (programmable) electrical and/or electronic devices, this is called
Functional Safety.

Taking the example of a car, the operator expects the steering component to function
at all times and only turn when instructed, and not to unwantedly turn the car into
traffic or pedestrians. Regarding the environment, a car should not leak oil, produce
fumes of unburned fuel, or have any other negative impact. Therefore, the parts
that contribute to the functional safety
are heavily regulated. To comply with the
regulations, manufacturers often rely on
standards that describe feasible techniques
to produce safe systems. For automotive systems, this is the ISO 26262: Road Vehi-
cles – Functional Safety. This standard defines guidelines to reduce the risk of failure
in order to guarantee the safety-critical parts of a car execute their tasks correctly and
at the desired time. Often these systems are certified by an independent third party, a
so-called Notified Body (NB), to demonstrate their compliance with the regulations.

Given the importance of functional safety and the widespread use of safety-critical
devices in many domains, many of them have their own (set of) functional safety
standard(s). As shown in Figure 1, many domain-specific standards are derived from
the base norm IEC 61508: Functional safety of electrical/electronic/programmable
electronic safety-related systems. A domain-specific norm typically has the same
structure as the base norm, potentially complemented with domain-specific risk
analysis techniques, documentation needs and domain-dependent requirements.

Figure 1: Non-exhaustive overview of functional safety norms and their relation to
IEC 61508. The domain-specific norms IEC 61511 (process industry), IEC 62061
(machinery), EN 5012X (railway), IEC 60601 (medical) and ISO 26262 (automotive)
all refer to the base norm IEC 61508.

Overall, functional safety norms split failures into two categories, namely
(a) systematic failures and (b) random failures. Each has its own mitigation
strategies.

(a) Systematic failures are failures that can be deterministically linked to a cause.
Specifying the wrong type of sensor to use, production errors or programming
mistakes are examples. To mitigate systematic failures, the standards impose
strict processes within a safety lifecycle that starts at the conception of the
system and ends with its decommissioning. This encompasses strict requirements
with respect to procedures and documentation for every step inside the lifecycle.

(b) Random failures occur due to one or more aging processes. Think of a short-circuited
resistor or a burned-out light bulb. The standards combat these by relying
on fault tolerance techniques, such as the mandated use of a 1-out-of-2 architecture
or the use of error correcting codes when transmitting data. The combination
of fault tolerance techniques and strict processes minimizes the risk of failure.
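Both fault-tolerance ideas mentioned above can be sketched in a few lines. The 1-out-of-2 vote lets either redundant channel drive the system to its safe state, and a repetition code is one of the simplest error correcting codes for transmitted data; both functions below are illustrative assumptions, not normative designs.

```python
def one_out_of_two(a_trips: bool, b_trips: bool) -> bool:
    """1oo2 voting sketch: either redundant channel can demand the safe
    state, so one dangerously failed channel cannot block a needed trip."""
    return a_trips or b_trips

def correct_bit(received: tuple[int, int, int]) -> int:
    """Error-correction sketch: each bit is transmitted three times and
    decoded by majority vote (a simple repetition code)."""
    return 1 if sum(received) >= 2 else 0

print(one_out_of_two(False, True))  # True: one channel suffices to trip
print(correct_bit((1, 0, 1)))       # 1: the single flipped copy is corrected
```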
Co-engineering Cybersecurity and Safety

Many safety-critical systems have become part of increasingly interconnected
systems due to the push towards digitalization. Collaborative robots, medical devices
and autonomous (mobile) systems are examples of safety systems, which are typically
classified as operational technology (OT) systems due to their proximity to the final
product or function that is provided by an organization. This interconnectedness of IT
product or function that is provided by an organization. This interconnectedness of IT
and OT systems leads to an increase in the attack surface of safety systems, leaving
them exposed to the threat of cyber-attacks.

There is a tremendous increase in attacks on OT infrastructure as they are perceived
as low-hanging fruit by cyber-criminals. The potential safety consequences and harm
that can arise from such cyber risks to safety systems can be demonstrated by several
attacks and vulnerabilities. Some of them are highlighted below:
• Stuxnet, discovered in 2010, is the first worm
known to attack targeted programmable
logic controllers and other industrial control
systems (ICS).
• In 2014, researchers demonstrated remote compromise of a passenger vehicle
which allowed them to control safety systems, like braking and steering.
• The Cybersecurity and Infrastructure Security Agency (CISA) announced in 2015
security vulnerabilities in a hospital drug pump that could alter the drug dosage
with lethal consequences.
• Triton malware was discovered in a petrochemical plant in 2017, which aimed to
disable safety instrumented systems and led to loss of safety function.
• EKANS ransomware emerged in 2019, which in addition to encrypting files also
featured functionality targeted to kill industrial control system processes.

The developers of such safety systems do not only need to consider safety and cyber-
security concerns, but also examine the cross-domain influences and manage these
risks of harm to an acceptable level throughout the entire lifecycle of the system.
This process is called co-engineering safety and security. While the current versions
of standards across various domains, like industry, automotive, marine and medical,
address the interplay between safety and cybersecurity to varying degrees, the safety
legislation and regulation are already changing to include cyber risks, as indicated in
Part 1.
Therefore, having the right processes and tools in place is vital for continued economic
success and maintaining a competitive advantage within the EU, as well as abroad. In
addition to harm to human life and to society, negligence and non-compliance can
result in strong economic penalties, including the ban of sales and costly claims or
recalls. Generally, the implementation of effective cybersecurity measures and
co-engineering requires the modification of the safety-related systems. Crucial to this
modification is the close interaction between safety and security teams.

The following sections touch upon the various standards and guidance available for
specific domains, the challenges involved and conclude with practical notes on how to
co-engineer safety and security across the entire lifecycle. This chapter intends to be a
quick-read and therefore only summarizes the information from the existing literature
mentioned in the sources. The interested reader is encouraged to read these sources to
find more concrete recommendations and guidelines.
Security & Safety Standards and Guidelines

Table 3 lists multiple representative standards that deal with safety, security, and their
interaction across various domains. These standards adopt different approaches to safe-
ty and security assurance. Some of them deal with only one concern and make vague
pointers to the other concern, such as ISO 13849: Safety of machinery. Others, like
the IEC 61508 or ISO 21434: Road vehicles Cybersecurity engineering deal with only
one concern as well but define specific interactions with the other concern throughout
the document. Depending on their focus, these types of norms and guidance are clas-
sified as (i) security-informed safety, like IEC 61508 and ISO 26262, where system
safety is the primary focus, and (ii) safety-informed security, like IEC 62443 and
ISO 21434, where security of the system is the major consideration. The table also
incorporates a third category on co-engineering, where both safety and security are
addressed substantially, such as the IEC TR 63069: Industrial-process measurement,
control and automation - Framework for functional safety and security.

Table 3: Standards and guidance on safety, security, and co-engineering for various domains.

General:
• Security-informed safety: IEC 61508
• Safety-informed security: ISO 27000 series, ISO/IEC 15408, NIST SP 800-213A
• Co-engineering security and safety: IET Code of Practice: Cybersecurity and Safety

Industrial Control:
• Security-informed safety: IEC 61508, IEC 61511, ISO 13849
• Safety-informed security: IEC 62443, NIST SP 800-82
• Co-engineering security and safety: IEC TR 63069, ISA TR 84.00.09

Automotive:
• Security-informed safety: ISO 26262, ISO 21448
• Safety-informed security: ISO 21434
• Co-engineering security and safety: ISO TR 4808

Healthcare & Medical Devices:
• Security-informed safety: ISO 14971, IEC 60601 series, FDA Safety
• Safety-informed security: AAMI TIR57, FDA Security
• Co-engineering security and safety: IEC 80001 series, IEC 81001 series, MDCG 2019-16

Railway:
• Security-informed safety: CENELEC EN 50126, CENELEC EN 50128, CENELEC EN 50129
• Safety-informed security: CENELEC TS 50701
While combining both concerns into a single, joint assurance is possible, it is not recom-
mended due to the cultural differences between how safety and security teams operate.
For example, the system representations used for respective analysis are different and
attempts to unify safety and security analysis inhibit complete understanding of the
system. There is a risk that unification leads to certain safety and security risks going
unobserved. Therefore, co-engineering norms and guidelines recommend independent
safety and security activities with regular interaction between the safety and security
teams to exchange information.

The activities and processes prescribed for safety and security by the respective
domain-specific norms need to be followed. Resolution of conflicts should be based
on consensus formed by the stakeholders in both domains. While risk prevention and
mitigation are an important cornerstone of ensuring safety and security by design,
they are inadequate as a strategy on their own. Additional measures during operation and
maintenance, like fault-resilience, operation monitoring and incident response should
be an active part of the strategy. These strategies need to be verified and validated
to check that the safety and security goals are met. Due to the dynamic nature of
cyber threats, the risks and the adopted strategy need to be reviewed and revised as
necessary throughout the lifecycle of a system.

The following paragraphs are dedicated to providing an overview of the norms and
guidelines that deal with co-engineering of safety and security.
IET CODE OF PRACTICE (COP) CYBER SECURITY AND SAFETY
This is a guidance published by the Institution of Engineering and Technology. It
proposes 15 shared principles for safety and security that span across organizational
structure, governance, processes, competence, and risk management. The shared
principles are listed in Table 4 and are further elaborated to provide additional
requirements.

For example, principle 7 on the supply chain is further elaborated into 4 requirements,
which detail the controls that need to be in place within the organization to effectively
manage safety and security risks arising from the supply chain. It also includes informa-
tive sections that list the techniques and measures that can be used to apply the Code
of Practice (CoP), e.g., referring to various competency frameworks -- like NCSC
certifications for Cybersecurity and HSE Research Report 86 -- and risk analysis
techniques -- like STPA, FTA and SWIFT. Of special importance is Annex D, which
lists the shared principles and provides indicators of good and poor practices. However,
since the CoP is not prescriptive in nature, it does not provide a workflow for managing
safety and security interaction and leaves the implementation of the principles to the
organization.

IEC TR 63069 - FRAMEWORK FOR FUNCTIONAL SAFETY AND SECURITY OF INDUSTRIAL SYSTEMS
This technical report, published in 2019, provides a framework for the common ap-
plication of IEC 61508 and IEC 62443. The technical report has been drafted for
industrial process measurement, control, and automation, but is generic enough to
be used in other domains in the absence of sector specific norms for co-engineering.

The IEC TR 63069 aims to provide a security environment to ensure an efficiently
protected environment for the operation of the safety and other essential functions.
The security environment is the overall collection of countermeasures. Three high-level
guiding principles are provided, which need to be ensured for safety and security. The
norm places emphasis on the communication and interaction between the safety and
security domains throughout the lifecycle to ensure a suitable security environment
for the essential functions, including safety functions. There is no pre-defined priority
between safety and security that is prescribed by the norm to resolve identified con-
flicts, which need to be resolved through trade-off analysis and the consensus formed
by the stakeholders from both domains.

ISA TR 84.00.09 - CYBERSECURITY RELATED TO THE FUNCTIONAL SAFETY LIFECYCLE
Aimed towards industrial automation and control systems, ISA TR 84.00.09 provides
guidance on the integration of cybersecurity into the safety lifecycle. The guidance
states that cybersecurity of instrumented systems must be addressed at every stage of
the safety lifecycle (from conception to decommissioning).
Table 4: Shared principles for safety and security, as seen in IET CoP Cybersecurity and Safety.

Management and Governance
1. Accountability for safety and security of an organization's operations is held at board level.
2. The organization's governance of safety, security and their interaction is defined.
3. Demonstrably effective management systems are in place.
4. The level of independence in assurance is proportionate to the potential harm.

Culture and Competence
5. The organization promotes an open/learning culture whilst maintaining appropriate confidentiality.
6. Organizations are demonstrably competent to undertake activities that are critical to achieving security and safety objectives.

Supply Chain
7. The organization manages its supply chain to support the assurance of safety and security in accordance with its overarching safety/security strategy.

System Engineering
8. The scope of the system-of-interest, including its boundary and interfaces, is defined.
9. Safety and security are addressed as coordinated views of the integrated systems engineering process.

Risk Management
10. The resources expended in safety and security risk management, and the required integrity and resilience characteristics, are proportionate to the potential harm.
11. Safety and security assessments are used to inform each other and provide a coherent solution.
12. The risks associated with the system-of-interest are identified by considerations including safety and security.
13. System architectures are resilient to faults and attack.
14. The risk justification demonstrates that the safety and security risks have been reduced to an acceptable level.
15. The safety and security considerations are applied and maintained throughout the life of the system.
Challenges

There are notable similarities in the security and safety risk mitigation approaches.
Both aim at reducing the chance of undesirable consequences. It is widely acknowl-
edged that when assessing the safety of a system, one must also embrace security,
and conversely, the potential threats to safety should be addressed when assessing its
cybersecurity. However, there are hindrances arising from the novelty of the field and
the subsequent lack of expertise and tool support. These challenges can be classified
into the following two categories:

TECHNICAL FACTORS
• Adequacy of the hazard and risk analysis techniques employed.
• Quantitative assessment of cyber risks is challenging because predicting the
frequency of an attack, or how likely it is to succeed, is not trivial in the presence
of an intelligent and motivated adversary.
• Lack of tool support and a fragmented tool chain for various phases makes updating
the lifecycle artifacts challenging.

SOCIO-TECHNICAL FACTORS
• Training risk practitioners, and optimally coordinating between safety and security teams to arrive at trade-offs.
• Safety and security teams use many similar but differently defined terms, which can cause confusion.
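The second technical factor, the difficulty of quantifying attack likelihood against an intelligent adversary, is one reason a purely qualitative, severity-first ordering of risks is often preferred early on. The sketch below illustrates the idea; the severity bands and threat identifiers are illustrative and not taken from any standard:

```python
# Qualitative severity bands; deliberately no numeric likelihood, since
# estimating attack frequency against a motivated adversary is not trivial.
SEVERITY = {"negligible": 0, "marginal": 1, "critical": 2, "catastrophic": 3}

def prioritize(risks):
    """Order identified risks by worst-case severity alone.

    `risks` is a list of (identifier, severity) pairs; the most severe
    risks come first and drive the subsequent analysis effort.
    """
    return sorted(risks, key=lambda r: SEVERITY[r[1]], reverse=True)

# Illustrative threat backlog (identifiers are hypothetical).
backlog = prioritize([
    ("T-03 firmware rollback", "marginal"),
    ("T-01 spoofed e-stop release", "catastrophic"),
    ("T-02 telemetry disclosure", "critical"),
])
print([ident for ident, _ in backlog])
# → ['T-01 spoofed e-stop release', 'T-02 telemetry disclosure', 'T-03 firmware rollback']
```

Likelihood estimates can be layered on later, once concrete mitigations are in place and residual risk has to be justified.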

Practical Implementation Notes

This section provides guidance and practical tips for organizations to improve their existing co-engineering practices. The tips are ordered by the complexity and time required to build the competencies and mature the processes, so teams can first tick off the low-hanging fruit and grow into more robust co-engineering practices over time. The tips assume that the organization already has a certain maturity in at least one of the two concerns, demonstrated for instance by an effectively used Safety Management System, Information Security Management System, or Secure Development Lifecycle processes. Immediate gains in the co-engineering of safety and security can be reaped through:
• Safety and security use many similar terms that are interpreted differently. Establishing a common, agreed-upon vocabulary and shared objectives for the organization is therefore of utmost importance; IEC TR 63069 provides pointers for this.
• The scope of the system-of-interest and what constitutes the system boundary,
and its interfaces should be clearly defined, such that both safety and security
analysis are based on the same premise.
• Provide job-shadowing and on-the-job learning opportunities for both safety and security engineers to skill up in the other concern. This can be achieved by involving cybersecurity engineers in safety risk assessments, like HARA and HAZOP, or safety engineers in threat modeling exercises, like STRIDE and DREAD.
• Have exchanges between the safety and security teams at regular intervals. These
intervals could be decided based on the safety and security goals.
• During the risk identification phase, and in the absence of trustworthy quantitative data on likelihood, always use the potential severity of the undesirable outcome to guide risk prioritization. Attempting to reason about likelihood or ease of avoidance at this stage may lead to wrong prioritization and mislead the subsequent activities, since cyber threats that seem improbable today might become credible sooner than one can foresee. The rationalization and acceptance of residual risk using qualitative metrics should only be performed once concrete measures are in place to mitigate the risks.
• Timebox the analysis and make its depth proportional to the impact/severity of the risk. Accept that models can never be 100% correct and focus on their usefulness. Value-driven approaches assign agile-style story-point estimates to how catastrophic the breakage of a given story or functionality would be; such estimates can be used to decide between a timeboxed and a full safety and security analysis.
• Depending on the type of analysis and the lifecycle phase, adopt the right abstrac-
tion level to produce a consistent set of hazards, risks, and threats. This makes their
linkage and traceability easier to maintain.
• Maintain traceability links between safety and security artifacts. For example, record, with adequate methods, which threats can cause a loss of safety. Similarly, safety and security requirements that deal with a similar system aspect, like timing, resource usage, or error detection, can be related to each other. Such linkages help in assessing the impact of changes in one concern upon the other.
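As a minimal sketch of the last tip, traceability links can be kept in a simple bidirectional register that answers "what must be re-assessed if this artifact changes?". The artifact kinds, identifiers, and descriptions below are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class Artifact:
    """A lifecycle artifact: a hazard, a threat, or a requirement."""
    ident: str
    kind: str              # e.g. "hazard", "threat", "safety-req", "security-req"
    description: str
    links: set = field(default_factory=set)   # idents of related artifacts

class TraceabilityRegister:
    """Minimal bidirectional traceability between safety and security artifacts."""

    def __init__(self):
        self.artifacts = {}

    def add(self, artifact):
        self.artifacts[artifact.ident] = artifact

    def link(self, a, b):
        # Record the relation in both directions so that change-impact
        # analysis works from either concern.
        self.artifacts[a].links.add(b)
        self.artifacts[b].links.add(a)

    def impacted_by(self, ident):
        """Artifacts that should be re-assessed when `ident` changes."""
        return [self.artifacts[i] for i in sorted(self.artifacts[ident].links)]

# Hypothetical example: a spoofed velocity command (threat) can cause an
# unintended-motion hazard; the threat is mitigated by an authentication
# requirement, so all three are linked.
reg = TraceabilityRegister()
reg.add(Artifact("H-01", "hazard", "Unintended robot motion near operator"))
reg.add(Artifact("T-07", "threat", "Spoofed velocity command on the fieldbus"))
reg.add(Artifact("SR-12", "security-req", "Authenticate all motion commands"))
reg.link("T-07", "H-01")
reg.link("T-07", "SR-12")

# If SR-12 is weakened or changed, the linked threat must be re-assessed.
print([a.ident for a in reg.impacted_by("SR-12")])  # → ['T-07']
```

In practice such links would live in a requirements-management or ALM tool rather than in code, but the underlying data model is the same.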

Conclusion

In conclusion, the digital revolution has reshaped our lives and workplaces, creating
both opportunities and threats. The European Union has responded with mandatory
cybersecurity regulations and standards to ensure safety and resilience against the
growing cyber threat. This document provides insights into these evolving require-
ments and the need to integrate safety and cybersecurity to protect digital systems in
our interconnected world.
Closing Notes

This brochure was made as the final deliverable of the VLAIO TETRA project CoAssurance – Integratie van cybersecurity en safety standaarden bij de ontwikkeling van digitale systemen (integration of cybersecurity and safety standards in the development of digital systems). The project was executed by project partners DistriNet@Gent and M-Group. More information: https://www.researchportal.be/nl/project/integratie-van-cybersecurity-en-safety-standaarden-bij-de-ontwikkeling-van-digitale

Imec-DistriNet is an international research group with extensive expertise in secure and distributed software. Its research on software development in both domains covers a wide range of topics, such as middleware for cloud computing, internet architectures, network and software security, and embedded systems. DistriNet is part of the Department of Computer Science at KU Leuven and currently comprises around a hundred researchers, of whom DistriNet@Gent counts 18. The group has a tradition of conducting application-driven research in close collaboration with industry. Currently, the group is involved in 35 national and international research projects, ranging from fundamental and strategic basic research to applied research. More information about past and current research projects at DistriNet@Gent can be found at https://iiw.kuleuven.be/onderzoek/distrinet

The M-Group research team at the KU Leuven Bruges campus is unique in that it
combines three departments: Mechanical Engineering, Electrical Engineering, and
Computer Science. This makes the group highly skilled and experienced in mecha-
tronics, drawing on the three main disciplines that make up this field with a strong
focus on exploring the reliability of highly automated interconnected mechatronic
systems. The computer science branch focuses on many topics within the tracks of
sensor networks and algorithms, engineering and reliability of embedded software, and
machine learning for industrial applications. The M-Group comprises 8 professors, over 35 junior researchers/PhD students, five post-docs, a research manager, a project manager, and two ATP members for direct services to the industry. The team has a strong track record in research projects, ranging from purely bilateral projects with industry to multi-partner EU-funded projects, including the coordination of several ongoing MSCA doctoral networks. A full overview of the projects and publications is available at the website: https://iiw.kuleuven.be/onderzoek/m-group. The knowledge
of M-Group has also resulted in the CoMoveIT spin-off.
Disclaimer

The purpose of this brochure is to aid enterprises with the integration of cybersecurity
and safety standards when developing digital systems, and to inform these enterprises
of existing and upcoming EU regulations.

The information in this brochure is intended to be generic and cannot be regarded as, nor replace, legal advice. While every care was taken to make this brochure as correct and accurate as possible, the information it contains may not be applicable to your specific situation, may be incomplete, wrong, or out of date, and/or may not match the position a court of justice or any authority could take.
As an enterprise, it is solely your own responsibility to comply with all legal requirements and to inform yourself about how this is to be done in your concrete situation. The authors and publishers of this brochure bear no legal responsibility whatsoever for your enterprise's compliance with the applicable legal requirements.

Sources and References

Used sources and references can be found at https://iiw.kuleuven.be/onderzoek/distrinet/CoAssurance.

Authors:
Jorn Lapon
Vincent Naessens
Jens Vankeirsbilck
Sanketh Ramachandra
Jeroen Boydens

Design: Ilse Bohé - based on previous work by Marieke Van Raes
