Space product assurance - Human dependability handbook

The handbook defines the principles and processes of human dependability as an integral part of system safety and dependability. The handbook focuses on human behaviour and performance during different operational situations, for example in a control centre: handover to routine mission operations, routine mission operations, satellite maintenance, or emergency operations.
This handbook illustrates the implementation of human dependability in the system life cycle, where during any project phase there is a need to systematically include consideration of the:
- Human element as part of the space system,
- Impact of human behaviour and performance on safety and dependability.
Within this scope, the main application areas of the handbook are to support the:
a.   Development and validation of space system design during the different project phases,  
b.   Development, preparation and implementation of space system operations including their support such as the organisation, rules, training, etc.,
c.   Collection of human error data and investigation of incidents or accidents involving human error.
The handbook does not address:
- Design errors: The handbook intends to support design (and therefore, in this sense, addresses design errors) regarding the avoidance or mitigation of human errors during operations. However, human errors during design development are not considered.
- Quantitative (e.g. probabilistic) analysis of human behaviour and performance: The handbook does not address the probabilistic assessment of human errors as input to system-level safety and dependability analysis, nor the consideration of probabilistic targets, and
- Intentional malicious acts and security-related issues: Dependability and safety deal with "threats to safety and mission success" in terms of failures and non-malicious human errors and, for the sake of completeness, include "threats to safety and mission success" in terms of malicious actions, which are addressed through security risk analysis. However, by definition, "human dependability" as presented in this handbook excludes the consideration of "malicious actions" and security-related issues, i.e. it considers only "non-malicious actions" of humans.
The handbook does not directly provide information on some disciplines or subjects which interface with "human dependability" only indirectly, i.e. at the level of PSFs (see section 5). Therefore the handbook does not provide direct support for "goals" such as:
- optimize information flow in the control room during simulations and critical operations,
- manage cultural differences in a team,
- cope with negative group dynamics,
- present best practices and guidelines about team training needs and training methods,
- provide guidelines and best practices concerning planning of shifts,
- present basic theory about team motivation, and
- manage conflicts of interest on a project.

Raumfahrtproduktsicherung - Handbuch zur menschlichen Zuverlässigkeit

Assurance produit des projets spatiaux - Guide sur le facteur humain

Zagotavljanje kakovosti proizvodov v vesoljski tehniki - Priročnik o človekovi zanesljivosti


General Information

Status: Published
Public Enquiry End Date: 20-Oct-2021
Publication Date: 19-Dec-2021
Technical Committee:
Current Stage: 6060 - National Implementation/Publication (Adopted Project)
Start Date: 08-Dec-2021
Due Date: 12-Feb-2022
Completion Date: 20-Dec-2021

Buy Standard

Technical report: TP CEN/TR 17602-30-03:2022 - BARVE, English language, 69 pages
Draft: kTP FprCEN/TR 17602-30-03:2021 - BARVE, English language, 69 pages

Standards Content (Sample)

SLOVENSKI STANDARD
SIST-TP CEN/TR 17602-30-03:2022
01-februar-2022
Zagotavljanje kakovosti proizvodov v vesoljski tehniki - Priročnik o človekovi
zanesljivosti
Space product assurance - Human dependability handbook
Raumfahrtproduktsicherung - Handbuch zur menschlichen Zuverlässigkeit
Assurance produit des projets spatiaux - Guide sur le facteur humain
This Slovenian standard is identical to: CEN/TR 17602-30-03:2021
ICS:
03.120.99  Other standards related to quality
49.140  Space systems and operations
SIST-TP CEN/TR 17602-30-03:2022 en,fr,de
2003-01. Slovenski inštitut za standardizacijo. Reproduction of this standard, in whole or in part, is not permitted.



TECHNICAL REPORT
RAPPORT TECHNIQUE
TECHNISCHER BERICHT
CEN/TR 17602-30-03
December 2021
ICS 49.140

English version

Space product assurance - Human dependability handbook
Assurance produit des projets spatiaux - Guide sur le facteur humain
Raumfahrtproduktsicherung - Handbuch zur menschlichen Zuverlässigkeit


This Technical Report was approved by CEN on 22 November 2021. It has been drawn up by the Technical Committee
CEN/CLC/JTC 5.

CEN and CENELEC members are the national standards bodies and national electrotechnical committees of Austria, Belgium,
Bulgaria, Croatia, Cyprus, Czech Republic, Denmark, Estonia, Finland, France, Germany, Greece, Hungary, Iceland, Ireland, Italy,
Latvia, Lithuania, Luxembourg, Malta, Netherlands, Norway, Poland, Portugal, Republic of North Macedonia, Romania, Serbia,
Slovakia, Slovenia, Spain, Sweden, Switzerland, Turkey and United Kingdom.

CEN-CENELEC Management Centre:
Rue de la Science 23, B-1040 Brussels
© 2021 CEN/CENELEC - All rights of exploitation in any form and by any means reserved worldwide for CEN national Members and for CENELEC Members.
Ref. No. CEN/TR 17602-30-03:2021 E

Table of contents
European Foreword . 5
Introduction . 6
References . 6
1 Scope and objectives . 8
1.1 Scope . 8
1.2 Objectives . 9
2 References . 10
3 Terms, definitions and abbreviated terms . 11
3.1 Terms from other standards . 11
3.2 Terms specific to the present handbook . 11
3.3 Abbreviated terms. 13
4 Objectives of human dependability . 14
5 Principles of human dependability . 15
5.1 Human dependability concept . 15
5.1.1 Introduction . 15
5.1.2 Failure scenario integrating human errors . 16
5.1.3 Human error and error type . 16
5.1.4 Error precursors and error mitigators. 16
5.2 Human role in the system . 24
5.2.1 Overview . 24
5.2.2 Human contribution to safety and mission success . 24
5.2.3 Fundamental principles driving function allocation. 25
5.2.4 Some principles driving user interfaces design . 26
5.2.5 Automated processes and operator tasks in space systems . 28
5.3 References . 29
6 Human dependability processes . 31
6.1 General . 31
6.2 Human error analysis . 32
6.2.1 Objectives of human error analysis . 32
6.2.2 Principles of human error analysis . 33
6.2.3 Human error analysis process . 37
6.3 Human error reporting and investigation . 41
6.3.1 Objectives of human error reporting and investigation . 41
6.3.2 Principles of human error reporting and investigation . 41
6.3.3 Human error reporting and investigation process . 43
6.4 References . 45
7 Implementation of human dependability in system life cycle . 46
7.1 General . 46
7.2 Human dependability activities in project phases . 47
7.2.1 Overview . 47
7.2.2 Phase A: Feasibility . 47
7.2.3 Phase B: Preliminary Definition . 48
7.2.4 Phase C: Detailed Definition . 49
7.2.5 Phase D: Qualification and Production . 50
7.2.6 Phases: E Operations/Utilization and F Disposal . 52
7.3 References . 53
Annex A (informative) Human error analysis data - examples . 54
A.1 Overview . 54
A.2 Examples of the Evolution of PSFs . 55
A.3 Examples of Human Error Scenario Data . 58
A.4 References . 58
Annex B (informative) Human error analysis documentation . 59
Annex C (informative) Human error analysis example questions . 61
C.1 Examples of questions to support a risk analysis on anomalies and human
error during operations . 61
C.2 References . 63
Annex D (informative) Human dependability in various domains . 64
D.1 Human dependability in industrial sectors . 64
D.2 References . 66
Bibliography . 68

Figures
Figure 5-1: Examples of human error in failure scenarios . 16
Figure 5-2: Error precursors, error mitigators and human error in failure scenarios . 17
Figure 5-3: HFACS model . 20
Figure 5-4: Levels of human performance . 21
Figure 5-5: Basic error types . 23
Figure 5-6: MABA-MABA principle . 25
Figure 5-7: Small portion of Chernobyl nuclear power plant control room (from
http://www.upandatom.net/Chernobyl.htm) . 26
Figure 5-8: Example of a computer-based, concentrated control room (Large Hadron
Collider at CERN) . 27
Figure 5-9. Example of a computer-based, concentrated user interface – the glass
cockpit (transition to glass cockpit for the Boeing 747) . 27
Figure 6-1: Human error reduction examples . 33
Figure 6-2: Human error analysis and reduction process . 37
Figure 6-3: Human error analysis iteration . 41
Figure 6-4: Human error reporting and investigation process . 44
Figure 7-1: Human dependability in system life cycle . 46

Tables
Table A-1 : SPAR_H PSF modelling considerations for MIDAS [57] . 55
Table B-1 : Example of an “Human Error Analysis Form sheet” . 60
Table D-1 : Examples of Comparable External Domains . 65


European Foreword
This document (CEN/TR 17602-30-03:2021) has been prepared by Technical Committee
CEN/CLC/JTC 5 “Space”, the secretariat of which is held by DIN.
It is highlighted that this Technical Report does not contain any requirements, but only a collection of data, descriptions and guidelines about how to organize and perform the work in support of EN 16602-30.
This Technical Report (CEN/TR 17602-30-03:2021) originates from ECSS-Q-HB-30-03A.
Attention is drawn to the possibility that some of the elements of this document may be the subject of
patent rights. CEN [and/or CENELEC] shall not be held responsible for identifying any or all such
patent rights.
This document has been prepared under a mandate given to CEN by the European Commission and
the European Free Trade Association.
This document has been developed to cover specifically space systems and therefore takes precedence over any TR covering the same scope but with a wider domain of applicability (e.g. aerospace).
Introduction
Space systems always have a "human in the loop", such as spacecraft operators in a control centre, test or maintenance staff on the ground, or astronauts on board.
Human dependability complements disciplines that concern the interaction of the human element
with or within a complex sociotechnical system and its constituents and processes such as human
factors engineering (see ECSS-E-ST-10-11C “Human factors engineering” [1]), human systems
integration [2], human performance capabilities, human-machine interaction and human-computer
interaction in the space domain [3],[4].
Human dependability captures the emerging consensus and nascent effort in the space sector to systematically include considerations of "human behaviour and performance" in the design, validation and operations of both crewed and un-crewed systems, in order to benefit from human capabilities and to prevent human errors. Human behaviour and performance can be influenced by various factors: error precursors (e.g. performance shaping factors), which can result in human errors, or error mitigators, which limit the occurrence or impact of human errors. Human errors can originate from inadequate system design, i.e. design that ignores or does not properly account for human factors engineering and system operation. Human errors can contribute to or be part of failure or accident scenarios leading to undesirable consequences for a space mission, such as loss of mission or, in the worst case, loss of life.
In the space domain, human dependability as a discipline first surfaced during contractor study and
policy work in the early 1990s in the product assurance, system safety and knowledge management
domain [5],[6] and concerned principles and practices to improve the safety and dependability of
space systems by focusing on human error, related design recommendations and root cause analysis
[7],[8].
The standards ECSS-Q-ST-30C "Dependability" [9] and ECSS-Q-ST-40C "Safety" [10] define principles and requirements to assess and reduce safety and dependability risks, and address aspects of human dependability such as human error failure tolerance and human error analysis to complement FMECA and hazard analysis. The objective of human error analysis is to identify, assess and reduce human errors involved in failure scenarios and their consequences. Human error analysis can be implemented through an iterative process, with iterations being determined by the project's progress through the different project phases. Human error analysis is not to be seen as the conclusion of an investigation, but rather as a starting point to ensure safety and mission operations success.
The main focus of the handbook is on human dependability associated with humans directly involved in the operations of a space system ("humans" understood here as an individual operator or astronaut, or groups of humans, e.g. a crew, a team or an organization, including AIT (assembly, integration and test) and launch preparation). This especially concerns the activities related to the planning and implementation of space system control and mission operations from launch to disposal, and can be extended to cover operations such as AIT and launch preparation.
References
[1] ECSS-E-ST-10-11C - Space engineering - Human factors engineering, 31 July 2008 (Number of EN
version: EN 16603-10-11)
[2] Booher, Harold R. (Ed.) (2003) Handbook of Human Systems Integration. New York: Wiley.
[3] NASA (2010) Human Integration Design Handbook NASA/SP-2010-3407 (Baseline).
Washington, D.C.: NASA.
[4] NASA (2011) Space Flight Human-System Standard Vol. 2: Human Factors, Habitability, and Environmental Health, NASA-STD-3001, Vol. 2. Washington, D.C.: NASA.
[5] Atkins, R. K. (1990) Human Dependability Requirements, Scope and Implementation at the
European Space Agency. Proceedings of the Annual Reliability and Maintainability
Symposium, IEEE, pp. 85-89.
[6] Meaker, T. A. (1992) Future role of ESA R&M assurance in space flight operation. Proceedings
of the Annual Reliability and Maintainability Symposium, IEEE, pp. 241-242.
[7] Alenia Spazio (1994) Human Dependability Tools, Techniques and Guidelines: Human Error
Avoidance Design Guidelines and Root Cause Analysis Method (SD-TUN-AI-351, -353, -351).
Noordwijk: ESTEC.
[8] Cojazzi, G. (1993) Root Cause Analysis Methodologies: Selection Criteria and Preliminary
Evaluation, ISEI/IE/2443/93, JRC Ispra, Italy: Institute for System Engineering and Informatics.
[9] ECSS-Q-ST-30 – Space product assurance - Dependability, 6 March 2009 (Number of EN version:
EN 16602-30)
[10] ECSS-Q-ST-40 – Space product assurance - Safety, 6 March 2009 (Number of EN version:
EN 16602-40)
1 Scope and objectives
1.1 Scope
The handbook defines the principles and processes of human dependability as an integral part of system safety and dependability. The handbook focuses on human behaviour and performance during different operational situations, for example in a control centre: handover to routine mission operations, routine mission operations, satellite maintenance, or emergency operations.

This handbook illustrates the implementation of human dependability in the system life cycle, where during any project phase there is a need to systematically include consideration of the:
• Human element as part of the space system,
• Impact of human behaviour and performance on safety and dependability.
Within this scope, the main application areas of the handbook are to support the:
a. Development and validation of the space system design during the different project phases,
b. Development, preparation and implementation of space system operations including their support such as the organisation, rules, training, etc.,
c. Collection of human error data and investigation of incidents or accidents involving human error.

The handbook does not address:
• Design errors: The handbook intends to support design (and therefore, in this sense, addresses design errors) regarding the avoidance or mitigation of human errors during operations. However, human errors during design development are not considered.
• Quantitative (e.g. probabilistic) analysis of human behaviour and performance: The handbook does not address the probabilistic assessment of human errors as input to system-level safety and dependability analysis, nor the consideration of probabilistic targets, and
• Intentional malicious acts and security-related issues: Dependability and safety deal with "threats to safety and mission success" in terms of failures and non-malicious human errors and, for the sake of completeness, include "threats to safety and mission success" in terms of malicious actions, which are addressed through security risk analysis. However, by definition, "human dependability" as presented in this handbook excludes the consideration of "malicious actions" and security-related issues, i.e. it considers only "non-malicious actions" of humans.

The handbook does not directly provide information on some disciplines or subjects which interface with "human dependability" only indirectly, i.e. at the level of PSFs (see section 5). Therefore the handbook does not provide direct support for "goals" such as:
• optimize information flow in the control room during simulations and critical operations,
• manage cultural differences in a team,
• cope with negative group dynamics,
• present best practices and guidelines about team training needs and training methods,
• provide guidelines and best practices concerning planning of shifts,
• present basic theory about team motivation, and
• manage conflicts of interest on a project.
1.2 Objectives
The objectives of the handbook are to support:
• Familiarization with human dependability (see section 5 "principles of human dependability"). For details and further reading, see the "references" listed at the end of each section of the handbook.
• Application of human dependability (see section 6 "human dependability processes" and section 7 "implementation of human dependability in system life cycle").

2 References
Due to the structure of the document, each section includes at its end the references cited in it.
The Bibliography at the end of this document contains a list of recommended literature.
3 Terms, definitions and abbreviated terms
3.1 Terms from other standards
a. For the purpose of this document, the terms and definitions from ECSS-S-ST-00-01 apply
b. For the purpose of this document, the terms and definitions from ECSS-Q-ST-40 apply, in
particular the following term:
1. operator error
3.2 Terms specific to the present handbook
3.2.1 automation
design and execution of functions by the technical system that can include functions resulting from
the delegation of user’s tasks to the system
3.2.2 error mitigator
set of conditions and circumstances that influences in a positive way human performance and the
occurrence of a human error
NOTE The conditions and circumstances are best described by the
performance shaping factors and levels of human performance.
3.2.3 error precursor
set of conditions and circumstances that influences in a negative way human performance and the
occurrence of a human error
3.2.4 human dependability
performance of the human constituent element and its influencing factors on system safety, reliability,
availability and maintainability
3.2.5 human error
inappropriate or undesirable observable human behaviour with potential impact on safety,
dependability or system performance
NOTE Human behaviour can be decomposed into perception, analysis,
decision and action.
3.2.6 human error analysis
systematic and documented process of identification and assessment of human errors, and analysis
activities supporting the reduction of human errors
3.2.7 human error reduction
elimination, or minimisation and control of existing or potential conditions for human errors
NOTE The conditions and circumstances are best described by the
performance shaping factors and levels of human performance.
3.2.8 human error type
classification of human errors into slips, lapses or mistakes
NOTE The types of human error are described in section 6.2 on the human
dependability concept.
3.2.9 level of human performance
categories of human performances resulting from human cognitive, perceptive or motor behaviour in
a given situation
NOTE 1 As an example, categories of human performances can be "skill
based", "rule based" and "knowledge based".
NOTE 2 The level of human performance results from the combination of the
circumstances and current situation (e.g. routine situation, trained
situation, novel situation) and the type of control of the human action
(e.g. consciously or automatically).
3.2.10 operator-centred design
approach to human-machine system design and development that focuses, beyond the other technical
aims, on making systems usable
3.2.11 performance shaping factor
specific error precursor or error mitigator that influences human performance and the likelihood of
occurrence of a human error.
NOTE 1 Performance shaping factors are either error precursors or error
mitigators appearing in a failure scenario and enhance or degrade
human performance.
NOTE 2 Different performance shaping factors are listed in section 5 of this document.
3.2.12 resilience
ability to anticipate and adapt to the potential for “surprise and error” in complex sociotechnical
systems
NOTE Resilience engineering provides a framework for understanding and
addressing the context of failures i.e. as a symptom of more in-depth
structural problems of a system.
3.2.13 socio-technical system
holistic view of the system including the operators, the organization in which the operator is involved
and the technical system operated.
NOTE A socio-technical system is the whole structure including
administration, politics, economy and cultural ingredients of an
organisation or a project.
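As an informal illustration only, and not part of the handbook, the following minimal Python sketch shows one possible way to represent the classifications defined above: human error types (3.2.8), levels of human performance (3.2.9), and performance shaping factors acting either as error precursors (3.2.3) or as error mitigators (3.2.2), per 3.2.11. All class names and the example values are hypothetical.

from dataclasses import dataclass, field
from enum import Enum
from typing import List

class HumanErrorType(Enum):
    # 3.2.8: classification of human errors into slips, lapses or mistakes
    SLIP = "slip"
    LAPSE = "lapse"
    MISTAKE = "mistake"

class PerformanceLevel(Enum):
    # 3.2.9 NOTE 1: skill based, rule based, knowledge based (SRK)
    SKILL_BASED = "skill based"
    RULE_BASED = "rule based"
    KNOWLEDGE_BASED = "knowledge based"

@dataclass
class PerformanceShapingFactor:
    # 3.2.11: a specific error precursor or error mitigator
    name: str           # hypothetical label, e.g. "time pressure"
    is_mitigator: bool  # True: error mitigator (3.2.2); False: error precursor (3.2.3)

@dataclass
class HumanError:
    # 3.2.5: observable human behaviour with potential impact on safety,
    # dependability or system performance
    description: str
    error_type: HumanErrorType
    performance_level: PerformanceLevel
    psfs: List[PerformanceShapingFactor] = field(default_factory=list)

# Hypothetical example record
example = HumanError(
    description="wrong parameter value entered during routine operations",
    error_type=HumanErrorType.SLIP,
    performance_level=PerformanceLevel.SKILL_BASED,
    psfs=[PerformanceShapingFactor("time pressure", is_mitigator=False)],
)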
3.3 Abbreviated terms
For the purpose of this document, the following abbreviated terms apply:
Abbreviation Meaning
AIT assembly, integration and testing
FDIR failure detection, isolation and recovery
FMEA failure mode effect analysis
FMECA failure mode effect and criticality analysis
HET human error type
HFACS Human Factors Analysis and Classification System
HMI human-machine interface
HUDEP human dependability
LHP level of human performance
O&M organizational and management
PIF performance influencing factors
PSF performance shaping factor
RAMS reliability, availability, maintainability and safety
SRK skill, rule, knowledge
VACP visual, auditory, cognitive and psychomotor
4 Objectives of human dependability
Objectives of human dependability during the space system life cycle include:
• Definition of the role and involvement of the human in the system, for example supporting the selection of an automation strategy or enhancing system resilience through operator intervention to prevent incidents or accidents;
• Definition and verification of human dependability requirements on a project, such as human error tolerance as part of overall failure tolerance;
• Assessment of the safety and dependability risks of a system from design to operation with respect to human behaviour and performance, and identification of their positive and negative contributions to safety and mission success;
• Identification of failure scenarios involving human errors through "human error analysis", as input to safety and dependability analysis and as the basis for "human error reduction";
• Definition of human error reduction means to drive the definition and implementation of, for example, system design and operation requirements, specifications, operation concepts, operation procedures, and operator training requirements;
• Support of the development of operation procedures for normal, emergency and other conditions with respect to human performance and behaviour;
• Collection and reporting of human error data; and
• Support of the investigation of failure scenarios involving human errors.
5 Principles of human dependability
5.1 Human dependability concept
5.1.1 Introduction
Human dependability is based on the following concept. When a human operates a system, human capabilities (skills and knowledge) are exploited. These capabilities have the potential to mitigate expected and unexpected undesired system behaviour; however, they can also introduce human errors causing or contributing to failure scenarios of the system.
In a first step, for safety and dependability analysis at the functional analysis level, there is no need to discriminate how functions are implemented, i.e. using hardware, software or humans. Functional failures (loss or degradation of functions) are identified, and the associated consequences and their severities are determined.
However, in further steps, at lower levels of safety and dependability analysis, such as an FMECA on the physical design, the fact that the function is implemented by hardware or software is considered (see section 6). Similarly, when operations are analysed, the interaction of the human with the rest of the system is investigated.
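To make the two steps just described concrete, here is a small hypothetical Python sketch, not taken from the handbook: functional failures are first identified independently of how each function is implemented, and only at the lower level of analysis is the implementation (hardware, software or human) considered, so that scenarios involving the human can be examined further, for example through the human error analysis described in section 6. All function names, allocations and severity categories below are illustrative assumptions.

from dataclasses import dataclass
from enum import Enum

class Implementation(Enum):
    HARDWARE = "hardware"
    SOFTWARE = "software"
    HUMAN = "human"

@dataclass
class FunctionalFailure:
    function: str     # hypothetical function name
    failure: str      # loss or degradation of the function
    consequence: str
    severity: str     # hypothetical severity category

# Step 1: functional level - failures identified without regard to implementation
failures = [
    FunctionalFailure("send telecommand", "wrong telecommand sent",
                      "loss of payload configuration", "critical"),
    FunctionalFailure("provide attitude control", "loss of function",
                      "loss of mission", "catastrophic"),
]

# Step 2: lower-level analysis - the implementation of each function is now considered;
# functions allocated to a human become candidates for further human error analysis
allocation = {
    "send telecommand": Implementation.HUMAN,          # hypothetical allocation
    "provide attitude control": Implementation.SOFTWARE,
}

for f in failures:
    if allocation.get(f.function) is Implementation.HUMAN:
        print(f"Human error analysis candidate: {f.failure} ({f.severity})")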
Technical failures and human errors are inevitable in complex systems. Human performance and the technical system can be seen as functioning as a joint cognitive system [11] with three human relationships:
• Human - environment (technical system),
• Human - human (operating team), and
• Human - itself (work orientation, motivation),
together with the associated outcomes in terms of human performance and of the overall socio-technical system.
In order to prevent or mitigate human errors as much as possible, systems need to be designed taking into account human factors engineering considerations (see ECSS-E-ST-10-11C "Human factors engineering" [12]). Proper consideration of human error, to achieve the design and operation of a safe and dependable system, is based on analysis of human performance and acknowledges the fact
...

