ETSI GS ISI 002 V1.1.1 (2013-04)
Information Security Indicators (ISI); Event Model A security event classification model and taxonomy
DGS/ISI-002
Group Specification
Information Security Indicators (ISI);
Event Model
A security event classification model and taxonomy
Disclaimer
This document has been produced and approved by the Information Security Indicators (ISI) ETSI Industry Specification Group
(ISG) and represents the views of those members who participated in this ISG.
It does not necessarily represent the views of the entire ETSI membership.
Reference
DGS/ISI-002
Keywords
ICT, security
ETSI
650 Route des Lucioles
F-06921 Sophia Antipolis Cedex - FRANCE
Tel.: +33 4 92 94 42 00 Fax: +33 4 93 65 47 16
Siret N° 348 623 562 00017 - NAF 742 C
Association à but non lucratif enregistrée à la
Sous-Préfecture de Grasse (06) N° 7803/88
Important notice
Individual copies of the present document can be downloaded from:
http://www.etsi.org
The present document may be made available in more than one electronic version or in print. In any case of existing or
perceived difference in contents between such versions, the reference version is the Portable Document Format (PDF).
In case of dispute, the reference shall be the printing on ETSI printers of the PDF version kept on a specific network drive
within ETSI Secretariat.
Users of the present document should be aware that the document may be subject to revision or change of status.
Information on the current status of this and other ETSI documents is available at
http://portal.etsi.org/tb/status/status.asp
If you find errors in the present document, please send your comment to one of the following services:
http://portal.etsi.org/chaircor/ETSI_support.asp
Copyright Notification
No part may be reproduced except as authorized by written permission.
The copyright and the foregoing restriction extend to reproduction in all media.
© European Telecommunications Standards Institute 2013.
All rights reserved.
DECT™, PLUGTESTS™, UMTS™ and the ETSI logo are Trade Marks of ETSI registered for the benefit of its Members.
3GPP™ and LTE™ are Trade Marks of ETSI registered for the benefit of its Members and of the 3GPP Organizational Partners.
GSM® and the GSM logo are Trade Marks registered and owned by the GSM Association.
Contents
Intellectual Property Rights . 5
Foreword . 5
Introduction . 6
1 Scope . 8
2 References . 8
2.1 Normative references . 8
2.2 Informative references . 8
3 Definitions and abbreviations . 9
3.1 Definitions . 9
3.2 Abbreviations . 13
4 Positioning of the proposed event classification model . 15
4.1 Relationship with the ISO 27004 standard . 15
4.2 The critical importance of positioning the model appropriately . 15
4.3 The necessity for the model to rest on a detailed taxonomy . 17
4.4 Description of the taxonomy . 17
4.5 Complex security incidents versus basic security incidents . 19
4.6 The key drivers underlying the representation proposed . 20
4.7 The general description of the representation . 20
4.8 Link between the event model representation and the list of indicators (and related families) . 21
5 Comparison with other event classification models . 21
5.1 Risk analysis methods classifications . 22
5.2 CAPEC classification . 22
5.3 FIRST classifications . 23
6 Detailed description of the proposed representation of the different categories and sub-categories . 23
6.1 Intrusions and external attacks (Category IEX). 23
6.2 Malfunctions (Category IMF) . 25
6.3 Deviant internal behaviours (Category IDB) . 27
6.4 Behavioural vulnerabilities (Category VBH) . 29
6.5 Software vulnerabilities (Category VSW) . 31
6.6 Configuration vulnerabilities (Category VCF) . 32
6.7 General security (technical & organizational) vulnerabilities (Category VTC and Category VOR) . 33
7 Practical uses of the event classification model . 34
7.1 The classification model pivotal role. 35
7.2 The objective shared with operational risks . 36
7.3 The link with existing studies on cybercrime motivation (threat intelligence) . 36
7.4 Other uses of the classification model . 39
Annex A (informative): Overview of the ISO 27004 standard measurement model . 41
Annex B (informative): Field dictionary for the taxonomy . 42
B.1 Incidents . 42
B.1.1 Who and/or Why . 42
B.1.1.1 Accident . 42
B.1.1.2 Unwitting or unintentional act (error) . 42
B.1.1.3 Unawareness or carelessness or irresponsibility . 43
B.1.1.4 Malicious act . 43
B.1.2 What . 44
B.1.2.1 Unauthorized access to a system and/or to information . 44
B.1.2.2 Unauthorized action on the information system and/or against the organization . 45
B.1.2.3 Installation of unauthorized software programs (malware) on a system (without the owner's consent) . 46
B.1.2.4 Information system remote disturbance . 46
B.1.2.5 Social engineering attacks . 46
B.1.2.6 Personal attack on organization's personnel or organization disturbance . 47
B.1.2.7 Physical intrusion or illicit action . 47
B.1.2.8 Illicit activity carried out on the public Internet (harming an organization) . 47
B.1.2.9 Various errors (administration, handling, programming, general use) . 48
B.1.2.10 Breakdown or malfunction . 49
B.1.2.11 Environmental events (unavailability caused by a natural disaster) . 49
B.1.3 How . 49
B.1.3.1 Unauthorized access to a system and/or to information . 49
B.1.3.2 Unauthorized action on the information system and/or against the organization . 50
B.1.3.3 Installation of unauthorized software programs (malware) on a system (without the owner's consent) . 52
B.1.3.4 Information system remote disturbance . 52
B.1.3.5 Social engineering attacks . 53
B.1.3.6 Personal attack on organization's personnel or organization disturbance . 53
B.1.3.7 Physical intrusion or illicit action . 53
B.1.3.8 Illicit activity carried out on the public Internet network (harming an organization) . 54
B.1.3.9 Various errors (administration, handling, programming, general use) . 54
B.1.3.10 Breakdown or malfunction . 54
B.1.3.11 Environmental events (unavailability caused by a natural disaster) . 54
B.1.4 Status . 55
B.1.4.1 Security event attempt (or occurrence) underway . 55
B.1.4.2 Succeeded (or performed) security event . 55
B.1.4.3 Failed security event . 55
B.1.5 With what vulnerability(ies) exploited (up to 3 combined kinds of vulnerabilities) . 55
B.1.5.1 Behavioural vulnerability . 55
B.1.5.2 Software vulnerability. 55
B.1.5.3 Configuration vulnerability . 55
B.1.5.4 General security vulnerability . 55
B.1.5.5 Conception vulnerability . 55
B.1.5.6 Material vulnerability . 55
B.1.6 On what kind of asset . 56
B.1.6.1 Data bases and applications . 56
B.1.6.2 Systems . 56
B.1.6.3 Networks and telecommunications . 58
B.1.6.4 Offline storage devices . 59
B.1.6.5 End-user devices . 59
B.1.6.6 People . 60
B.1.6.7 Facilities and environment . 61
B.1.7 With what CIA consequences . 61
B.1.7.1 Loss of confidentiality (with types of loss and with the amount of data as a possible complement) . 61
B.1.7.2 Loss of integrity (with types of loss) . 63
B.1.7.3 Loss of availability (with types of loss and with the duration as a possible complement) . 63
B.1.8 With what kind of impact . 63
B.1.8.1 Direct impact . 63
B.1.8.2 Indirect impact . 64
B.2 Vulnerabilities . 64
B.2.1 What . 64
B.2.1.1 Behavioural vulnerabilities . 64
B.2.1.2 Software vulnerabilities . 67
B.2.1.3 Configuration vulnerabilities . 67
B.2.1.4 General security (organizational) vulnerabilities . 68
B.2.1.5 Conception vulnerability . 70
B.2.1.6 Material vulnerability . 70
B.2.2 On what kind of assets . 71
B.2.3 Who (only for behavioural vulnerabilities) . 71
B.2.4 For what purpose (only for behavioural vulnerabilities) . 71
B.2.5 To what kind of possible exploitation . 71
Annex C (informative): Authors & contributors . 72
Annex D (informative): Bibliography . 73
History . 75
Intellectual Property Rights
IPRs essential or potentially essential to the present document may have been declared to ETSI. The information
pertaining to these essential IPRs, if any, is publicly available for ETSI members and non-members, and can be found
in ETSI SR 000 314: "Intellectual Property Rights (IPRs); Essential, or potentially Essential, IPRs notified to ETSI in
respect of ETSI standards", which is available from the ETSI Secretariat. Latest updates are available on the ETSI Web
server (http://ipr.etsi.org).
Pursuant to the ETSI IPR Policy, no investigation, including IPR searches, has been carried out by ETSI. No guarantee
can be given as to the existence of other IPRs not referenced in ETSI SR 000 314 (or the updates on the ETSI Web
server) which are, or may be, or may become, essential to the present document.
Foreword
This Group Specification (GS) has been produced by ETSI Industry Specification Group (ISG) Information Security
Indicators (ISI).
The present document is included in a series of 6 ISI 00x specifications. These 6 specifications are the following (see
figure 1 summarizing the various concepts involved in event detection and interactions between all parts):
• GS ISI 001-1 [3] addressing (together with its associated guide GS ISI 001-2 [4]) information security
indicators, meant to measure application and effectiveness of preventative measures.
• The present document (GS ISI 002) addressing the underlying event classification model and the associated
taxonomy.
• GS ISI 003 [i.5] addressing the key issue of assessing organization's maturity level regarding overall event
detection (technology/process/ people) in order to weigh event detection results.
• GS ISI 004 [i.6] addressing demonstration through examples how to produce indicators and how to detect the
related events with various means and methods (with a classification of the main categories of use
cases/symptoms).
• GS ISI 005 [i.7] addressing ways to produce security events and to test the effectiveness of existing detection
means within organization (for major types of events), which is a more detailed and a more case by case
approach than GS ISI 003 one [i.5] and which can therefore complement it.
[Figure 1 (GS ISG ISI series summary definition) is a diagram positioning real events, fake events (simulation) and detected events against the 3 main security measures (event prevention measures, event detection measures and event reaction measures), together with the residual risk in an event model-centric vision.]
Figure 1: Positioning the 5 GS ISI 00x against the 3 main security measures
Introduction
An overall corporate Cyber Defence and SIEM approach is intended to implement continuous security improvement within large organizations, with the following main goals:
• to operationally and constantly reduce the residual risk incurred by their Information Systems (see the chart below, which highlights the 2 associated types of events – incidents and vulnerabilities – and the joint area covered by the IT security policy through the concept of usage or implementation drift); and
• to assess the actual application and real effectiveness of their security policies (or of their ISMS, if they have one), for the purpose of their constant improvement.
Such an approach, which to a large extent relies on using the traces available in the Information System's various components, is necessarily organized around an "event-model centric" vision, and can also be tied to the PDCA model that is commonly used in quality and security areas. As such, it primarily involves implementing the PDCA "Check" step on the basis of a very detailed knowledge of threats and vulnerabilities.
[Figure 2 (the 3 kinds of residual risk) is a chart contrasting the area covered by the IT security policy (risk actually covered: incidents not possible, because the vulnerabilities are remediated by operational practices) with: known residual risk that is suffered (incidents exploiting known but uncovered or not tackled vulnerabilities, due to usage or implementation drift); known residual risk that is accepted (incidents exploiting known vulnerabilities); and unknown residual risk (incidents exploiting unknown vulnerabilities).]
Figure 2: The 3 kinds of residual risks
The recent worldwide trends in ICT security show that significant progress can be accomplished within a few years
with the deployment of an organization-wide operational Cyber Defence approach and SIEM approach. Also, a recent
survey by a major consulting firm of 15 major companies and organizations brings to light 9 key success criteria. It is
worth mentioning the two most important criteria:
• The reliance of the Cyber Defence and SIEM approach on a security event classification model that takes into account both incidents and vulnerabilities, and that notably pays particular attention to malicious and intentional acts, the monitored events themselves being selected on the basis of the main relevant CIA risks and associated metrics (e.g. statistics).
• Training with this model for the relevant people using the Information System, with particular attention to the presentation of concrete examples of disasters associated with the main types of inventoried security events.
As such, the present document's objective is to build a full taxonomy to thoroughly describe all IT security events (and
when appropriate and necessary non-IT security events) and, based on it, to present an original representation that
leverages the current international best practices and enables diversified and complex uses. The choice of a detailed
taxonomy, which describes security events through a set of attributes (different for incidents and vulnerabilities),
ensures that all possible situations can be taken into account with the required flexibility (especially thanks to the
provided open dictionary), while the representation chosen for the taxonomy, highlighting the main categories generally
accepted by industry consensus, makes the event classification model easier to understand and embrace for
stakeholders. And this representation is key because acceptance by all will depend mainly on it.
It should finally be mentioned that the present document rests in part on work carried out by Club R2GS (see Annex D – Bibliography), a French association created at the end of 2008, specializing in Cyber Defence and Security Information and Event Management (SIEM), and gathering most of the very large companies and organizations (mainly users) that are the most advanced in France regarding these topics. The present document (GS ISI 002), as well as all other GS ISI 00x, is therefore based on extensive experience: this community of users has adopted and used the event classification model and the related reference framework for indicators, in some cases for over 3 years, and sometimes on a world-wide scale.
1 Scope
The present document provides a comprehensive security event classification model and the associated taxonomy
(based on already existing results and hands-on user experience), covering both security incidents and vulnerabilities.
Both become non-conformities when they violate the organization's security policy. The present document (GS ISI 002) is meant mainly to support operational security staff in their effort to qualify and categorize detected security events, and more generally all stakeholders (especially CISOs and IT security managers) in their need to establish a common language.
2 References
References are either specific (identified by date of publication and/or edition number or version number) or
non-specific. For specific references, only the cited version applies. For non-specific references, the latest version of the
referenced document (including any amendments) applies.
Referenced documents which are not found to be publicly available in the expected location might be found at
http://docbox.etsi.org/Reference.
NOTE: While all hyperlinks included in this clause were valid at the time of publication, ETSI cannot guarantee
their long term validity.
2.1 Normative references
The following referenced documents are necessary for the application of the present document.
[1] NIST SP 800-126 Revision 2 (September 2011): "The Technical Specification for the Security
Content Automation Protocol (SCAP Version 1.2)".
[2] MITRE CCE List Version 5.20120314 (March 2012): "Common Configuration Enumeration".
[3] ETSI GS ISI 001-1: "Information Security Indicators (ISI); Indicators (INC); Part 1: A full set of
operational indicators for organizations to use to benchmark their security posture".
[4] ETSI GS ISI 001-2: "Information Security Indicators (ISI); Indicators (INC); Part 2: Guide to
select operational indicators based on the full set given in part 1".
[5] ISO/IEC 27000:2012: "Information technology -- Security techniques -- Information security
management systems -- Overview and vocabulary".
2.2 Informative references
The following referenced documents are not necessary for the application of the present document but they assist the
organization with regard to a particular subject area.
[i.1] ISO 27002:2005: "Information technology - Security techniques - Code of practice for information
security management".
[i.2] ISO 27004:2009: "Information technology - Security techniques - Information security
management - Measurement".
[i.3] ISO 27005:2011: "Information technology - Security techniques - Information security risk
management".
[i.4] FIRST Classification (November 2004): "CSIRT Case Classification (Example for enterprise
CSIRT)".
[i.5] ETSI GS ISI 003: "Information Security Indicators (ISI); Indicators; Part 3: A set of Key
Performance Security Indicators (KPSI) for security event detection maturity evaluation".
[i.6] ETSI GS ISI 004: "Information Security Indicators (ISI); Event Detection; Part 4: Guidelines for
event detection implementation".
[i.7] ETSI GS ISI 005: "Information Security Indicators (ISI); Event Testing; Part 5: Event Testing".
3 Definitions and abbreviations
3.1 Definitions
For the purposes of the present document, the following terms and definitions apply (ISO 27000 [5] compliant where
applicable):
NOTE: See also the summary chart at the end of this list.
asset: information asset that has value to the organization and that can be broken down into primary assets (such as business activities, data, application software, etc., which hold the business value) and secondary/supporting assets (network or system infrastructure, which hosts primary assets)
assurance: refers to the planned and systematic activities implemented in a management system so that management
requirements for a service will be fulfilled
NOTE: It is the systematic measurement, comparison with a standard, monitoring of processes and an associated
feedback loop that confers error prevention. This can be contrasted with Management "Control", which is
focused on process outputs.
base measure: regarding the "indicator" issue, a base measure is defined in terms of an attribute and the specified
measurement method for quantifying it (e.g. number of trained personnel, number of sites, cumulative cost to date)
NOTE: As data is collected, a value is assigned to a base measure.
continuous auditing: periodic verification and collection of a series of controls identified within the Information
System, corresponding with the detection of incidents and of software, configuration, behavioural or global security
framework vulnerabilities and/or non-conformities
NOTE: There are three checking levels (in principle, a hierarchy notably implemented within banking and financial institutions):
- Detailed behavioural, global security framework or technical checking at the security software or equipment level (network, system, application software).
- Level 1 checking via monitoring of trends and deviations of a series of significant measurement points.
- Level 2 checking (verification of the existence of a satisfactory assurance and coverage level of the chosen control and measurement points, and of the implementation of regulatory requirements).
Continuous auditing can be either manual or automatic (for example, monitoring by means of tools
appropriate for a SIEM approach). Finally, continuous auditing is generally associated with statistical
indicators (levels of application and effectiveness of security controls), that provide information
regarding the coverage and assurance level of the security controls in question.
criticality level (of a security event): level defined according to criteria which measure the event's potential impact (financial or legal) on the company's assets and information, and which make it possible to evaluate the appropriate level of reaction to the event (incident treatment, or vulnerability or non-conformity removal)
NOTE: The criticality level of a given event is determined by the combination of its severity level (inherent to the event itself – see definition below) and of the sensitivity of the target attacked or concerned (linked to the asset's estimated value for the company, a value that concerns confidentiality, integrity or availability). This concept of criticality level (usually defined on a scale of four levels) is at the core of any SIEM approach, for which prioritizing the processing of security events according to organization-defined priorities is vital from both a security and an economic point of view.
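As an illustration only, the derivation of a criticality level from a severity level and a target sensitivity level can be sketched as below. The specification leaves the actual combination rule to the organization; the ceiling-average rule and the function name used here are hypothetical assumptions, not a normative mapping.

```python
# Hypothetical sketch: combine a 4-level severity scale (inherent to the
# event) with a 4-level target sensitivity scale into a criticality level.
# The ceiling-average rule below is an illustrative assumption.

def criticality(severity: int, sensitivity: int) -> int:
    """Map (severity, sensitivity) on 1..4 scales to a 1..4 criticality level."""
    if not (1 <= severity <= 4 and 1 <= sensitivity <= 4):
        raise ValueError("severity and sensitivity must be on a 1..4 scale")
    # Ceiling of the average, so a single high factor still raises priority.
    return (severity + sensitivity + 1) // 2

# A highly severe event (4) on a low-sensitivity asset (1) lands mid-scale:
print(criticality(4, 1))  # 3
```

In a real SIEM deployment the combination would instead be an organization-defined matrix, since (as noted above) the prioritization reflects organization-defined priorities.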
derived measure: regarding the "indicator" issue, a measure that is derived as a function of two or more base measures
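For instance, combining the base measure example given earlier (number of trained personnel) with an assumed second base measure (total headcount, which is not taken from the present document), a derived measure can be sketched as:

```python
# Illustrative only: a derived measure computed as a function of two base
# measures. "headcount" is an assumed base measure introduced for the example.

def trained_personnel_ratio(trained: int, headcount: int) -> float:
    """Derived measure: proportion of trained personnel (0.0 to 1.0)."""
    if headcount <= 0:
        raise ValueError("headcount must be positive")
    if not (0 <= trained <= headcount):
        raise ValueError("trained must be between 0 and headcount")
    return trained / headcount

print(trained_personnel_ratio(80, 200))  # 0.4
```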
effectiveness (of security policy or of ISMS): as a supplement to assessing the actual application of the security policy (or of the ISMS) and of its measures, it is necessary to assess its level of effectiveness, which can be estimated through the identified residual risk (corresponding to the residual vulnerabilities that are actually exploited and that have led to security incidents)
NOTE: It should be added that the term "efficiency" is sometimes also used, but generally with a different meaning of economy in the use of resources (not addressed here for reasons of lesser relevancy).
(security) incident: single unwanted or unexpected security event, or series thereof, that corresponds to the exploitation (or attempted exploitation) of an existing vulnerability, and to an actual or potential threat (attempt underway), and that has a significant probability of compromising business operations and threatening information security
NOTE: In case of success, an incident affects nominal operations of all or part of an information system
(according to the Confidentiality, Integrity and Availability criteria - English acronym CIA). An incident
that manifests itself through previously unseen phenomena, or is built as a complex combination of
elementary incidents often cannot be qualified and therefore inventoried or categorized easily; such an
incident will often be referred to as an anomaly.
indicator: measure that provides an estimate or evaluation of specified attributes derived from an analytical model with
respect to a defined information need
NOTE: Indicators are the basis for analysis and decision making.
log: continuous recording of computer data about software usage, with specific characteristics: detailed and specified structure, time-stamping, and recording in files or other media as soon as the events occur
NOTE: Logs are a kind of trace (a more general concept – see definition below).
non-conformity: security event indicating that the organization's required security rules and regulations have not been properly enforced; it is therefore the consequence of a usage or implementation drift
NOTE: Continuous monitoring of non-conformities (similar to continuous auditing – cf. this term above) makes it possible to better ensure that an organization's security policy is being enforced. Non-conformities can be further refined according to their kind: configuration, behaviour, global security (technical and organizational) and material. Non-conformities are also vulnerabilities or incidents, depending on the situation (see definitions).
periodic audit (Periodic scanning): using isolated audit means, periodic acquisition and verification of security
controls
NOTE: A periodic audit can also be either manual or automatic (for example, carried out through scanner type
tools). Finally, a periodic audit is generally Boolean (all or nothing compliance level).
risk: product of the probability of occurrence of a security incident involving a specific asset and its impact on this asset (impact assessed according to the CIA sensitivity level)
NOTE: The level of risk exposure (a concept used in risk assessment methods) corresponds to the product of the vulnerability level of the asset in question and the threat level hanging over it.
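The two products defined above can be sketched directly. The numeric scales are assumptions made for illustration, since the present document does not fix them:

```python
def risk(probability: float, impact: float) -> float:
    """Risk: probability of occurrence of an incident times its impact on
    the asset (impact assessed according to the CIA sensitivity level)."""
    return probability * impact

def risk_exposure(vulnerability_level: float, threat_level: float) -> float:
    """Level of risk exposure: vulnerability level of the asset times the
    threat level hanging over it (as used in risk assessment methods)."""
    return vulnerability_level * threat_level

# With an assumed probability of 0.2 and an impact rated 4:
print(risk(0.2, 4))  # 0.8
```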
risk not covered (by existing security measures): risk sometimes also referred to as "residual", which breaks down into 3 parts:
• Known and realized suffered risk, corresponding to the impact suffered by the organization under attack when the security policy is not applied (configuration, behavioural or global security non-conformities), and when known and critical software vulnerabilities are not appropriately addressed.
• Known and accepted risk, corresponding to a risk taken by choice by an organization, by weighing the risk associated with attacks against economic, usage and security level considerations.
• Unknown risk, associated with unknown and unpatched vulnerabilities, or with innovative attack vectors.
security event: information about a change of state in a system that may be security relevant and that indicates the
appearance of a risk for the organization
NOTE: A security event is either an incident or a vulnerability occurrence or detection (see the definitions of these terms). 500 security events have been inventoried within the industry and are grouped into major categories, the first 3 corresponding to incidents and the last ones to vulnerabilities: external attacks and intrusions, malfunctions, internal deviant behaviours, behavioural vulnerabilities, software vulnerabilities, configuration vulnerabilities, and general security (technical or organizational) vulnerabilities.
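For illustration, the major categories enumerated above could be encoded using the category codes given in clause 6 (IEX, IMF, IDB, VBH, VSW, VCF, VTC, VOR); the Python encoding itself is an assumption for the example and not part of the model:

```python
from enum import Enum

class EventCategory(Enum):
    """Major event categories; codes follow clause 6 of the present document."""
    # Incident categories
    IEX = "Intrusions and external attacks"
    IMF = "Malfunctions"
    IDB = "Deviant internal behaviours"
    # Vulnerability categories
    VBH = "Behavioural vulnerabilities"
    VSW = "Software vulnerabilities"
    VCF = "Configuration vulnerabilities"
    VTC = "General security technical vulnerabilities"
    VOR = "General security organizational vulnerabilities"

INCIDENT_CATEGORIES = frozenset({EventCategory.IEX, EventCategory.IMF, EventCategory.IDB})

def is_incident(category: EventCategory) -> bool:
    """The first 3 categories are incidents; the remainder are vulnerabilities."""
    return category in INCIDENT_CATEGORIES

print(is_incident(EventCategory.IEX))  # True
```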
severity level (of security incident): level (generally defined on a 4-element scale) inherent to the event itself and that
depends on several criteria that vary according to the types of events (in decreasing order of importance):
• Dangerousness is the result of multiple factors with variable combinations according to circumstances or types of incidents: propagation speed for a worm, virulence, effectiveness, importance and number of impacted assets, capability of harm, target reachability, capability of remote action, persistence, weakness or lack of curative means, and extent of compromise (depth of the component which can be or has been reached, the concept of Defence in Depth or DiD).
• Stealthiness covers the level to which the incident can be hidden from the defender: obvious visibility, visibility through simple and easy-to-use mechanisms, detection requiring advanced technical tools, or almost complete invisibility. It is a key factor for monitoring and detection. Anonymization and camouflage, or active and passive masking techniques, are stealthiness techniques. Stealthiness takes on an indirect meaning when it applies to similar, not yet detected incidents.
• Feasibility relates to the attacker's motivation and skills. It increases in proportion to all the necessary prerequisites (regarding skills, tools, financial means, collusion, initial access, etc.) combined with the presence of exploitable vulnerabilities; feasibility can often be tied to the frequency of attacks detected in the world. Its assessment is not simple, because it is subject to change. For example, it may be difficult to create a hacking tool for a given vulnerability; however, once the tool is released on the Internet, it can be used by unskilled attackers. Feasibility takes on an indirect meaning when it applies to a potential threat (see the definition of this term), as the analysis of the factors required to evaluate it provides an interesting evaluation of the risk.
NOTE: This notion appeared in the mid-1990s within the framework of the ITSEC certification, then towards the end of that decade with the issue of the global and public management of vulnerabilities and "malware" (security software vendors and CERTs). It is once again being developed at the present time with the recent release of log analysis and correlation tools that fully integrate this concept along with criticality.
severity level (of vulnerability or of nonconformity): the severity level definition is about the same as the one for
incidents, with a few small differences:
• Dangerousness: impact of the related attacks, weakness of protective techniques, possible remote exploitation,
scope of the target / victim population (number of machines, of services, …), importance to organization of the
security rule that was violated
• Stealthiness: same definition as for incident
• Exploitability (by attackers): the counterpart, on the vulnerability side, of incident feasibility
NOTE: The proposed definition is in line with the CVSS (NIST 800-126 [1] or SCAP) standard for software
vulnerabilities.
security policy: overall intention and requirements as formally expressed by management
NOTE: Two levels are used: general statement and detailed rules. Rules apply to network and system configuration, user interaction with systems and applications, and detailed processes and procedures (governance, operational teams, and audit). Violation of a rule brings about a non-conformity, which is either an incident or a vulnerability.
sensitivity level: level which corresponds to the potential impact (financial, legal or brand image) of a security event on
an asset, an impact linked to the estimated value of the asset for the company along four possible viewpoints: its
Confidentiality, Integrity and Availability (CIA) and sometimes its accountability
SIEM (Security Information and Event Management): SIEM solutions are a combination of the formerly disparate
product categories of SIM (security information management) and SEM (security event management). SEM deals with
real-time monitoring, correlation of events, notifications and console views. SIM provides long-term storage, analysis
and reporting of log data
NOTE: The present document extends these two notions under the generic SIEM acronym, which encompasses all the organizational, process and human aspects necessary to deploy and operate these tools, and which includes vulnerability and non-conformity management; we may refer to Cyber Defence approaches in the most complex cases.
taxonomy: science of identifying and naming species, and arranging them into a classification
NOTE: The field of taxonomy, sometimes referred to as "biological taxonomy", revolves around the description
and use of taxonomic units, known as taxa (singular taxon). A resulting taxonomy is a particular
classification ("the taxonomy of ."), arranged in a hierarchical structure or classification scheme.
threat: potential cause of an unwanted incident, which may result in harm to a system or organization
NOTE: There are 4 categories of threats:
- Natural threats:
  - Environmental causes: public service outage, fire, and other disasters.
  - System failure: physical or software computer or network breakdowns.
- Human threats:
  - Unintentional (error, carelessness, irresponsibility, unawareness, etc.): conception and design, development, operation and usage, due to chance, hasty development and deployment, tiredness, gullibility, incompetence.
  - Internal or external malice: theft, economic spying, sabotage, intrusion, fraud, etc.
The frontier between error, carelessness and malice is often fuzzy: it is always possible for an unscrupulous employee to plead error even though he has been negligent or malicious. However, the difference between unintentional and malicious actions can often be found with the following clues:
- An unintentional action is not hidden (so not stealthy), it tends to impact availability rather than confidentiality and integrity, and it has a low dangerousness and a high feasibility. The resulting severity is often low to fairly low.
- A malicious action is stealthier (notably to enable the attacker to remain anonymous and to sustain the advantages obtained for a longer period of time), with an impact on confidentiality and integrity rather than on availability, and with a high dangerousness.
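The clues above can be sketched as a rough scoring heuristic. The boolean attributes and the majority-vote rule are assumptions made purely for illustration; the present document does not define any automated classification:

```python
# Hypothetical heuristic built from the clues above: an unintentional action
# tends to be non-stealthy, to impact availability, and to have low
# dangerousness; a malicious action tends to the opposite on all three.

def likely_malicious(stealthy: bool,
                     impacts_confidentiality_or_integrity: bool,
                     high_dangerousness: bool) -> bool:
    """Return True when a majority of the clues point towards malice."""
    clues = (stealthy, impacts_confidentiality_or_integrity, high_dangerousness)
    return sum(clues) >= 2

# A stealthy, highly dangerous action hitting integrity scores as malicious:
print(likely_malicious(True, True, True))  # True
```

As the note stresses, such clues are only indicative: the frontier between error, carelessness and malice remains fuzzy, so any such score would need human qualification.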
trace: computer data that proves the existence of a business operation
NOTE: As an example, logs (see definition elsewhere) are traces, but traces are not necessarily logs.
vulnerability: undesirable state of a system whose occurrence or detection is a security event
...







