prEN 18286
Artificial intelligence - Quality management system for EU AI Act regulatory purposes
This document specifies the requirements and provides guidance for the definition, implementation, maintenance and improvement of a quality management system for organizations that provide AI systems.
This document is intended to support the organization in meeting applicable regulatory requirements.
Künstliche Intelligenz - Qualitätsmanagementsystem für EU AI Act Regulierungszwecke
Intelligence artificielle - Système de management de la qualité pour le règlement européen sur l'IA
Umetna inteligenca - Sistem vodenja kakovosti za namene regulativnih zahtev Akta EU o UI
SLOVENIAN STANDARD
1 January 2026
This Slovenian standard is identical to: prEN 18286
ICS:
03.100.70 Management systems
03.120.10 Quality management and quality assurance
35.240.01 Application of information technology in general
© Slovenian Institute for Standardization. Reproduction of this standard, in whole or in part, is not permitted.
EUROPEAN STANDARD DRAFT
NORME EUROPÉENNE
EUROPÄISCHE NORM
October 2025
ICS 03.100.70; 35.240.01
English version
Artificial intelligence - Quality management system for EU
AI Act regulatory purposes
Intelligence artificielle - Système de gestion de la qualité à des fins réglementaires dans le cadre de la loi européenne sur l'IA
Künstliche Intelligenz - Qualitätsmanagementsystem für EU AI Act Regulierungszwecke
This draft European Standard is submitted to CEN members for enquiry. It has been drawn up by the Technical Committee
CEN/CLC/JTC 21.
If this draft becomes a European Standard, CEN and CENELEC members are bound to comply with the CEN/CENELEC Internal
Regulations which stipulate the conditions for giving this European Standard the status of a national standard without any
alteration.
This draft European Standard was established by CEN and CENELEC in three official versions (English, French, German). A
version in any other language made by translation under the responsibility of a CEN and CENELEC member into its own language
and notified to the CEN-CENELEC Management Centre has the same status as the official versions.
CEN and CENELEC members are the national standards bodies and national electrotechnical committees of Austria, Belgium,
Bulgaria, Croatia, Cyprus, Czech Republic, Denmark, Estonia, Finland, France, Germany, Greece, Hungary, Iceland, Ireland, Italy,
Latvia, Lithuania, Luxembourg, Malta, Netherlands, Norway, Poland, Portugal, Republic of North Macedonia, Romania, Serbia,
Slovakia, Slovenia, Spain, Sweden, Switzerland, Türkiye and United Kingdom.
Recipients of this draft are invited to submit, with their comments, notification of any relevant patent rights of which they are aware and to provide supporting documentation.
Warning: This document is not a European Standard. It is distributed for review and comments. It is subject to change without notice and shall not be referred to as a European Standard.
Contents

European foreword
Introduction
1 Scope
2 Normative references
3 Terms and definitions
3.1 Terms relating to management systems
3.2 Terms relating to the AI Act
3.3 Terms relating to AI systems
3.4 Terms related to risk management
4 Quality management system
4.1 General
4.2 Identifying regulatory requirements
4.3 Determining the scope of the quality management system
4.4 Strategy for regulatory compliance
4.5 Documented information
5 Management responsibility
5.1 General
5.2 Quality policy
5.3 Roles, responsibility, and authorities
6 Planning
6.1 Actions to address risks related to the functioning of the quality management system
6.2 Quality objectives and planning to achieve them
7 Support
7.1 Resources
7.2 Competence
7.3 Communication
8 Product realization
8.1 Actions to address risks
8.2 Determining the stages of the life cycle
8.3 Inception, design and development
8.4 Verification and validation
8.5 Data management
8.6 Environmental sustainability
8.7 Product documentation
9 Operation and control
9.1 Deployment, operation and monitoring
9.2 Supply chain
9.3 Changes to AI systems
9.4 Post-market monitoring
9.5 Reporting serious incidents
9.6 Nonconformities
10 Performance evaluation
10.1 General
10.2 Review
10.3 Improvement
10.4 Planning of changes
Annex A (informative) Consultation with interested parties regarding fundamental rights
A.1 General
A.2 Estimating risk and impact on affected persons
Annex B (informative) Relationship between this document and other harmonized standards
B.1 Introduction
B.2 Selection of technical specifications
B.3 Harmonized standard interactions
B.4 Supporting harmonized standards
Annex C (informative) Correspondence between this document and ISO 9001:2015
Annex D (informative) Correspondence between this document and ISO/IEC 42001:2023
Annex ZA (informative) Relationship between this European Standard and the essential requirements of Regulation (EU) 2024/1689 aimed to be covered
Bibliography
European foreword
This document (prEN 18286:2025) has been prepared by Technical Committee CEN/CLC/JTC 21
“Artificial Intelligence”, the secretariat of which is held by DS.
This document is currently submitted to the CEN Enquiry.
This document has been prepared under a standardization request addressed to CEN by the European
Commission. The Standing Committee of the EFTA States subsequently approves these requests for its
Member States.
For the relationship with EU Legislation, see informative Annex ZA, which is an integral part of this
document.
Introduction
0.1 General
The EU’s Artificial Intelligence (AI) Act [1] regulates AI systems through the product safety system
established under the New Legislative Framework. An AI system subject to the EU AI Act can be a product
or a component of a product.
AI systems must comply with applicable regulatory requirements at the moment they are placed on the market or put into service. An AI system is placed on the market when it is supplied for distribution or use on the Union market in the course of a commercial activity, whether in return for payment or free of charge. An AI system is put into service when it is supplied for first use directly to the deployer, or for own use, in the Union for its intended purpose.
EXAMPLE 1 An in-house developed AI system is deployed for internal use.
EXAMPLE 2 An AI system is placed on the market when it is offered for sale on a website.
A quality management system, while implemented by a provider, can be directly associated with one or more AI systems that are intended to be put into service or placed on the market. Quality, in this context, can be understood as compliance with all of the regulatory requirements of the EU AI Act that apply to providers.
Depending on the context of the AI system, the provider can be required to show conformity with industry-specific quality management system requirements under sector-specific legislation. This document does not require the provider to maintain a separate quality management system; it can be used as a complement to existing requirements, depending on the applicable regulatory requirements for the AI system.
EXAMPLE 3 Medical devices are commonly compliant with ISO 13485 quality management system requirements.
Incorporating the requirements of this document into existing processes is desirable when seeking conformity with this document.
This document specifies requirements for a quality management system that complies with applicable
regulatory requirements (as described in Annex ZA) throughout the entire life cycle of the AI system. These
requirements apply to a broad range of AI systems, and include explicit requirements to address risks to
health, safety and fundamental rights which can arise.
This document is intended for use by providers of AI systems irrespective of size, nature or location. The requirements and guidance in this document are, however, specifically tailored to support providers that operate within the European Union, as well as providers located outside the Union that are active in the European Union market or intend to enter that market.
The quality management system in this document is described in such a way that its implementation can take into account the size of the provider, while providing the degree of rigour and level of protection required by applicable regulatory requirements.
Annex A describes procedures for consultation with interested parties about fundamental rights, Annex B
describes the relationship of this document with other harmonized standards, Annex C contains the
correspondence between the clauses of this document and ISO 9001:2015 [2], and Annex D contains the
correspondence with ISO/IEC 42001:2023 [3].
0.2 Fundamental rights
Fundamental rights are universal legal guarantees without which individuals and groups cannot secure
their fundamental freedoms and human dignity and which apply equally to every human being regardless
of nationality, place of residence, sex, national or ethnic origin, colour, religion, language or any other
status as per the legal system of a country without any conditions.
The EU Charter of Fundamental Rights [4] describes the European view on these fundamental rights.
Further information about their scope and strength can be found in the Charter and in prEN 18228:—
[5], Annex F. Additional EU and country-specific regulatory requirements can apply.
Under preparation. Stage at time of publication: Working Draft.
1 Scope
This document specifies the requirements and provides guidance for the definition, implementation,
maintenance and improvement of a quality management system for organizations that provide AI
systems.
This document is intended to support the organization in meeting applicable regulatory requirements.
2 Normative references
There are no normative references in this document.
3 Terms and definitions
For the purposes of this document, the following terms and definitions apply.
ISO and IEC maintain terminology databases for use in standardization at the following addresses:
— ISO Online browsing platform: available at https://www.iso.org/obp/
— IEC Electropedia: available at https://www.electropedia.org/
3.1 Terms relating to management systems
3.1.1
audit
systematic and independent process (3.1.18) for obtaining evidence and evaluating it objectively to
determine the extent to which the audit criteria are fulfilled
Note 1 to entry: An audit can be an internal audit (first party) or an external audit (second party or third party), and
it can be a combined audit (combining two or more disciplines).
Note 2 to entry: An internal audit is conducted by the organization (3.1.15) itself, or by an external party on its
behalf.
Note 3 to entry: “Audit criteria” is defined in ISO 19011.
[SOURCE: ISO/IEC Directives Part 1, Consolidated ISO Supplement, Annex SL Appendix 2 (rev 4 2024),
3.3, modified – reference to evidence removed from Note 3]
3.1.2
competence
ability to apply knowledge and skills to achieve intended results
[SOURCE: ISO/IEC Directives Part 1, Consolidated ISO Supplement, Annex SL Appendix 2 (rev 4 2024),
3.9]
3.1.3
complaint
statement claiming that an AI system (3.2.1) does not conform to applicable regulatory requirements or
has caused or is causing harm (3.4.3)
3.1.4
conformity
fulfilment of a requirement (3.1.21)
[SOURCE: ISO/IEC Directives Part 1, Consolidated ISO Supplement, Annex SL Appendix 2 (rev 4 2024),
3.15]
3.1.5
continual improvement
recurring activity to enhance performance (3.1.23)
[SOURCE: ISO/IEC Directives Part 1, Consolidated ISO Supplement, Annex SL Appendix 2 (rev 4 2024),
3.12]
3.1.6
corrective action
action to eliminate the cause(s) of a nonconformity (3.1.14) and to prevent recurrence
[SOURCE: ISO/IEC Directives Part 1, Consolidated ISO Supplement, Annex SL Appendix 2 (rev 4 2024),
3.17]
3.1.7
documented information
information controlled and maintained by an organization (3.1.15) and the medium on which it is
contained
Note 1 to entry: Documented information can be in any format and media and from any source.
Note 2 to entry: Documented information can refer to:
a) the management system, including related processes (3.1.18);
b) information created in order for the organization to operate (documentation);
c) evidence of compliance and results achieved.
[SOURCE: ISO/IEC Directives Part 1, Consolidated ISO Supplement, Annex SL Appendix 2 (rev 4 2024),
3.10, modified – removed “required to be” from the definition, note 2 added reference to compliance and
removed reference to records]
3.1.8
effectiveness
extent to which planned activities are realized and planned results are achieved
[SOURCE: ISO/IEC Directives Part 1, Annex SL Appendix 2 (rev 4 2024), 3.13]
3.1.9
harmonized standard
European standard adopted on the basis of a request made by the Commission for the application of
Union harmonization legislation
Note 1 to entry: A European standardization organization can develop a new standard, adopt an existing
international standard or adapt an existing international standard.
[SOURCE: Regulation (EU) 1025/2012, Article 2(1)(c), modified – removed “a”, added Note 1 to entry]
3.1.10
interested party
stakeholder
individual, group or organization (3.1.15) that can affect, be affected by or perceive itself to be affected
by a decision or activity
Note 1 to entry: Affected persons (3.4.1) are a subset of interested parties.
Note 2 to entry: Interested party includes relevant regulatory bodies, national public bodies, bodies that enforce the
protection of fundamental rights, and market surveillance authorities.
[SOURCE: ISO/IEC Directives Part 1, Consolidated ISO Supplement, Annex SL Appendix 2 (rev 4 2024), 3.2, modified – “person” changed to “individual”, “group” added to the definition, notes added]
3.1.11
life cycle
evolution of a system, product, service, project or other human-made entity, from inception through
retirement
[SOURCE: ISO/IEC/IEEE 15288:2023, 4.1.23, modified – “conception” changed to “inception”]
3.1.12
measurement
process (3.1.18) to determine a value
[SOURCE: ISO/IEC Directives Part 1, Annex SL Appendix 2 (rev 4 2024), 3.3]
3.1.13
monitoring
repeatedly determining the status of a system, a process (3.1.18) or an activity using inputs including
measurements (3.1.12)
Note 1 to entry: To determine the status, there can be a need to check, supervise or critically observe.
[SOURCE: ISO/IEC Directives Part 1, Consolidated ISO Supplement, Annex SL Appendix 2 (rev 4 2024),
3.3, modified – added to definition “repeatedly” and “using inputs including measurements”]
3.1.14
nonconformity
non-fulfilment of a requirement (3.1.21)
[SOURCE: ISO/IEC Directives Part 1, Consolidated ISO Supplement, Annex SL Appendix 2 (rev 4 2024),
3.16, modified – note 1 added]
3.1.15
organization
person or group of people that has its own functions with responsibilities, authorities and relationships
to achieve its objectives
Note 1 to entry: The concept of organization includes, but is not limited to, sole-trader, company, corporation, firm,
enterprise, authority, partnership, charity or institution, or part or combination thereof, whether incorporated or
not, public or private.
Note 2 to entry: If the organization is part of a larger entity, the term “organization” refers only to the part of the
larger entity that is within the scope of the quality management system.
[SOURCE: ISO/IEC Directives Part 1, Annex SL Appendix 2 (rev 4 2024), 3.3, modified – note 3 added]
3.1.16
policy
intentions and direction of an organization (3.1.15) as formally expressed by its top management (3.1.22)
[SOURCE: ISO/IEC Directives Part 1, Consolidated ISO Supplement, Annex SL Appendix 2 (rev 4 2024),
3.1]
3.1.17
procedure
specified way to carry out an activity or a process (3.1.18)
Note 1 to entry: Procedures can be documented or not.
[SOURCE: ISO 9000:2015, 3.4.5]
3.1.18
process
set of interrelated or interacting activities that uses or transforms inputs to deliver a result
Note 1 to entry: Whether the result of a process is called an output, a product or a service depends on the context
of the reference.
Note 2 to entry: Process may achieve an immediate result but can also consist of information-sharing or other
activities.
[SOURCE: ISO/IEC Directives Part 1, Consolidated ISO Supplement, Annex SL Appendix 2 (rev 4 2024),
3.8, modified – note 2 added]
3.1.19
quality
set of characteristics of an object (3.1.30) that fulfils regulatory requirements (3.1.24)
Note 1 to entry: Quality includes the protection required by applicable regulatory requirements aimed at ensuring and maintaining the protection of health, safety and fundamental rights (3.4.2).
Note 2 to entry: In the context of this document, quality pertains to regulatory compliance with the EU AI Act. It differs from the concept of quality in ISO 9001, which includes the expectations of customers.
[SOURCE: ISO 9000:2015, 3.6.2, modified – a note, “degree of” and “inherent” removed; “regulatory” added; notes added]
3.1.20
quality policy
policy (3.1.16) related to quality (3.1.19)
[SOURCE: ISO 9000:2015, 3.5.9, modified – notes to entry removed]
3.1.21
requirement
need or expectation that is stated and obligatory
Note 1 to entry: A specified requirement is one that is stated, e.g. in a document.
[SOURCE: ISO/IEC Directives Part 1, Consolidated ISO Supplement, Annex SL Appendix 2 (rev 4 2024),
3.14, modified – removed “generally implied” from the definition]
3.1.22
top management
person or group of people who directs and controls an organization (3.1.15) at the highest level
Note 1 to entry: Top management has the power to delegate authority and provide resources within the
organization.
Note 2 to entry: If the scope of the management system covers only part of an organization, then top management
refers to those who direct and control that part of the organization.
Note 3 to entry: In different organizational contexts, top management can be referred to with different terms.
Note 4 to entry: Where the organization is a person, top management is that person.
[SOURCE: ISO/IEC Directives Part 1, Consolidated ISO Supplement, Annex SL Appendix 2 (rev 4 2024),
3.3, modified – note 3 and 4 added]
3.1.23
performance
measurable result
Note 1 to entry: Performance can relate either to quantitative or qualitative findings.
Note 2 to entry: Performance can relate to the management of activities, processes (3.1.18), products, services,
systems or organizations (3.1.15).
[SOURCE: ISO 9000:2015, 3.7.8]
3.1.24
regulatory requirement
requirement (3.1.21) that is necessary to be met for the purposes of complying with the content of
applicable regulation
Note 1 to entry: Applicable regulation includes at least Regulation (EU) 2024/1689 (AI Act) [1].
3.1.25
quality objective
measurable goal established to ensure that regulatory requirements (3.1.24) are consistently met
throughout the life cycle (3.1.11)
Note 1 to entry: In the context of quality (3.1.19) management systems, quality objectives are set by the provider
(3.2.5), consistent with the quality policy (3.1.20), to achieve specific results.
3.1.26
systematic
pursuing defined objective(s) in a planned, step-by-step manner
[SOURCE: ISO/TR 18307:2001, 3.140]
3.1.27
scope
set of AI system(s) (3.2.1) that are covered under a quality (3.1.19) management system and the boundaries that define where the quality management system applies
EXAMPLE Boundaries can include physical (geographic), organizational, functional, process, product, service
and interface boundaries.
3.1.28
verification
confirmation, through the provision of objective evidence, that specified requirements (3.1.21) have been
fulfilled
Note 1 to entry: The objective evidence needed for a verification can be the result of an inspection or of other forms
of determination such as performing alternative calculations or reviewing documents.
Note 2 to entry: The word “verified” is used to designate the corresponding status.
Note 3 to entry: Verification can rely on testing activities and results to provide objective evidence.
Note 4 to entry: Verification activities pertaining to the identification, analysis, evaluation and control of risks
arising from fundamental rights hazard can include:
— consultation with potentially affected stakeholders (or their proxies, including civil society organizations)
identified through stakeholder mapping, taking due account of local variation across the region and the domains in
which the AI system is intended to operate;
— real-world conditions testing to evaluate the effectiveness of risk controls, conducted in accordance with
applicable ethical and regulatory requirements;
— review and evaluation by a cross-functional team of independent experts with appropriate knowledge, skill,
experience and professional expertise;
— consultation with national, European or international bodies which supervise or enforce the respect of obligations under Union law protecting fundamental rights.
[SOURCE: prEN 18228:—, 3.30]
3.1.29
validation
verification (3.1.28) where the specified requirements (3.1.21) are adequate for an intended purpose
(3.2.3)
EXAMPLE A measurement procedure, ordinarily used for the measurement of mass concentration of nitrogen
in water, can be validated also for measurement of mass concentration of nitrogen in human serum.
Note 1 to entry: The concept of validation as a procedure is not directly related to validation datasets used in
machine learning.
[SOURCE: ISO/IEC Guide 99:2007, 2.45, modified — Note 1 to entry added]
3.1.30
object
object of conformity assessment
entity to which specified requirements (3.1.21) apply
EXAMPLE Product, process, service, system, installation, project, data, design, material, claim, person, body or
organization, or any combination thereof.
[SOURCE: ISO/IEC 17000:2020, modified – preferred and admitted terms switched]
3.2 Terms relating to the AI Act
3.2.1
AI system
machine-based system that is designed to operate with varying levels of autonomy and that can exhibit
adaptiveness after deployment, and that, for explicit or implicit objectives, infers, from the input it
receives, how to generate outputs such as predictions, content, recommendations, or decisions that can
influence physical or virtual environments
Note 1 to entry: The verb “can” represents a possibility; not all AI systems that fit the above definition have the ability to adapt after deployment.
[SOURCE: EU AI Act 2024/1689, (Article 3(1)), modified – “a” removed, “may” replaced with “can” in accordance with the verbal forms used in standards. Note 1 to entry added.]
3.2.2
deployer
natural or legal person, public authority, agency or other body using an AI system (3.2.1) under its
authority except where the AI system is used in the course of a personal non-professional activity
[SOURCE: EU AI Act 2024/1689, (Article 3(4)), modified – “a” removed]
3.2.3
intended purpose
intended use
use for which an AI system (3.2.1) is intended by the organization (3.1.15), including the specific context
and conditions of use, as specified in the information supplied by the organization in the instructions for
use, promotional or sales materials and statements, as well as in the technical documentation
[SOURCE: EU AI Act 2024/1689, (Article 3(12)), modified – “a” removed]
3.2.4
performance
ability of an AI system (3.2.1) to achieve its intended purpose (3.2.3)
[SOURCE: EU AI Act 2024/1689, (Article 3(18)), modified – “the” removed]
3.2.5
provider
natural or legal person, public authority, agency or other body that develops an AI system (3.2.1) or a
general-purpose AI model or that has an AI system or a general-purpose AI model developed and places
it on the market or puts the AI system into service under its own name or trademark, whether for payment
or free of charge
Note 1 to entry: A distributor, importer, deployer or other third party can be considered a provider of an AI system
in certain circumstances.
[SOURCE: EU AI Act 2024/1689, (Article 3(3)), modified — “a” removed]
3.2.6
reasonably foreseeable misuse
use of an AI system (3.2.1) in a way that is not in accordance with its intended purpose (3.2.3), but which
can result from reasonably foreseeable human behaviour or interaction with other systems, including
other AI systems
[SOURCE: EU AI Act 2024/1689, (Article 3(13)), modified – “a” removed, “may” replaced with “can” in accordance with the verbal forms used in standards]
3.2.7
substantial modification
change to an AI system (3.2.1) after its placing on the market or putting into service which is not foreseen
or planned in the initial conformity (3.1.4) assessment carried out by the provider (3.2.5) and as a result
of which the compliance of the AI system with the applicable regulatory requirements (3.1.24) is affected
or results in a modification to the intended purpose (3.2.3) for which the AI system has been assessed
Note 1 to entry: Rephrased from the AI Act Article (3)(23) to reference applicable regulatory requirements instead
of a specific reference to the AI Act Chapter III, Section 2.
[SOURCE: EU AI Act 2024/1689, (Article 3(23)), modified as described in Note 1 to entry]
3.2.8
essential requirement
definition of the results to be attained, or the hazards (3.4.4) to be dealt with, without specifying the
technical solutions for doing so
Note 1 to entry: Essential requirements in relation to the EU AI Act [1] are specified in Chapter III, Section 2 of that
Regulation.
[SOURCE: The ‘Blue Guide’ on the implementation of EU product rules 2022 (2022/C 247/01), modified – “but do not specify” changed to “without specifying”]
3.2.9
serious incident
incident or malfunctioning of an AI system (3.2.1) that directly or indirectly leads to any of the following:
a) the death of a person, or serious harm (3.4.3) to a person’s health;
b) a serious and irreversible disruption of the management or operation of critical infrastructure;
c) the infringement of obligations under applicable regulatory requirements (3.1.24) intended to protect fundamental rights (3.4.2);
d) serious harm (3.4.3) to property or the environment.
[SOURCE: EU AI Act 2024/1689, (Article 3(49))]
3.3 Terms relating to AI systems
3.3.1
inference
reasoning by which conclusions are derived from known premises
Note 1 to entry: In AI, a premise is either a fact, a rule, a model, a feature or raw data.
Note 2 to entry: The term “inference” refers both to the process and its result.
[SOURCE: ISO/IEC 22989:2022, 3.1.17]
3.3.2
machine learning algorithm
algorithm to determine parameters of a machine learning model (3.3.3) from data according to given
criteria
EXAMPLE Consider a univariate linear function y = θ0 + θ1x, where y is an output or result, x is an input, θ0 is an intercept (the value of y where x = 0) and θ1 is a weight. In machine learning, the process of determining the intercept and weights of a linear function is known as linear regression.
[SOURCE: ISO/IEC 22989:2022, 3.3.6]
3.3.3
machine learning model
mathematical construct that generates an inference (3.3.1) or prediction (3.3.4) based on input data or
information
EXAMPLE If a univariate linear function (y = θ0 + θ1x) has been trained using linear regression, the resulting model can be y = 3 + 7x.
Note 1 to entry: A machine learning model results from training based on a machine learning algorithm (3.3.2).
[SOURCE: ISO/IEC 22989:2022, 3.3.7]
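As an illustrative sketch only (not part of the standard's text), the EXAMPLE above can be reproduced in a few lines of Python: ordinary least squares determines the intercept θ0 and weight θ1 of the univariate linear function from training data, which is the machine learning algorithm (3.3.2) known as linear regression, and the fitted parameters constitute the machine learning model (3.3.3).

```python
# Illustrative sketch: determining theta0 (intercept) and theta1 (weight) of
# y = theta0 + theta1 * x by ordinary least squares (linear regression).
# Function name and data are chosen for illustration only.

def linear_regression(xs, ys):
    """Return (theta0, theta1) minimizing squared error for y = theta0 + theta1*x."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # theta1 = covariance(x, y) / variance(x); theta0 follows from the means.
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    theta1 = cov / var
    theta0 = mean_y - theta1 * mean_x
    return theta0, theta1

# Training data generated from the model y = 3 + 7x used in the EXAMPLE above.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [3.0 + 7.0 * x for x in xs]
theta0, theta1 = linear_regression(xs, ys)
# theta0 → 3.0, theta1 → 7.0: the trained model is y = 3 + 7x
```

On noise-free data the algorithm recovers the generating parameters exactly, which is why the resulting model matches the EXAMPLE's y = 3 + 7x.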
3.3.4
prediction
primary output of an AI system (3.2.1) when provided with input data or information
Note 1 to entry: Predictions can be followed by additional outputs, such as recommendations, decisions and actions.
Note 2 to entry: Prediction does not necessarily refer to predicting something in the future.
Note 3 to entry: Predictions can refer to various kinds of data analysis or production applied to new data or
historical data (including translating text, creating synthetic images or diagnosing a previous power failure).
[SOURCE: ISO/IEC 22989:2022, 3.1.27]
3.3.5
training
model training
process (3.1.18) to determine or to improve the parameters of a machine learning model (3.3.3), based on
a machine learning algorithm (3.3.2), by using training data
[SOURCE: ISO/IEC 22989:2022, 3.3.15]
3.3.6
traceability
ability to trace the history of the AI system (3.2.1)
Note 1 to entry: Traceability includes information on how AI systems have been specified, developed, verified,
validated, operated, monitored and retired.
3.3.7
AI system requirements
functional and non-functional requirements (3.1.21) derived from regulatory requirements (3.1.24)
3.3.8
test procedure
sequence of test cases in execution order, with any associated actions required to set up preconditions and to perform wrap-up activities after execution
[SOURCE: ISO/IEC 29119-1:2022, 3.1.20]
3.4 Terms related to risk management
3.4.1
affected person
individual, or group of individuals, who is or are directly or indirectly impacted by an AI system (3.2.1)
when used in accordance with its intended purpose (3.2.3) or in the frame of reasonably foreseeable misuse
(3.2.6)
[SOURCE: prEN 18228:—, 3.41]
3.4.2
fundamental rights
basic right(s) and freedom(s) held by every human being irrespective of birth, religion, belief, age, race,
ethnicity, sex, gender or any other status
Note 1 to entry: For the purposes of this document, fundamental rights and their applicability are those protected by EU law, including the Charter of Fundamental Rights of the EU (EU Charter) and the European Convention on Human Rights.
Note 2 to entry: prEN 18228, Annex D and Annex F provide information about other sources of applicable law
governing fundamental rights.
[SOURCE: prEN 18228:—, 3.36]
3.4.3
harm
injury or damage to health or interference with the fundamental rights (3.4.2) of a person or group of
persons, or damage to property or the environment
Note 1 to entry: Harm can be material or immaterial, including physical, psychological, societal or economic harm.
[SOURCE: prEN 18228:—, 3.3, modified to add note]
3.4.4
hazard
potential source of harm (3.4.3)
[SOURCE: ISO/IEC Guide 51:2014, 3.2]
3.4.5
risk
combination of the probability of an occurrence of harm (3.4.3) and the severity (3.4.8) of that event
Note 1 to entry: The probability of occurrence includes the exposure to a hazardous situation and the possibility to
avoid or limit the harm.
Note 2 to entry: Risk includes harm to the health and safety of, and interference with the fundamental rights (3.4.2) of, persons directly or indirectly impacted by hazardous situations created where an AI system (3.2.1) is involved.
[SOURCE: prEN 18228:—, 3.19, modified to remove note 3]
3.4.6
risk control
process (3.1.18) in which decisions are made and measures implemented by which risks (3.4.5) are
reduced to, or maintained within, specified levels
[SOURCE: ISO/IEC Guide 63:2019, 3.12]
3.4.7
risk management
systematic (3.1.26) and continuous application of management policies (3.1.16), procedures (3.1.17) and
practices to the tasks of analysing, evaluating, controlling and monitoring (3.1.13) risk (3.4.5) throughout
the entire life cycle (3.1.11) of an AI system (3.2.1)
[SOURCE: prEN 18228:—, 3.25]
3.4.8
severity
measure of the possible consequences of harm (3.4.3)
Note 1 to entry: The definition does not imply a numerical measure of severity.
Note 2 to entry: For any risk (3.4.5) to fundamental rights (3.4.2), the severity of risk includes consideration of the
nature of the harm (3.4.3), the strength of the harm, the significance and scale of the harm in terms of the number
of individuals and groups of individuals whose rights are placed at risk, the irremediability of the harm and whether
the rights at risk are those of persons under the age of 18 and others who are disproportionately at risk from the
use of the system.
[SOURCE: prEN 18228:—, 3.27]
4 Quality management system
4.1 General
The provider shall establish, maintain and continually improve the quality management system in
accordance with the requirements of this document, and in order to protect health, safety and
fundamental rights.
The provider shall establish, document, implement and maintain any process, procedure and activity
necessary to maintain the quality management system and its effectiveness in meeting applicable
regulatory requirements throughout the applicable stages of the life cycle.
4.2 Identifying regulatory requirements
The provider shall determine and systematically review the regulatory requirements with which its AI systems
must comply, at any point of their life cycle.
NOTE This includes at least the essential requirements, as explained in 4.4.2.
The regulatory requirements identified shall be integrated into the strategy for regulatory compliance
referred to in 4.4.
4.3 Determining the scope of the quality management system
The provider shall determine the scope of the quality management system, by:
a) determining the set of AI system(s) that are covered under the quality management system;
b) defining the boundaries, taking into account:
1) the regulatory requirements referred to in 4.2;
2) the intended purpose of the AI system(s).
4.4 Strategy for regulatory compliance
4.4.1 Determining the strategy
The provider shall determine a strategy for compliance with regulatory requirements, including at least
the following elements:
a) compliance with the regulatory requirements for this quality management system, in accordance with
this document;
b) compliance with essential requirements (see 4.4.2);
c) compliance with the regulatory requirements for post-market monitoring, in accordance with 9.4;
d) compliance with the regulatory requirements in relation to serious incidents, in accordance with 9.5;
e) the strategy for data management, in accordance with 8.5.
The strategy shall be available as documented information, in accordance with 4.5.
4.4.2 Essential requirements
Applicable Union harmonization legislation defines the essential requirements of products. They are
written in a way that supports conformity assessment, even in the absence of harmonized standards.
The essential requirements are those for:
a) the risk management system;
b) data and data governance;
c) technical documentation;
d) record-keeping;
e) transparency and provision of information to deployers;
f) human oversight;
g) accuracy, robustness and cybersecurity.
NOTE These essential requirements are found in Chapter III, Section 2 of the AI Act [1].
4.4.3 Selecting and documenting measures to demonstrate compliance
4.4.3.1 Selecting approaches
When demonstrating compliance, the provider shall select one of, or a combination of, the following
approaches that provide compliance with each applicable essential requirement:
a) harmonized standards (see Annex B) that have been cited in the Official Journal;
b) common specifications that have been adopted in an implementing act;
c) other standards;
EXAMPLE 1 EN 62586-2 [14] specifies what level of aggregated data to log for each type of event versus
raw input data measurements.
d) other technical specifications or solutions.
EXAMPLE 2 A provider that produces AI-enabled fall detectors uses a technical solution developed
internally (or one recommended by an industry body) based on an in-depth risk assessment of that use case,
and that technical approach provides specific requirements for conformity with the regulatory requirements.
4.4.3.2 Selecting measures
...