ISO/IEC/IEEE 29119-4:2015
Software and systems engineering - Software testing - Part 4: Test techniques
ISO/IEC/IEEE 29119-4:2015 defines test design techniques that can be used during the test design and implementation process that is defined in ISO/IEC/IEEE 29119‑2. ISO/IEC/IEEE 29119-4:2015 is intended for, but not limited to, testers, test managers, and developers, particularly those responsible for managing and implementing software testing.
ISO/IEC/IEEE 29119-4:2015 is classified under the following ICS (International Classification for Standards) categories: 35.080 - Software. The ICS classification helps identify the subject area and facilitates finding related standards.
ISO/IEC/IEEE 29119-4:2015 has the following relationship with other standards: it links to ISO/IEC/IEEE 29119-4:2021, a later edition. Understanding these relationships helps ensure you are using the most current and applicable version of the standard.
Standards Content (Sample)
DRAFT INTERNATIONAL STANDARD
ISO/IEC/IEEE DIS 29119-4
Attributed to ISO/IEC JTC 1 by the Central Secretariat (see page iii)
ISO/IEC voting begins on: 2013-07-16
ISO/IEC voting terminates on: 2013-10-16
Software & Systems Engineering Standards Committee
of the IEEE Computer Society
INTERNATIONAL ORGANIZATION FOR STANDARDIZATION • МЕЖДУНАРОДНАЯ ОРГАНИЗАЦИЯ ПО СТАНДАРТИЗАЦИИ • ORGANISATION INTERNATIONALE DE NORMALISATION
INTERNATIONAL ELECTROTECHNICAL COMMISSION • МЕЖДУНАРОДНАЯ ЭЛЕКТРОТЕХНИЧЕСКАЯ КОМИССИЯ • COMMISSION ÉLECTROTECHNIQUE INTERNATIONALE
Software and systems engineering — Software testing —
Part 4:
Test techniques
Ingénierie du logiciel et des systèmes — Essais du logiciel —
Partie 4: Techniques des essais
ICS 35.080
This document was developed under the Partner Standards Development Organization
cooperation agreement between ISO and IEEE, as approved by Council Resolution 49/2007,
and is submitted to a parallel enquiry vote by the ISO/IEC national bodies and IEEE.
In accordance with the provisions of Council Resolution 21/1986 this document is circulated
in the English language only.
THIS DOCUMENT IS A DRAFT CIRCULATED FOR COMMENT AND APPROVAL. IT IS THEREFORE SUBJECT TO CHANGE AND MAY NOT BE REFERRED TO
AS AN INTERNATIONAL STANDARD UNTIL PUBLISHED AS SUCH. BECAUSE IT IS AN UNAPPROVED DRAFT, THIS DOCUMENT SHALL NOT BE USED FOR
ANY CONFORMANCE/COMPLIANCE PURPOSES.
IN ADDITION TO THEIR EVALUATION AS BEING ACCEPTABLE FOR INDUSTRIAL, TECHNOLOGICAL, COMMERCIAL AND USER PURPOSES, DRAFT
INTERNATIONAL STANDARDS MAY ON OCCASION HAVE TO BE CONSIDERED IN THE LIGHT OF THEIR POTENTIAL TO BECOME STANDARDS TO WHICH
REFERENCE MAY BE MADE IN NATIONAL REGULATIONS.
RECIPIENTS OF THIS DRAFT ARE INVITED TO SUBMIT, WITH THEIR COMMENTS, NOTIFICATION OF ANY RELEVANT PATENT RIGHTS OF WHICH THEY
ARE AWARE AND TO PROVIDE SUPPORTING DOCUMENTATION.
© International Organization for Standardization, 2013
© International Electrotechnical Commission, 2013
© IEEE 2013
ISO/IEC/IEEE DIS 29119-4:2013
Copyright notice
This ISO/IEC/IEEE document is a Draft International Standard and is copyright-protected by ISO, IEC and
IEEE. Except as permitted under the applicable laws of the user’s country, neither this ISO/IEC/IEEE draft
nor any extract from it may be reproduced, stored in a retrieval system or transmitted in any form or by any
means, electronic, photocopying, recording or otherwise, without prior written permission being secured.
Requests for permission to reproduce should be addressed to either ISO or IEC at the addresses below or
their member bodies in the country of the requester. In the United States, such requests should be sent to
IEEE.
ISO copyright office: Case postale 56, CH-1211 Geneva 20, Switzerland; Tel. +41 22 749 01 11; Fax +41 22 749 09 47; E-mail copyright@iso.org; Web www.iso.org
IEC Central Office: 3, rue de Varembé, CH-1211 Geneva 20, Switzerland; E-mail inmail@iec.ch; Web www.iec.ch
Institute of Electrical and Electronics Engineers, Inc.: 3 Park Avenue, New York, NY 10016-5997, USA; E-mail stds.ipr@ieee.org; Web www.ieee.org
Reproduction may be subject to royalty payments or a licensing agreement.
Violators may be prosecuted.
© ISO/IEC 2013 – All rights reserved
ii © IEEE 2013 – All rights reserved
Foreword
ISO (the International Organization for Standardization) and IEC (the International Electrotechnical
Commission) form the specialized system for worldwide standardization. National bodies that are members of
ISO or IEC participate in the development of International Standards through technical committees established
by the respective organization to deal with particular fields of technical activity. ISO and IEC technical
committees collaborate in fields of mutual interest. Other international organizations, governmental and non-
governmental, in liaison with ISO and IEC, also take part in the work. In the field of information technology, ISO
and IEC have established a joint technical committee, ISO/IEC JTC 1.
IEEE Standards documents are developed within the IEEE Societies and the Standards Coordinating
Committees of the IEEE Standards Association (IEEE-SA) Standards Board. The IEEE develops its standards
through a consensus development process, approved by the American National Standards Institute, which
brings together volunteers representing varied viewpoints and interests to achieve the final product. Volunteers
are not necessarily members of the Institute and serve without compensation. While the IEEE administers the
process and establishes rules to promote fairness in the consensus development process, the IEEE does not
independently evaluate, test, or verify the accuracy of any of the information contained in its standards.
International Standards are drafted in accordance with the rules given in the ISO/IEC Directives, Part 2.
The main task of ISO/IEC JTC 1 is to prepare International Standards. Draft International Standards adopted
by the joint technical committee are circulated to national bodies for voting. Publication as an International
Standard requires approval by at least 75 % of the national bodies casting a vote.
Attention is called to the possibility that implementation of this standard may require the use of subject matter
covered by patent rights. By publication of this standard, no position is taken with respect to the existence or
validity of any patent rights in connection therewith. ISO/IEEE is not responsible for identifying essential
patents or patent claims for which a license may be required, for conducting inquiries into the legal validity or
scope of patents or patent claims or determining whether any licensing terms or conditions provided in
connection with submission of a Letter of Assurance or a Patent Statement and Licensing Declaration Form, if
any, or in any licensing agreements are reasonable or non-discriminatory. Users of this standard are expressly
advised that determination of the validity of any patent rights, and the risk of infringement of such rights, is
entirely their own responsibility. Further information may be obtained from ISO or the IEEE Standards
Association.
ISO/IEC/IEEE 29119-4 was prepared by Joint Technical Committee ISO/IEC JTC 1, Information technology,
Subcommittee SC 7, Software and systems engineering, in cooperation with the Software & Systems
Engineering Standards Committee of the IEEE Computer Society, under the Partner Standards Development
Organization cooperation agreement between ISO and IEEE.
ISO/IEC/IEEE 29119 consists of the following parts, under the general title Software and systems
engineering — Software testing:
— Part 2: Test process
— Part 3: Test documentation
The following parts are under preparation:
— Part 1: Concepts and definitions
— Part 4: Test techniques
IEEE Notice to Users
Notice and Disclaimer of Liability Concerning the Use of IEEE Documents: IEEE Standards documents are developed within the IEEE
Societies and the Standards Coordinating Committees of the IEEE Standards Association (IEEE-SA) Standards Board. IEEE develops its
standards through a consensus development process, approved by the American National Standards Institute, which brings together volunteers
representing varied viewpoints and interests to achieve the final product. Volunteers are not necessarily members of the Institute and serve
without compensation. While IEEE administers the process and establishes rules to promote fairness in the consensus development process,
IEEE does not independently evaluate, test, or verify the accuracy of any of the information or the soundness of any judgments contained in its
standards.
Use of an IEEE Standard is wholly voluntary. IEEE disclaims liability for any personal injury, property or other damage, of any nature
whatsoever, whether special, indirect, consequential, or compensatory, directly or indirectly resulting from the publication, use of, or reliance
upon any IEEE Standard document.
IEEE does not warrant or represent the accuracy or content of the material contained in its standards, and expressly disclaims any express or
implied warranty, including any implied warranty of merchantability or fitness for a specific purpose, or that the use of the material contained
in its standards is free from patent infringement. IEEE Standards documents are supplied "AS IS."
The existence of an IEEE Standard does not imply that there are no other ways to produce, test, measure, purchase, market, or provide other
goods and services related to the scope of the IEEE standard. Furthermore, the viewpoint expressed at the time a standard is approved and
issued is subject to change brought about through developments in the state of the art and comments received from users of the standard. Every
IEEE standard is subjected to review at least every ten years. When a document is more than ten years old and has not undergone a revision
process, it is reasonable to conclude that its contents, although still of some value, do not wholly reflect the present state of the art. Users are
cautioned to check to determine that they have the latest edition of any IEEE standard.
In publishing and making its standards available, IEEE is not suggesting or rendering professional or other services for, or on behalf of, any
person or entity. Nor is IEEE undertaking to perform any duty owed by any other person or entity to another. Any person utilizing any IEEE
Standards document, should rely upon his or her own independent judgment in the exercise of reasonable care in any given circumstances or,
as appropriate, seek the advice of a competent professional in determining the appropriateness of a given IEEE standard.
Translations: The IEEE consensus development process involves the review of documents in English only. In the event that an IEEE standard
is translated, only the English version published by IEEE should be considered the approved IEEE standard.
Official Statements: A statement, written or oral, that is not processed in accordance with the IEEE-SA Standards Board Operations Manual
shall not be considered the official position of IEEE or any of its committees and shall not be considered to be, nor be relied upon as, a formal
position of IEEE. At lectures, symposia, seminars, or educational courses, an individual presenting information on IEEE standards shall make
it clear that his or her views should be considered the personal views of that individual rather than the formal position of IEEE.
Comments on Standards: Comments for revision of IEEE Standards documents are welcome from any interested party, regardless of
membership affiliation with IEEE. However, IEEE does not provide consulting information or advice pertaining to IEEE Standards documents.
Suggestions for changes in documents should be in the form of a proposed change of text, together with appropriate supporting comments.
Since IEEE standards represent a consensus of concerned interests, it is important to ensure that any responses to comments and questions also
receive the concurrence of a balance of interests. For this reason, IEEE and the members of its societies and Standards Coordinating
Committees are not able to provide an instant response to comments or questions except in those cases where the matter has previously been
addressed. Any person who would like to participate in evaluating comments or revisions to an IEEE standard is welcome to join the relevant
IEEE working group at http://standards.ieee.org/develop/wg/.
Comments on standards should be submitted to the following address:
Secretary, IEEE-SA Standards Board
445 Hoes Lane
Piscataway, NJ 08854-4141
USA
Photocopies: Authorization to photocopy portions of any individual standard for internal or personal use is granted by The Institute of
Electrical and Electronics Engineers, Inc., provided that the appropriate fee is paid to Copyright Clearance Center. To arrange for payment of
licensing fee, please contact Copyright Clearance Center, Customer Service, 222 Rosewood Drive, Danvers, MA 01923 USA; +1 978 750
8400. Permission to photocopy portions of any individual standard for educational classroom use can also be obtained through the Copyright
Clearance Center.
Patents: Attention is called to the possibility that implementation of this standard may require use of subject matter covered by patent rights.
By publication of this standard, no position is taken with respect to the existence or validity of any patent rights in connection therewith. If a
patent holder or patent applicant has filed a statement of assurance via an Accepted Letter of Assurance, then the statement is listed on the
IEEE-SA Website http://standards.ieee.org/about/sasb/patcom/patents.html. Letters of Assurance may indicate whether the Submitter is
willing or unwilling to grant licenses under patent rights without compensation or under reasonable rates, with reasonable terms and conditions
that are demonstrably free of any unfair discrimination to applicants desiring to obtain such licenses.
Essential Patent Claims may exist for which a Letter of Assurance has not been received. The IEEE is not responsible for identifying Essential
Patent Claims for which a license may be required, for conducting inquiries into the legal validity or scope of Patents Claims, or determining
whether any licensing terms or conditions provided in connection with submission of a Letter of Assurance, if any, or in any licensing
agreements are reasonable or non-discriminatory. Users of this standard are expressly advised that determination of the validity of any patent
rights, and the risk of infringement of such rights, is entirely their own responsibility. Further information may be obtained from the IEEE
Standards Association.
IMPORTANT NOTICE: IEEE Standards documents are not intended to ensure safety, health, or environmental protection, or ensure
against interference with or from other devices or networks. Implementers of IEEE Standards documents are responsible for determining
and complying with all appropriate safety, security, environmental, health, and interference protection practices and all applicable laws
and regulations.
This IEEE document is made available for use subject to important notices and legal disclaimers. These notices and disclaimers appear
in all publications containing this document and may be found under the heading “Important Notice” or “Important Notices and
Disclaimers Concerning IEEE Documents.” They can also be obtained on request from IEEE or viewed at
http://standards.ieee.org/IPR/disclaimers.html.
Contents

Foreword
Introduction
1 Scope
2 Conformance
2.1.1 Intended Usage
2.1.2 Full Conformance
2.1.3 Tailored Conformance
3 Normative References
4 Terms and Definitions
5 Test Design Techniques
5.1 Overview
5.2 Specification-Based Test Design Techniques
5.2.1 Equivalence Partitioning
5.2.2 Classification Tree Method
5.2.3 Boundary Value Analysis
5.2.4 Syntax Testing
5.2.5 Combinatorial Test Design Techniques
5.2.6 Decision Table Testing
5.2.7 Cause-Effect Graphing
5.2.8 State Transition Testing
5.2.9 Scenario Testing
5.3 Structure-Based Test Design Techniques
5.3.1 Statement Testing
5.3.2 Branch Testing
5.3.3 Decision Testing
5.3.4 Branch Condition Testing
5.3.5 Branch Condition Combination Testing
5.3.6 Modified Condition Decision Coverage (MCDC) Testing
5.3.7 Data Flow Testing
5.4 Experience-Based Testing
5.4.1 Error Guessing
6 Test Coverage Measurement
6.1 Test Measurement for Specification-Based Test Design Techniques
6.1.1 Overview
6.1.2 Equivalence Partition Coverage
6.1.3 Classification Tree Method Coverage
6.1.4 Boundary Value Analysis Coverage
6.1.5 Syntax Testing Coverage
6.1.6 Combinatorial Test Design Technique Coverage
6.1.7 Decision Table Testing Coverage
6.1.8 Cause-Effect Graphing Coverage
6.1.9 State Transition Testing Coverage
6.1.10 Scenario Testing Coverage
6.1.11 Random Testing Coverage
6.2 Test Measurement for Structure-Based Test Design Techniques
6.2.1 Statement Testing Coverage
6.2.2 Branch Testing Coverage
6.2.3 Decision Testing Coverage
6.2.4 Branch Condition Testing Coverage
6.2.5 Branch Condition Combination Testing Coverage
6.2.6 Modified Condition Decision (MCDC) Testing Coverage
6.2.7 Data Flow Testing Coverage
6.3 Test Measurement for Experience-Based Testing Design Techniques
6.3.1 Error Guessing Coverage
Annex A (informative) Testing Quality Characteristics
A.1 Quality Characteristics
A.1.1 Overview
A.2 Quality-Related Types of Testing
A.2.1 Accessibility Testing
A.2.2 Backup/Recovery Testing
A.2.3 Compatibility Testing
A.2.4 Conversion Testing
A.2.5 Disaster Recovery Testing
A.2.6 Functional Testing
A.2.7 Installability Testing
A.2.8 Interoperability Testing
A.2.9 Localization Testing
A.2.10 Maintainability Testing
A.2.11 Performance-Related Testing
A.2.12 Portability Testing
A.2.13 Procedure Testing
A.2.14 Reliability Testing
A.2.15 Security Testing
A.2.16 Stability Testing
A.2.17 Usability Testing
A.3 Mapping Quality Characteristics to Types of Testing
A.4 Mapping Quality Characteristics to Test Design Techniques
A.4.1 Mapping
Annex B (informative) Guidelines and Examples for the Application of Specification-Based Test Design Techniques
B.1 Guidelines and Examples for Specification-Based Testing
B.1.1 Overview
B.2 Specification-Based Test Design Technique Examples
B.2.1 Equivalence Partitioning
B.2.2 Classification Tree Method
B.2.3 Boundary Value Analysis
B.2.4 Syntax Testing
B.2.5 Combinatorial Test Design Techniques
B.2.6 Decision Table Testing
B.2.7 Cause-Effect Graphing
B.2.8 State Transition Testing
B.2.9 Scenario Testing
B.2.10 Use Case Testing
B.2.11 Random Testing
Annex C (informative) Guidelines and Examples for the Application of Structure-Based Test Design Techniques
C.1 Guidelines and Examples for Structure-Based Testing
C.1.1 Overview
C.2 Structure-Based Test Design Technique Examples
C.2.1 Statement Testing
C.2.2 Branch / Decision Testing
C.2.3 Branch Condition Testing, Branch Condition Combination Testing and Modified Condition Decision Coverage (MCDC) Testing
C.2.4 Data Flow Testing
Annex D (informative) Guidelines and Examples for the Application of Experience-Based Test Design Techniques
D.1 Guidelines and Examples for Experience-Based Testing
D.1.1 Overview
D.2 Experience-Based Test Design Technique Examples
D.2.1 Error Guessing
Annex E (informative) Test Design Technique Coverage Effectiveness
E.1 Test Design Technique Coverage Effectiveness
E.1.1 Guidance
Bibliography
Foreword
ISO (the International Organization for Standardization) is a worldwide federation of national standards bodies
(ISO member bodies). The work of preparing International Standards is normally carried out through ISO
technical committees. Each member body interested in a subject for which a technical committee has been
established has the right to be represented on that committee. International organizations, governmental and
non-governmental, in liaison with ISO, also take part in the work. ISO collaborates closely with the
International Electrotechnical Commission (IEC) on all matters of electrotechnical standardization.
International Standards are drafted in accordance with the rules given in the ISO/IEC Directives, Part 2.
The main task of technical committees is to prepare International Standards. Draft International Standards
adopted by the technical committees are circulated to the member bodies for voting. Publication as an
International Standard requires approval by at least 75 % of the member bodies casting a vote.
Attention is drawn to the possibility that some of the elements of this document may be the subject of patent
rights. ISO shall not be held responsible for identifying any or all such patent rights.
ISO/IEC 29119-4 was prepared by Joint Technical Committee ISO/IEC JTC 1, Information technology,
Subcommittee SC 7, Software and systems engineering.
ISO/IEC 29119 consists of the following parts, under the general title Software and Systems Engineering —
Software Testing:
Part 1: Concepts and Definitions
Part 2: Test Processes
Part 3: Test Documentation
Part 4: Test Techniques
Introduction
The purpose of ISO/IEC 29119-4 Test Techniques is to provide an International Standard that defines
software test design techniques (also known as test case design techniques or test methods) that can be used
during the test design and implementation process that is defined in ISO/IEC 29119-2 Test Processes.
ISO/IEC 29119-4 does not prescribe a process for test design and implementation; instead, it describes a set
of techniques that can be used within ISO/IEC 29119-2. The intent is to describe a series of techniques that
have wide acceptance in the software testing industry.
The test design techniques presented in ISO/IEC 29119-4 can be used to derive test cases that, when
executed, can be used to collect evidence that test item requirements have been met and/or that defects are
present in a test item (i.e. that requirements have not been met). Risk-based testing could be used to
determine the set of techniques that are applicable in specific situations (risk-based testing is covered in
ISO/IEC 29119-1 and ISO/IEC 29119-2).
Each technique follows the test design and implementation process that is defined in ISO/IEC 29119-2 and
shown below in Figure 1. Of the activities in this process, ISO/IEC 29119-4 provides guidance on how to
implement the following activities in detail for each technique that is described:
— Derive Test Conditions (TD2),
— Derive Test Coverage Items (TD3), and
— Derive Test Cases (TD4).
A test condition is a testable aspect of a test item, such as a function, transaction, feature, quality attribute or
structural element identified as a basis for testing. This determination can be achieved by agreeing with
stakeholders which attributes are to be tested or by applying one or more test design techniques.
NOTE The value of test results and test coverage calculations can be diminished if test conditions do not reflect
requirements in enough detail.
EXAMPLE 1 If a test completion criterion for state transition testing was identified that required coverage of all states
then the test conditions could be the states the test item can be in. Other examples of test conditions are equivalence
classes and boundaries between them or decisions in the code.
Test coverage items are attributes of each test condition that can be covered during testing. A single test
condition may be the basis for one or more test coverage items.
EXAMPLE 2 If a specific boundary is identified as a test condition then the corresponding test coverage items could be
the boundary itself and immediately either side of the boundary.
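EXAMPLE 2 can be sketched as follows; the three-value reading (the boundary plus the values immediately either side of it) is one common interpretation of boundary value analysis, not the only one.

```python
def coverage_items_for_boundary(boundary, step=1):
    """Test coverage items for a single boundary test condition:
    the boundary itself and the values immediately either side of it."""
    return [boundary - step, boundary, boundary + step]

# e.g. a hypothetical upper limit of 100 on an integer input
print(coverage_items_for_boundary(100))  # [99, 100, 101]
```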
A test case is a set of preconditions, inputs (including actions, where applicable) and expected results,
developed to determine whether or not the covered part of the test item has been implemented correctly.
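Rendered as a data structure, a test case might look like the sketch below. The field names and values are illustrative only; how test cases are actually documented is defined in ISO/IEC 29119-3, not here.

```python
from dataclasses import dataclass

@dataclass
class TestCase:
    preconditions: list      # state the test item must be in beforehand
    inputs: list             # inputs and, where applicable, actions
    expected_result: object  # basis for deciding pass/fail

# Hypothetical example for a login feature.
tc = TestCase(
    preconditions=["user account exists"],
    inputs=["login('alice', 'secret')"],
    expected_result="session created",
)
print(tc.expected_result)  # session created
```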
Specific (normative) guidance on how to implement the other activities in the test design & implementation
process of ISO/IEC 29119-2, including activities TD1 (Identify Feature Sets), TD5 (Assemble Test Sets) and
TD6 (Derive Test Procedures) is not included in clauses 5 or 6 of this standard because the process is the
same for all techniques.
ISO/IEC TR 19759 (SWEBOK) defines two types of requirements: functional requirements and quality
requirements. ISO/IEC 25010 (ISO/IEC 25010:2011) defines eight quality characteristics (including
functionality) that can be used to identify types of testing that may be applicable for testing a specific test item.
Annex A provides example mappings of test design techniques that apply to testing quality characteristics
defined in ISO/IEC 25010.
[Figure 1 shows the ISO/IEC 29119-2 Test Design and Implementation Process as a sequence of activities: Identify Feature Sets (TD1), Derive Test Conditions (TD2), Derive Test Coverage Items (TD3), Derive Test Cases (TD4), Assemble Test Sets (TD5) and Derive Test Procedures (TD6). These activities produce the Test Design Specification, the Test Case Specification and the Test Procedure Specification (test procedures and test scripts). Inputs to activities in this process may include: the test basis; test plan; test strategy; test items; and test design techniques. The process is shown as purely sequential, but in practice it may be carried out iteratively, with some activities being revisited.]

Figure 1 – ISO/IEC 29119-2 Test Design and Implementation Process
Experience-based testing practices like exploratory testing and other test practices such as model-based
testing are not defined in ISO/IEC 29119-4 because this standard only describes techniques for designing test
cases. Test practices such as exploratory testing are described in ISO/IEC 29119-1.
Templates and examples of test documentation that are produced during the testing process are defined in
ISO/IEC 29119-3 Test Documentation. The test techniques in ISO/IEC 29119-4 do not describe how test
cases should be documented (e.g. they do not include information or guidance on assigning unique identifiers,
test case descriptions, priorities, traceability or pre-conditions). Information on how to document test cases
can be found in ISO/IEC 29119-3.
This standard aims to provide stakeholders with the ability to perform software testing in any organization.
DRAFT INTERNATIONAL STANDARD ISO/IEC DIS 29119-4
Software and Systems Engineering — Software Testing — Part
4: Test Techniques
1 Scope
ISO/IEC 29119-4 defines test design techniques that can be used during the test design and implementation
process that is defined in ISO/IEC 29119-2.
This document is intended for, but not limited to, testers, test managers and developers, particularly those
responsible for managing and implementing software testing.
2 Conformance
2.1.1 Intended Usage
The normative requirements in ISO/IEC 29119-4 are contained in clauses 5 and 6. It is recognized that
particular projects or organizations may not need to use all of the techniques defined by this standard.
Therefore, implementation of this standard typically involves selecting a set of techniques suitable for the project
or organization. There are two ways that an organization or individual can claim conformance to the
provisions of this standard. The organization shall assert whether it is claiming full or tailored conformance to
this standard.
2.1.2 Full Conformance
Full conformance is achieved by demonstrating that all of the requirements (i.e. shall statements) of the
chosen (non-empty) set of techniques have been satisfied.
EXAMPLE An organization could choose to conform only to one technique, such as boundary value analysis. In this
scenario, the organization would only be required to provide evidence that they have met the requirements of that one
technique in order to claim conformance to ISO/IEC 29119-4.
2.1.3 Tailored Conformance
Tailored conformance is achieved by demonstrating that the chosen subset of requirements from the chosen
(non-empty) set of techniques has been satisfied. Where tailoring occurs, justification shall be provided
whenever the normative requirements of a technique defined in clauses 5 and 6 are not followed completely
(either directly or by reference). All tailoring decisions shall be recorded with their rationale, including the
consideration of any applicable risks. Tailoring shall be agreed by the relevant stakeholders.
Any alternate test design technique that an organization wishes to claim conformance to (that is not already
defined in this standard) shall satisfy the following criteria:
— The technique shall be freely available in the public domain.
— A source reference shall be provided.
— The technique shall be documented in the same manner as the other test techniques in clause 5.
— Associated test measurement techniques shall be documented in accordance with clause 6.1.1, if technically feasible.
3 Normative References
ISO/IEC 29119-4 does not require the use of any external normative references (i.e. there are no external
standards or other referenced documents cited within “shall” statements of this standard that make them
indispensable for the application of this standard). Standards useful for the implementation and interpretation
of ISO/IEC 29119-4 are listed in the bibliography.
4 Terms and Definitions
For the purposes of this document, the terms and definitions given in ISO/IEC/IEEE 24765 Systems and
software engineering — Vocabulary and the following apply.
NOTE Use of the terminology in ISO/IEC 29119-4 is for ease of reference and is not mandatory for conformance with
the standard. The following terms and definitions are provided to assist with the understanding and readability of ISO/IEC
29119-4. Only terms critical to the understanding of ISO/IEC 29119-4 are included. This clause is not intended to provide
a complete list of testing terms. The systems and software engineering vocabulary ISO/IEC/IEEE 24765 can be
referenced for terms not defined in this clause. All terms defined in this clause are also intentionally included in ISO/IEC
29119-1, as that standard includes all terms that are used in ISO/IEC 29119 parts 1, 2, 3, 4 and 5.
4.1
Backus-Naur Form
formal metalanguage used for defining the syntax of a formal language
4.2
base choice
see base value
4.3
base value
input parameter value used in ‘base choice testing’ that is normally selected based on being a representative
or typical value for the parameter. Also called base choice
4.4
c-use
see computation data use
4.5
computation data use
where the value of a variable is read in any statement other than a conditional expression. Also called c-use
4.6
condition
Boolean expression containing no Boolean operators
EXAMPLE “A < B” is a condition but “A and B” is not.
[SOURCE: BS 7925-1:1998, 3.45, modified added quotation marks to example and removed “a” from start
of definition]
4.7
control flow
abstract representation of all possible sequences of events in a test item’s execution
2 © ISO/IEC 2013 — All rights reserved
ISO/IEC DIS 29119-4
[SOURCE: BS 7925-1:1998, 3.50, modified replaced “program’s” with “test item’s” and removed “an” from
start of definition]
4.8
control flow sub-path
sequence of executable statements within a test item
4.9
data definition
see variable definition
4.10
data definition c-use pair
data definition and computation data use, where the d
...
ISO/IEC/IEEE
INTERNATIONAL STANDARD
29119-4
First edition
2015-12-01
Software and systems engineering —
Software testing —
Part 4:
Test techniques
Ingénierie du logiciel et des systèmes — Essais du logiciel —
Partie 4: Techniques d’essai
Reference number: ISO/IEC/IEEE 29119-4:2015(E)
© ISO/IEC 2015
© IEEE 2015
All rights reserved. Unless otherwise specified, no part of this publication may be reproduced or utilized in any form or by any means,
electronic or mechanical, including photocopying and microfilm, without permission in writing from ISO, IEC or IEEE at the respective
address below.
ISO copyright office: Ch. de Blandonnet 8 · CP 401, CH-1214 Vernier, Geneva, Switzerland; Tel. +41 22 749 01 11; Fax +41 22 749 09 47; E-mail copyright@iso.org; Web www.iso.org
IEC Central Office: 3, rue de Varembé, CH-1211 Geneva 20, Switzerland; E-mail inmail@iec.ch; Web www.iec.ch
Institute of Electrical and Electronics Engineers, Inc.: 3 Park Avenue, New York, NY 10016-5997, USA; E-mail stds.ipr@ieee.org; Web www.ieee.org
Published in Switzerland
© ISO/IEC 2015 – All rights reserved
ii © IEEE 2015 – All rights reserved
Contents Page
Foreword . v
Introduction . vi
1 Scope . 1
2 Conformance . 1
2.1 Intended Usage . 1
2.2 Full Conformance . 1
2.3 Tailored Conformance . 1
3 Normative References . 1
4 Terms and Definitions . 2
5 Test Design Techniques . 4
5.1 Overview . 4
5.2 Specification-Based Test Design Techniques . 7
5.2.1 Equivalence Partitioning . 7
5.2.2 Classification Tree Method . 8
5.2.3 Boundary Value Analysis . 9
5.2.4 Syntax Testing .11
5.2.5 Combinatorial Test Design Techniques .12
5.2.6 Decision Table Testing.15
5.2.7 Cause-Effect Graphing .15
5.2.8 State Transition Testing .16
5.2.9 Scenario Testing .17
5.2.10 Random Testing .18
5.3 Structure-Based Test Design Techniques .18
5.3.1 Statement Testing .18
5.3.2 Branch Testing .19
5.3.3 Decision Testing .20
5.3.4 Branch Condition Testing .20
5.3.5 Branch Condition Combination Testing .21
5.3.6 Modified Condition Decision Coverage (MCDC) Testing .21
5.3.7 Data Flow Testing .22
5.4 Experience-Based Test Design Techniques .25
5.4.1 Error Guessing .25
6 Test Coverage Measurement .25
6.1 Overview .25
6.2 Test Measurement for Specification-Based Test Design Techniques .26
6.2.1 Equivalence Partition Coverage .26
6.2.2 Classification Tree Method Coverage .26
6.2.3 Boundary Value Analysis Coverage .26
6.2.4 Syntax Testing Coverage .26
6.2.5 Combinatorial Test Design Technique Coverage .27
6.2.6 Decision Table Testing Coverage .27
6.2.7 Cause-Effect Graphing Coverage .28
6.2.8 State Transition Testing Coverage .28
6.2.9 Scenario Testing Coverage .28
6.2.10 Random Testing Coverage .28
6.3 Test Measurement for Structure-Based Test Design Techniques .29
6.3.1 Statement Testing Coverage .29
6.3.2 Branch Testing Coverage .29
6.3.3 Decision Testing Coverage .29
6.3.4 Branch Condition Testing Coverage .29
6.3.5 Branch Condition Combination Testing Coverage .29
6.3.6 Modified Condition Decision (MCDC) Testing Coverage .30
6.3.7 Data Flow Testing Coverage .30
6.4 Test Measurement for Experience-Based Testing Design Techniques.31
6.4.1 Error Guessing Coverage .31
Annex A (informative) Testing Quality Characteristics .32
Annex B (informative) Guidelines and Examples for the Application of Specification-Based
Test Design Techniques .43
Annex C (informative) Guidelines and Examples for the Application of Structure-Based
Test Design Techniques .103
Annex D (informative) Guidelines and Examples for the Application of Experience-Based
Test Design Techniques .126
Annex E (informative) Guidelines and Examples for the Application of Interchangeable
Test Design Techniques .129
Annex F (informative) Test Design Technique Coverage Effectiveness .133
Annex G (informative) ISO/IEC/IEEE 29119-4 and BS 7925-2 Test Design Technique Alignment 135
Bibliography .137
Foreword
ISO (the International Organization for Standardization) and IEC (the International Electrotechnical
Commission) form the specialized system for worldwide standardization. National bodies that are
members of ISO or IEC participate in the development of International Standards through technical
committees established by the respective organization to deal with particular fields of technical
activity. ISO and IEC technical committees collaborate in fields of mutual interest. Other international
organizations, governmental and non-governmental, in liaison with ISO and IEC, also take part in the
work. In the field of information technology, ISO and IEC have established a joint technical committee,
ISO/IEC JTC 1.
The procedures used to develop this document and those intended for its further maintenance are
described in the ISO/IEC Directives, Part 1. In particular the different approval criteria needed for
the different types of document should be noted. This document was drafted in accordance with the
editorial rules of the ISO/IEC Directives, Part 2 (see www.iso.org/directives).
Attention is drawn to the possibility that some of the elements of this document may be the subject
of patent rights. ISO and IEC shall not be held responsible for identifying any or all such patent
rights. Details of any patent rights identified during the development of the document will be in the
Introduction and/or on the ISO list of patent declarations received (see www.iso.org/patents).
Any trade name used in this document is information given for the convenience of users and does not
constitute an endorsement.
For an explanation on the meaning of ISO specific terms and expressions related to conformity
assessment, as well as information about ISO’s adherence to the WTO principles in the Technical
Barriers to Trade (TBT), see the “Foreword - Supplementary information” page on the ISO website (www.iso.org).
The committee responsible for this document is ISO/IEC JTC 1, Information technology, SC 7, Software
and Systems Engineering.
ISO/IEC/IEEE 29119 consists of the following standards, under the general title Software and Systems
Engineering — Software Testing:
— Part 1: Concepts and definitions
— Part 2: Test processes
— Part 3: Test documentation
— Part 4: Test techniques
The following parts are under preparation:
— Part 5: Keyword-driven testing
Introduction
The purpose of this part of ISO/IEC/IEEE 29119 is to provide an International Standard that defines
software test design techniques (also known as test case design techniques or test methods) that can be
used within the test design and implementation process that is defined in ISO/IEC/IEEE 29119-2. This
part of ISO/IEC/IEEE 29119 does not prescribe a process for test design and implementation; instead, it
describes a set of techniques that can be used within ISO/IEC/IEEE 29119-2. The intent is to describe a
series of techniques that have wide acceptance in the software testing industry.
The test design techniques presented in this part of ISO/IEC/IEEE 29119 can be used to derive test
cases that, when executed, generate evidence that test item requirements have been met and/or that
defects are present in a test item (i.e. that requirements have not been met). Risk-based testing could be
used to determine the set of techniques that are applicable in specific situations (risk-based testing is
covered in ISO/IEC/IEEE 29119-1 and ISO/IEC/IEEE 29119-2).
NOTE A “test item” is a work product that is being tested (see ISO/IEC/IEEE 29119-1).
EXAMPLE 1 “Test items” include systems, software items, objects, classes, requirements documents, design
specifications, and user guides.
Each technique follows the test design and implementation process that is defined in ISO/IEC/
IEEE 29119-2 and shown in Figure 1.
Of the activities in this process, ISO/IEC/IEEE 29119-4 provides guidance on how to implement the
following activities in detail for each technique that is described:
— Derive Test Conditions (TD2);
— Derive Test Coverage Items (TD3);
— Derive Test Cases (TD4).
A test condition is a testable aspect of a test item, such as a function, transaction, feature, quality attribute,
or structural element identified as a basis for testing. This determination can be achieved by agreeing
with stakeholders which attributes are to be tested or by applying one or more test design techniques.
EXAMPLE 2 If a test completion criterion for state transition testing was identified that required coverage of
all states then the test conditions could be the states the test item can be in. Other examples of test conditions are
equivalence classes and boundaries between them.
Test coverage items are attributes of each test condition that can be covered during testing. A single
test condition may be the basis for one or more test coverage items.
EXAMPLE 3 If a specific boundary is identified as a test condition, then the corresponding test coverage items
could be the boundary itself and immediately either side of the boundary.
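As an informative sketch of Example 3 (not part of the normative text), the three coverage items for a single integer boundary can be derived mechanically; the helper name and the 1..100 domain are illustrative assumptions:

```python
def boundary_coverage_items(boundary):
    """Derive the three test coverage items for one integer boundary:
    the value just below, the boundary itself, and the value just above."""
    return [boundary - 1, boundary, boundary + 1]

# For an input domain of 1..100, the lower boundary 1 yields:
print(boundary_coverage_items(1))    # [0, 1, 2]
# and the upper boundary 100 yields:
print(boundary_coverage_items(100))  # [99, 100, 101]
```

The values 0 and 101 fall outside the domain and would appear in invalid test cases; 1, 2, 99, and 100 belong to valid test cases.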
A test case is a set of preconditions, inputs (including actions, where applicable), and expected results,
developed to determine whether or not the covered part of the test item has been implemented correctly.
Specific (normative) guidance on how to implement the other activities in the test design &
implementation process of ISO/IEC/IEEE 29119-2, including activities TD1 (Identify Feature Sets), TD5
(Assemble Test Sets), and TD6 (Derive Test Procedures), is not included in Clauses 5 or 6 of this part of
ISO/IEC/IEEE 29119 because the process is the same for all techniques.
Figure 1 — ISO/IEC/IEEE 29119-2 Test Design and Implementation Process
ISO/IEC/TR 19759 (SWEBOK) defines two types of requirements: functional requirements and quality
requirements. ISO/IEC 25010 defines eight quality characteristics (including functionality) that can be
used to identify types of testing that may be applicable for testing a specific test item. Annex A provides
example mappings of test design techniques that apply to testing quality characteristics defined in
ISO/IEC 25010.
Experience-based testing practices like exploratory testing and other test practices such as model-based
testing are not defined in this part of ISO/IEC/IEEE 29119 because this part of ISO/IEC/IEEE 29119 only
describes techniques for designing test cases. Test practices such as exploratory testing are described
in ISO/IEC/IEEE 29119-1.
Templates and examples of test documentation that are produced during the testing process are defined
in ISO/IEC/IEEE 29119-3 Test Documentation. The test techniques in this part of ISO/IEC/IEEE 29119
do not describe how test cases should be documented (e.g. they do not include information or guidance
on assigning unique identifiers, test case descriptions, priorities, traceability, or pre-conditions).
Information on how to document test cases can be found in ISO/IEC/IEEE 29119-3.
This part of ISO/IEC/IEEE 29119 aims to provide stakeholders with the ability to design test cases for
the testing of software in any organization.
INTERNATIONAL STANDARD ISO/IEC/IEEE 29119-4:2015(E)
Software and systems engineering — Software testing —
Part 4:
Test techniques
1 Scope
This part of ISO/IEC/IEEE 29119 defines test design techniques that can be used during the test design
and implementation process that is defined in ISO/IEC/IEEE 29119-2.
This part of ISO/IEC/IEEE 29119 is intended for, but not limited to, testers, test managers, and
developers, particularly those responsible for managing and implementing software testing.
2 Conformance
2.1 Intended Usage
The normative requirements in this part of ISO/IEC/IEEE 29119 are contained in Clauses 5 and 6.
It is recognised that particular projects or organizations may not need to use all of the techniques
defined by this standard. Therefore, implementation of this standard typically involves selecting
a set of techniques suitable for the project or organization. There are two ways that an organization
or individual can claim conformance to the provisions of this standard – full conformance or tailored
conformance. The organization or individual shall assert whether full or tailored conformance to this
standard is claimed.
2.2 Full Conformance
Full conformance is achieved by demonstrating that all of the requirements (i.e. ‘shall’ statements) of the
chosen (non-empty) set of techniques in Clause 5 and/or the corresponding test coverage measurement
approaches in Clause 6 have been satisfied.
EXAMPLE An organization could choose to conform only to one technique, such as boundary value analysis.
In this scenario, the organization would only be required to provide evidence that they have met the requirements
of that one technique in order to claim conformance to this part of ISO/IEC/IEEE 29119.
2.3 Tailored Conformance
Tailored conformance is achieved by demonstrating that the chosen subset of requirements from the
chosen (non-empty) set of techniques and/or corresponding test coverage measurement approaches
have been satisfied. Where tailoring occurs, justification shall be provided (either directly or by
reference) whenever the normative requirements of a technique defined in Clause 5 or measure
defined in Clause 6 are not followed completely. All tailoring decisions shall be recorded with their
rationale, including the consideration of any applicable risks. Tailoring shall be agreed by the relevant
stakeholders.
3 Normative References
The following documents, in whole or in part, are normatively referenced in this document and are
indispensable for its application. For dated references, only the edition cited applies. For undated
references, the latest edition of the referenced document (including any amendments) applies.
ISO/IEC/IEEE 29119-1, Software and systems engineering — Software testing — Part 1: Concepts and
definitions
ISO/IEC/IEEE 29119-2, Software and systems engineering — Software testing — Part 2: Test processes
ISO/IEC/IEEE 29119-3, Software and systems engineering — Software testing — Part 3: Test
documentation
NOTE Other International Standards useful for the implementation and interpretation of this part of
ISO/IEC/IEEE 29119 are listed in the bibliography.
4 Terms and Definitions
For the purposes of this document, the terms and definitions given in ISO/IEC/IEEE 24765 and the
following apply.
NOTE Use of the terminology in this part of ISO/IEC/IEEE 29119 is for ease of reference and is not
mandatory for conformance with the standard. The following terms and definitions are provided to assist with
the understanding and readability of this part of ISO/IEC/IEEE 29119. Only terms critical to the understanding
of this part of ISO/IEC/IEEE 29119 are included. This clause is not intended to provide a complete list of testing
terms. The systems and software engineering vocabulary ISO/IEC/IEEE 24765 can be referenced for terms not
defined in this clause.
4.1
Backus-Naur Form
formal meta-language used for defining the syntax of a language in a textual format
4.2
base choice
see base value
4.3
base value
input parameter value used in ‘base choice testing’ that is normally selected based on being a
representative or typical value for the parameter. Also called base choice
4.4
c-use
see computation data use
4.5
computation data use
use of the value of a variable in any type of statement
4.6
condition
Boolean expression containing no Boolean operators
EXAMPLE “A < B” is a condition but “A and B” is not.
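The distinction can also be illustrated in code (an informative sketch; the function and its parameters are hypothetical). The full Boolean expression controlling the branch is a decision, while each Boolean sub-expression containing no Boolean operators is a condition:

```python
def can_withdraw(balance, amount, account_open):
    # The full expression "(amount <= balance) and account_open" is a
    # decision; "amount <= balance" and "account_open" are its two
    # conditions, since neither contains a Boolean operator.
    return (amount <= balance) and account_open

print(can_withdraw(100, 50, True))   # True
print(can_withdraw(100, 150, True))  # False: first condition fails
```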
4.7
control flow
sequence in which operations are performed during the execution of a test item
4.8
control flow sub-path
sequence of executable statements within a test item
4.9
data definition
statement where a variable is assigned a value. Also called variable definition
4.10
data definition c-use pair
data definition and subsequent computation data use, where the data use uses the value defined in the
data definition
4.11
data definition p-use pair
data definition and subsequent predicate data use, where the data use uses the value defined in the
data definition
4.12
data definition-use pair
data definition and subsequent data use, where the data use uses the value defined in the data definition
4.13
data use
executable statement where the value of a variable is accessed
4.14
decision outcome
result of a decision (which therefore determines the control flow alternative taken)
4.15
decision rule
combination of conditions (also known as causes) and actions (also known as effects) that produce a
specific outcome in decision table testing and cause-effect graphing
4.16
definition-use pair
data definition and subsequent predicate or computational data use, where the data use uses the value
defined in the data definition
4.17
definition-use path
control flow sub-path from a variable definition to a predicate-use (p-use) or computational-use (c-use)
of that variable
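The data-flow terms defined in 4.9 to 4.17 can be illustrated with a small hypothetical function; the annotations are informative only:

```python
def scale(x):
    y = x * 2         # data definition of y (y is assigned a value)
    if y > 10:        # predicate data use (p-use) of y in a decision
        return y + 1  # computation data use (c-use) of y
    return 0

# (definition of y, p-use in "y > 10") is a data definition p-use pair;
# (definition of y, c-use in "y + 1") is a data definition c-use pair.
# The executable statements from the definition to either use form a
# definition-use path.
print(scale(6))  # 13: the p-use takes the TRUE outcome, then the c-use runs
print(scale(3))  # 0: the p-use takes the FALSE outcome
```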
4.18
entry point
point in a test item at which execution of the test item can begin
Note 1 to entry: An entry point is an executable statement within a test item that may be selected by an external
process as the starting point for one or more paths through the test item. It is most commonly the first executable
statement within the test item.
4.19
executable statement
statement which, when compiled, is translated into object code, which will be executed procedurally
when the test item is running and may perform an action on program data
4.20
exit point
last executable statement within a test item
Note 1 to entry: An exit point is a terminal point of a path through a test item, being an executable statement
within the test item which either terminates the test item, or returns control to an external process. This is most
commonly the last executable statement within the test item.
4.21
p-use
see predicate data use
4.22
P-V pair
combination of a test item parameter with a value assigned to that parameter, used as a test condition
and coverage item in combinatorial test design techniques
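As an informative sketch, the P-V pairs of a test item and the test cases produced by "all combinations" testing (the Cartesian product over the parameters) can be enumerated with Python's itertools; the parameters and values below are hypothetical:

```python
from itertools import product

# Hypothetical test item parameters and the values chosen for each.
parameters = {
    "browser": ["Firefox", "Chrome"],
    "os": ["Linux", "Windows"],
    "locale": ["en", "fr"],
}

# Each P-V pair is one (parameter, value) combination.
pv_pairs = [(p, v) for p, values in parameters.items() for v in values]

# "All combinations" testing exercises every combination of values
# across the parameters.
all_combinations = list(product(*parameters.values()))

print(len(pv_pairs))          # 6 P-V pairs
print(len(all_combinations))  # 8 test cases (2 x 2 x 2)
```

Pair-wise testing would instead select a smaller subset of these 8 combinations that still covers every pair of P-V pairs from different parameters.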
4.23
path
sequence of executable statements of a test item
4.24
predicate
logical expression which evaluates to TRUE or FALSE, normally to direct the execution path in code
4.25
predicate data use
data use associated with the decision outcome of the predicate portion of a decision statement
4.26
sub-path
path that is part of a larger path
4.27
test model
representation of a test item that is used during the test case design process
4.28
variable definition
see data definition
5 Test Design Techniques
5.1 Overview
ISO/IEC/IEEE 29119-4 defines test design techniques for specification-based testing (5.2), structure-
based testing (5.3) and experience-based testing (5.4). In specification-based testing, the test basis
(e.g. requirements, specifications, models or user needs) is used as the main source of information
to design test cases. In structure-based testing, the structure of the test item (e.g. source code or the
structure of a model) is used as the primary source of information to design test cases. In experience-
based testing, the knowledge and experience of the tester is used as the primary source of information
during test case design. For specification-based testing, structure-based testing and experience-based
testing, the test basis is used to generate the expected results. These classes of test design techniques
are complementary and their combined application typically results in more effective testing.
Although the techniques presented in ISO/IEC/IEEE 29119-4 are classified as structure-based,
specification-based or experience-based, in practice some of them can be used interchangeably (e.g. branch
testing could be used to design test cases for testing logical paths through the graphical user interface
of an Internet-based system). This is demonstrated in Annex E. In addition, although each technique is
defined independently of all others, in practice they can be used in combination with other techniques.
EXAMPLE The test coverage items derived by applying equivalence partitioning could be used to populate
input parameters of test cases derived using scenario testing.
ISO/IEC/IEEE 29119-4 uses the terms specification-based testing and structure-based testing; these
categories of techniques are also known as “black-box testing” and “white-box testing” (or “clear-box
testing”) respectively. The terms “black-box” and “white-box” refer to the visibility of the internal
structure of the test item. In black-box testing the internal structure of the test item is not visible (hence
the black box), whereas for white-box testing the internal structure of the test item is visible. When a
technique is applied while utilising a combination of knowledge from the test item’s specification and
structure, this is often called “grey-box testing”.
ISO/IEC/IEEE 29119-4 defines how the generic test design and implementation process steps TD2
(derive test conditions), TD3 (derive test coverage items), and TD4 (derive test cases) from ISO/IEC/
IEEE 29119-2 (see Introduction) shall be used by each technique. It does not provide context-specific
definitions of the techniques that describe how each technique should be used in all situations. Users of
ISO/IEC/IEEE 29119-4 may refer to Annex B, Annex C, Annex D and Annex E for detailed examples that
demonstrate how to apply the techniques.
The techniques that are defined in ISO/IEC/IEEE 29119-4 are shown in Figure 2. This set of techniques
is not exhaustive. There are techniques that are used by testing practitioners or researchers that are
not included in ISO/IEC/IEEE 29119-4.
Specification-Based Techniques (clause 5.2):
— Equivalence Partitioning (clause 5.2.1)
— Classification Tree Method (clause 5.2.2)
— Boundary Value Analysis (clause 5.2.3)
— Syntax Testing (clause 5.2.4)
— Combinatorial Test Design Techniques (clause 5.2.5): All Combinations Testing (clause 5.2.5.3), Pair-Wise Testing (clause 5.2.5.4), Each Choice Testing (clause 5.2.5.5), Base Choice Testing (clause 5.2.5.6)
— Decision Table Testing (clause 5.2.6)
— Cause-Effect Graphing (clause 5.2.7)
— State Transition Testing (clause 5.2.8)
— Scenario Testing (clause 5.2.9)
Structure-Based Techniques (clause 5.3):
— Statement Testing (clause 5.3.1)
— Branch Testing (clause 5.3.2)
— Decision Testing (clause 5.3.3)
— Branch Condition Testing (clause 5.3.4)
— Branch Condition Combination Testing (clause 5.3.5)
— Modified Condition Decision Coverage Testing (clause 5.3.6)
— Data Flow Testing (clause 5.3.7): All-Definitions Testing (clause 5.3.7.2), All-C-Uses Testing (clause 5.3.7.3), All-P-Uses Testing (clause 5.3.7.4), All-Uses Testing (clause 5.3.7.5), All-DU-Paths Testing (clause 5.3.7.6)
Experience-Based Techniques (clause 5.4):
— Error Guessing (clause 5.4.1)
Figure 2 — The set of test design techniques presented in ISO/IEC/IEEE 29119-4
Of the six activities in the test design and implementation process (see Figure 1), test design techniques
provide unique and specific guidance on the derivation of test conditions (TD2), test coverage items
(TD3) and test cases (TD4). Therefore, each technique is defined in terms of these three activities.
There are varying levels of granularity within steps TD2 (derive test conditions), TD3 (derive test
coverage items) and TD4 (derive test cases). Within each technique, the term “model” is used to
describe the concept of preparing a logical representation of the test item for the purposes of deriving
test conditions in step TD2 (e.g. a control flow model is required for deriving test conditions for all
structural techniques). Some situations may require the entire model to be a test condition, whereas in
other situations, one part of the model may be a test condition.
EXAMPLE 1 In state transition testing, if there is a requirement to cover all states then the entire state model
could be the test condition. Alternatively, if there is a requirement to cover specific transitions between states,
then each transition could be a test condition.
In addition, since some techniques share underlying concepts, their definitions contain similar text.
EXAMPLE 2 Both equivalence partitioning and boundary value analysis are based on equivalence classes.
In the test case design step (TD4) of each technique, test cases that are created may be “valid” (i.e. they
contain input values that the test item should accept as correct) or “invalid” (i.e. they contain at least
one input value that the test item should reject as incorrect, ideally with an appropriate error message).
In some techniques, such as equivalence partitioning and boundary value analysis, invalid test cases
are usually derived using the “one-to-one” approach as it avoids fault masking by ensuring that each
test case only includes one invalid input value, while valid test cases are typically derived using the
“minimized” approach, as this reduces the number of test cases required to cover valid test coverage
items (see 5.2.1.3 and 5.2.3.3).
NOTE Invalid cases are also known as “negative test cases”.
Although the techniques defined in ISO/IEC/IEEE 29119-4 are each described in a separate clause (as if
they were mutually exclusive), in practice they could be applied in a blended way.
EXAMPLE 3 Boundary value analysis could be used to select test input values, after which pair-wise testing
could be used to design test cases from the test input values. Equivalence partitioning could be used to select
the classifications and classes for the classification tree method and then each choice testing could be used to
construct test cases from the classes.
The techniques presented in ISO/IEC/IEEE 29119-4 could also be used in conjunction with the test
types that are presented in Annex A. For example, equivalence partitioning could be used to identify
user groups (test conditions) and representative users (test coverage items) from those groups in test
cases that are to be tested during usability testing.
The normative definitions of the techniques are provided in Clause 5. The corresponding normative
coverage measures for each technique are presented in Clause 6. This is supported by informative
examples of each technique in Annexes B, C, D and E. Although the examples of each technique
demonstrate manual application of the technique, in practice, automation can be used to support some
types of test design and execution (e.g. statement coverage analyzers can be used to support structure-
based testing). Annex A provides examples of how the test design techniques defined in this standard
can be applied to testing the quality characteristics that are defined in ISO/IEC 25010.
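For instance, a statement coverage analyzer of the kind mentioned above can be sketched in a few lines with Python's sys.settrace; this is an informative, minimal illustration (the traced function and its names are hypothetical), not a tool defined by this part of ISO/IEC/IEEE 29119:

```python
import sys

def trace_statements(func, *args):
    """Run func and record which of its source lines execute
    (a minimal statement-coverage analyzer sketch)."""
    executed = set()

    def tracer(frame, event, arg):
        # Record line events for the function under test only, as
        # offsets from its first source line.
        if event == "line" and frame.f_code is func.__code__:
            executed.add(frame.f_lineno - func.__code__.co_firstlineno)
        return tracer

    sys.settrace(tracer)
    try:
        func(*args)
    finally:
        sys.settrace(None)
    return executed

def classify(n):          # offset 0
    if n < 0:             # offset 1
        return "negative" # offset 2
    return "non-negative" # offset 3

# With n = 5 the "negative" branch is never entered, so offset 2 does
# not appear in the executed set:
print(trace_statements(classify, 5))
```

A second run with a negative argument would cover offset 2, and the union of both runs achieves full statement coverage of classify.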
5.2 Specification-Based Test Design Techniques
5.2.1 Equivalence Partitioning
5.2.1.1 Derive Test Conditions (TD2)
Equivalence partitioning (BS 7925-2:1998; Myers 1979) uses a model of the test item that partitions the
inputs and outputs of the test item into equivalence partitions (also called “partitions” or “equivalence
classes”), where each equivalence partition shall be defined as a test condition. These equivalence
partitions shall be derived from the test basis, where each partition is chosen such that all values
within the equivalence partition can reasonably be expected to be treated similarly (i.e. they may be
considered “equivalent”) by the test item. Equivalence partitions may be derived for both valid and
invalid inputs and outputs.
EXAMPLE For a test item expecting lowercase alphabetical characters as (valid) inputs, invalid input
equivalence partitions that could be derived include equivalence partitions containing integers, reals, uppercase
alphabetical characters, symbols and control characters, depending on the level of rigour required during testing.
NOTE 1 For output equivalence partitions, corresponding input partitions are derived based on the processing
described in the test item’s specification. Test inputs are then selected from the input partitions.
NOTE 2 Invalid output equivalence partitions typically correspond to any outputs that have not been
explicitly specified. As these are not specified their identification often results in equivalence partitions based
on the subjectivity of the individual tester. This subjective form of test design may also occur when applying
experience-based techniques like error guessing.
NOTE 3 Domain analysis (Beizer 1995) is often classified as a combination of equivalence partitioning and
boundary value analysis.
5.2.1.2 Derive Test Coverage Items (TD3)
Each equivalence partition shall be identified as a test coverage item (i.e. for equivalence partitioning
the test conditions and test coverage items are the same equivalence partitions).
5.2.1.3 Derive Test Cases (TD4)
Test cases shall be derived to exercise the test coverage items (i.e. the equivalence partitions). The
following steps shall be used during test case derivation:
a) Decide on an approach for selecting combinations of test coverage items to be exercised by test
cases, where two common approaches are (BS 7925-2:1998; Myers 1979):
1) one-to-one, in which each test case is derived to cover a specific equivalence partition;
2) minimized, in which equivalence partitions are covered by test cases such that the minimum
number of test cases derived covers all equivalence partitions at least once.
NOTE Other approaches to selecting combinations of test coverage items to be exercised by test cases
are described in 5.2.5 (Combinatorial Test Design Techniques).
b) Select test coverage item(s) for inclusion in the current test case based on the approach chosen in
step a);
c) Identify input values to exercise the test coverage items to be covered by the test case and arbitrary
valid values for any other input variables required by the test case;
d) Determine the expected result of the test case by applying the input(s) to the test basis;
e) Repeat steps b) to d) until the required level of test coverage is achieved.
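The two common approaches in step a) can be sketched as follows, assuming a hypothetical test item with two input variables whose partitions and representative values are invented for illustration. Note that a minimized set combining several invalid partitions in one case risks one failure masking another, so in practice invalid partitions are often exercised one at a time.

```python
from itertools import zip_longest

# Hypothetical partitions for two input variables; each entry names a
# partition and a representative value drawn from it. The first entry of
# each list is assumed to be a valid partition.
partitions = {
    "age":  [("valid: 18-65", 30), ("invalid: <18", 10), ("invalid: >65", 80)],
    "code": [("valid: known code", "AB"), ("invalid: unknown code", "ZZ")],
}

def one_to_one(parts):
    """One test case per partition; other variables get arbitrary valid values."""
    cases = []
    for var, plist in parts.items():
        for name, value in plist:
            case = {v: pl[0][1] for v, pl in parts.items()}  # arbitrary valid values
            case[var] = value  # the partition this test case targets
            cases.append((name, case))
    return cases

def minimized(parts):
    """Cover every partition at least once with few cases by pairing the
    i-th partition of each variable; variables with fewer partitions fall
    back to their first (valid) partition once their list is exhausted."""
    cases = []
    for row in zip_longest(*parts.values()):
        case = {}
        for var, entry in zip(parts, row):
            name, value = entry if entry is not None else parts[var][0]
            case[var] = value
        cases.append(case)
    return cases
```

With these partitions, one-to-one yields five test cases and the minimized pairing yields three.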
5.2.2 Classification Tree Method
5.2.2.1 Derive Test Conditions (TD2)
The classification tree method (Grochtmann and Grimm 1993) uses a model of the test item that
partitions the inputs of the test item and represents them graphically in the form of a tree called
a classification tree. The test item’s inputs are partitioned into “classifications”, where each
classification consists of a disjoint (non-overlapping) set of “classes” and often sub-classes, and the
set of classifications is complete (all classifications of all inputs relevant to the test item domain being
modelled have been identified and included). Each classification shall be a test condition. “Classes” that
result from decomposing the classifications may be partitioned further into “sub-classes” depending
on the level of rigour required in the testing. Classifications and classes may be derived for both valid
and invalid input data, depending on the level of test coverage required.
8 © ISO/IEC 2015 – All rights reserved
© IEEE 2015 – All rights reserved
The hierarchical relationships
between classifications, classes and sub-classes are modelled as a tree, in which the input domain of the
test item is placed as the root node, the classifications as branch nodes, and the classes or sub-classes
as leaf nodes.
NOTE The process of partitioning in the classification tree method is similar to equivalence partitioning.
The key difference is that in the classification tree method the partitions (the classifications and classes)
must be completely disjoint, whereas in equivalence partitioning they may overlap, depending on how the
technique is applied. In addition, the classification tree method includes the design of a classification tree,
which provides a visual representation of the test conditions.
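As a sketch, the tree structure described above can be represented with nested dictionaries, assuming a hypothetical test item (a catalogue search); the classification and class names are invented for illustration, and leaves are marked with `None`.

```python
# Root node: the input domain; branch nodes: classifications;
# leaf nodes: classes or sub-classes (disjoint within a classification).
classification_tree = {
    "catalogue search": {                  # root: input domain of the test item
        "price filter": {                  # classification
            "no filter": None,             # class (leaf)
            "bounded": {                   # class decomposed into sub-classes
                "lower bound only": None,
                "upper bound only": None,
                "both bounds": None,
            },
        },
        "sort order": {                    # classification
            "ascending": None,
            "descending": None,
        },
    }
}

def leaves(node):
    """Collect the leaf classes/sub-classes of a (sub)tree."""
    if node is None:
        return []
    out = []
    for name, child in node.items():
        out.extend([name] if child is None else leaves(child))
    return out
```

The leaves of the tree are exactly the classes and sub-classes that later combine into test coverage items.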
5.2.2.2 Derive Test Coverage Items (TD3)
Test coverage items shall be derived by combining classes using a chosen combination approach.
EXAMPLE Two example approaches for combining classes into test coverage items are:
— minimized, in which classes are included in test coverage items such that the minimum number of test
coverage items are derived to cover all classes at least once;
— maximized, in which classes are included in test coverage items such that each possible combination of
classes is covered by at least one test coverage item.
NOTE 1 Other approaches to selecting combinations of test coverage items are described in 5.2.5
(Combinatorial Test Design Techniques).
NOTE 2 The test coverage items are often illustrated in a combination table (see Figure B.5 in B.2.2.5).
NOTE 3 The original publication of the classification tree method (Grochtmann and Grimm 1993) used the
terms “minimal” and “maximal” instead of “minimized” and “maximized”.
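The two example approaches can be sketched as follows, assuming hypothetical classifications and leaf classes. `itertools.product` yields the maximized set; a column-wise pairing yields one minimized set (minimized sets are generally not unique).

```python
from itertools import product, zip_longest

# Hypothetical leaf classes per classification; names are illustrative only.
classes = {
    "price filter": ["no filter", "lower bound only", "upper bound only", "both bounds"],
    "sort order":   ["ascending", "descending"],
}

# maximized: every combination of one class per classification
maximized = [dict(zip(classes, combo)) for combo in product(*classes.values())]

# minimized: pair classes column-wise so every class appears at least once;
# classifications with fewer classes repeat their first class once exhausted
minimized = []
for row in zip_longest(*classes.values()):
    minimized.append({c: (v if v is not None else classes[c][0])
                      for c, v in zip(classes, row)})
```

Here the maximized approach produces 4 × 2 = 8 test coverage items, while the minimized approach produces 4 (the size of the largest classification).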
5.2.2.3 Derive Test Cases (TD4)
Test cases shall be derived to exercise the test coverage items. The following steps shall be followed
during test case derivation:
a) Based on the combinations of classes created in step TD3, select one combination for inclusion in
the current test case that has not already been covered by a test case;
b) Identify input values for any classes that do not already have an assigned value;
c) Determine the expected result of the test case by applying the input(s) to the test basis;
d) Repeat steps a) to c) until the required level of test coverage is achieved.
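Steps a) to c) can be sketched as follows, assuming the combinations produced in TD3 are available as mappings from classification to class, and assuming a hypothetical value assignment per class and a stand-in oracle for the test basis.

```python
# Combinations of classes from TD3 (illustrative assumptions).
combinations = [
    {"price filter": "both bounds", "sort order": "ascending"},
    {"price filter": "no filter",   "sort order": "descending"},
]

# Step b): one concrete input value per class (invented for illustration).
input_value = {
    "both bounds": (10, 50), "no filter": None,
    "ascending": "asc", "descending": "desc",
}

def expected_result(combo):
    """Stand-in for the test basis; a real oracle would come from the
    test item's specification."""
    return f"results sorted {combo['sort order']}"

test_cases = []
for combo in combinations:                                       # step a)
    inputs = {c: input_value[cls] for c, cls in combo.items()}   # step b)
    test_cases.append({"inputs": inputs,
                       "expected": expected_result(combo)})      # step c)
# step d): the loop continues until the chosen coverage level is reached
```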
5.2.3 Boundary Value Analysis
5.2.3.1 Derive Test Conditions (TD2)
Boundary value analysis (BS 7925-2:1998; Myers 1979) uses a model of the test item that partitions
the inputs and outputs of the test item into a number of ordered sets and subsets (partitions and
sub-partitions) with identifiable boundaries, where each boundary is a test condition. The boundaries shall
be derived from the test basis.
EXAMPLE For a partition defined as integers from 1 to 10 inclusive, there are two boundaries, where the
lower boundary is 1 and the upper boundary is 10, and these are the test conditions.
NOTE For output boundaries, corresponding input partitions are derived based on the processing described
in the test item’s specification. Test inputs are then selected from the input partitions.
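For the example partition above, the boundary values and their neighbours can be computed as a sketch. The "three-value" selection used here (each boundary plus its nearest neighbours on both sides) is one common interpretation and is an assumption of this sketch, not a requirement of this clause.

```python
def three_value_bva(lower, upper, step=1):
    """Boundary values for an ordered partition [lower, upper]: each
    boundary with its nearest neighbours inside and outside the partition."""
    return sorted({lower - step, lower, lower + step,
                   upper - step, upper, upper + step})

print(three_value_bva(1, 10))  # [0, 1, 2, 9, 10, 11]
```

For the partition of integers from 1 to 10, this yields the two boundaries (1 and 10), the adjacent values just inside (2 and 9), and the adjacent values just outside (0 and 11).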
...