Information technology — Biometric presentation attack detection — Part 3: Testing and reporting

This document establishes:
— principles and methods for the performance assessment of presentation attack detection (PAD) mechanisms;
— reporting of testing results from evaluations of PAD mechanisms; and
— a classification of known attack types (Annex A).

Outside the scope are:
— standardization of specific PAD mechanisms;
— detailed information about countermeasures (i.e. anti-spoofing techniques), algorithms or sensors; and
— overall system-level security or vulnerability assessment.

The attacks considered in this document take place at the biometric capture device during presentation. Any other attacks are considered outside the scope of this document.

Technologies de l'information — Détection d'attaque de présentation en biométrie — Partie 3: Essais et rapports d'essai

General Information

Status: Published
Publication Date: 09-Jan-2023
Current Stage: 6060 - International Standard published
Due Date: 19-Sep-2023
Completion Date: 10-Jan-2023

Standards Content (Sample)

INTERNATIONAL STANDARD ISO/IEC 30107-3
Second edition
2023-01

Information technology — Biometric presentation attack detection —
Part 3: Testing and reporting

Technologies de l'information — Détection d'attaque de présentation en biométrie —
Partie 3: Essais et rapports d'essai

Reference number: ISO/IEC 30107-3:2023(E)
© ISO/IEC 2023
COPYRIGHT PROTECTED DOCUMENT
© ISO/IEC 2023

All rights reserved. Unless otherwise specified, or required in the context of its implementation, no part of this publication may be reproduced or utilized otherwise in any form or by any means, electronic or mechanical, including photocopying, or posting on the internet or an intranet, without prior written permission. Permission can be requested from either ISO at the address below or ISO’s member body in the country of the requester.
ISO copyright office
CP 401 • Ch. de Blandonnet 8
CH-1214 Vernier, Geneva
Phone: +41 22 749 01 11
Email: copyright@iso.org
Website: www.iso.org
Published in Switzerland
Contents

Foreword
Introduction
1 Scope
2 Normative references
3 Terms and definitions
3.1 Attack elements
3.2 Metrics
3.3 Test roles
4 Abbreviated terms
5 Conformance
6 Presentation attack detection (PAD) overview
7 Levels of evaluation of PAD mechanisms
7.1 Overview
7.2 General principles of evaluation of PAD mechanisms
7.3 PAD subsystem evaluation
7.4 Data capture subsystem evaluation
7.5 Full system evaluation
8 Artefact properties
8.1 Properties of PAIs in biometric impostor attacks
8.2 Properties of PAIs in biometric concealer attacks
8.3 Properties of synthesized biometric samples with abnormal characteristics
9 Considerations in non-conformant capture attempts of biometric characteristics
9.1 Methods of presentation
9.2 Methods of assessment
10 Artefact creation and usage in evaluations of PAD mechanisms
10.1 General
10.2 Artefact creation and preparation
10.3 Artefact usage
10.4 Iterative testing to identify effective artefacts
11 Process-dependent evaluation factors
11.1 Overview
11.2 Evaluating the enrolment process
11.3 Evaluating the verification process
11.4 Evaluating the identification process
11.5 Evaluating offline PAD mechanisms
12 Evaluation using Common Criteria framework
12.1 General
12.2 Common Criteria and biometrics
12.2.1 Overview
12.2.2 General evaluation aspects
12.2.3 Error rates in testing
12.2.4 PAD evaluation
12.2.5 Vulnerability assessment
13 Metrics for the evaluation of biometric systems with PAD mechanisms
13.1 General
13.2 Metrics for PAD subsystem evaluation
13.2.1 General
13.2.2 Classification metrics
13.2.3 Non-response metrics
13.2.4 Efficiency metrics
13.2.5 Summary
13.3 Metrics for data capture subsystem evaluation
13.3.1 General
13.3.2 Acquisition metrics
13.3.3 Non-response metrics
13.3.4 Efficiency metrics
13.3.5 Summary
13.4 Metrics for full system evaluation
13.4.1 General
13.4.2 Accuracy metrics
13.4.3 Efficiency metrics
13.4.4 Generalized full-system evaluation performance
13.4.5 Summary
Annex A (informative) Classification of attack types
Annex B (informative) Examples of artefact species used in a PAD subsystem evaluation for fingerprint capture devices
Annex C (informative) Roles in PAD testing
Bibliography
Foreword

ISO (the International Organization for Standardization) and IEC (the International Electrotechnical Commission) form the specialized system for worldwide standardization. National bodies that are members of ISO or IEC participate in the development of International Standards through technical committees established by the respective organization to deal with particular fields of technical activity. ISO and IEC technical committees collaborate in fields of mutual interest. Other international organizations, governmental and non-governmental, in liaison with ISO and IEC, also take part in the work.

The procedures used to develop this document and those intended for its further maintenance are described in the ISO/IEC Directives, Part 1. In particular, the different approval criteria needed for the different types of document should be noted. This document was drafted in accordance with the editorial rules of the ISO/IEC Directives, Part 2 (see www.iso.org/directives or www.iec.ch/members_experts/refdocs).

Attention is drawn to the possibility that some of the elements of this document may be the subject of patent rights. ISO and IEC shall not be held responsible for identifying any or all such patent rights. Details of any patent rights identified during the development of the document will be in the Introduction and/or on the ISO list of patent declarations received (see www.iso.org/patents) or the IEC list of patent declarations received (see https://patents.iec.ch).

Any trade name used in this document is information given for the convenience of users and does not constitute an endorsement.

For an explanation of the voluntary nature of standards, the meaning of ISO specific terms and expressions related to conformity assessment, as well as information about ISO's adherence to the World Trade Organization (WTO) principles in the Technical Barriers to Trade (TBT), see www.iso.org/iso/foreword.html. In the IEC, see www.iec.ch/understanding-standards.

This document was prepared by Joint Technical Committee ISO/IEC JTC 1, Information technology, Subcommittee SC 37, Biometrics.

This second edition cancels and replaces the first edition (ISO/IEC 30107-3:2017), which has been technically revised.

The main changes are as follows:
— the relative impostor attack presentation accept rate has been added (13.4.4);
— information on roles in presentation attack detection testing has been added (Annex C);
— general technical clarifications and improvements have been made.

A list of all parts in the ISO/IEC 30107 series can be found on the ISO and IEC websites.

Any feedback or questions on this document should be directed to the user’s national standards body. A complete listing of these bodies can be found at www.iso.org/members.html and www.iec.ch/national-committees.
Introduction

The presentation of an artefact or of human characteristics to a biometric capture subsystem in a fashion intended to interfere with system policy is referred to as a presentation attack. The ISO/IEC 30107 series deals with techniques for the automated detection of presentation attacks. These techniques are called presentation attack detection (PAD) mechanisms.

As is the case for biometric recognition, PAD mechanisms are subject to false positive and false negative errors. False positive errors wrongly categorize bona fide presentations as attack presentations, potentially flagging or inconveniencing legitimate users. False negative errors wrongly categorize presentation attacks (also known as attack presentations) as bona fide presentations, potentially resulting in a security breach.
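
These two error types correspond to the PAD subsystem classification metrics defined in Clause 13 (not reproduced in this preview): the attack presentation classification error rate (APCER), reported per PAI species, and the bona fide presentation classification error rate (BPCER). A minimal sketch of how such rates can be tallied from labelled test outcomes is given below; the function names and the toy counts are illustrative assumptions, not an excerpt from this document.

```python
# Illustrative only: tallying PAD classification error rates from test outcomes.
# True  = the PAD subsystem classified the presentation as an attack.
# False = it classified the presentation as bona fide.

def apcer(attack_outcomes):
    """Attack presentation classification error rate for ONE PAI species:
    proportion of attack presentations wrongly classified as bona fide."""
    misses = sum(1 for is_attack in attack_outcomes if not is_attack)
    return misses / len(attack_outcomes)

def bpcer(bona_fide_outcomes):
    """Bona fide presentation classification error rate:
    proportion of bona fide presentations wrongly classified as attacks."""
    false_alarms = sum(1 for is_attack in bona_fide_outcomes if is_attack)
    return false_alarms / len(bona_fide_outcomes)

# Hypothetical results: 50 attack presentations of one PAI species,
# 200 bona fide presentations.
gelatine_results = [True] * 47 + [False] * 3
bona_fide_results = [False] * 196 + [True] * 4

print(f"APCER (gelatine PAIS): {apcer(gelatine_results):.1%}")   # 6.0%
print(f"BPCER:                 {bpcer(bona_fide_results):.1%}")  # 2.0%
```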

Therefore, the decision to use a specific implementation of PAD will depend on the requirements of the application and consideration of the trade-offs with respect to security, evidence strength and efficiency.

The purpose of this document is as follows:
— to define terms related to biometric PAD testing and reporting, and
— to specify principles and methods of performance assessment of biometric PAD, including metrics.

This document is directed at vendors or test laboratories seeking to conduct evaluations of PAD mechanisms.

Biometric performance testing terminology, practices and methodologies for statistical analysis have been standardized through ISO and Common Criteria. False accept rate (FAR), false reject rate (FRR) and failure to enrol rate (FTE) are widely used to characterize biometric system performance. However, this terminology and these practices and methodologies are only partially applicable to the evaluation of PAD mechanisms, owing to significant fundamental differences between biometric performance testing concepts and PAD mechanism testing concepts. These differences can be categorized as follows.
a) Statistical significance

Biometric performance testing utilizes a statistically significant number of test subjects, representative of the targeted user group. Error rates are not expected to vary significantly when adding more test subjects or using a completely different group.

In PAD testing, many biometric modalities can be attacked by a large or indeterminate number of potential presentation attack instrument species (PAIS). In these cases, it is very difficult or even impossible to have a comprehensive model of all possible presentation attack instruments (PAIs). Hence, it could be impossible to find a representative set of PAIS for the evaluation. Therefore, measured error rates of one set of PAIs cannot be assumed to be applicable to a different set.

PAIS present a source of systematic variation in a test. Different PAIs can have significantly different error rates. Additionally, within any given PAIS, there is random variation across instances of the PAI species. The number of presentations required for a statistically significant test scales linearly with the number of PAIS of interest. Within each PAIS, the uncertainty associated with a PAD error rate estimate depends on the number of artefacts tested and the number of individuals.
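
To make the last point concrete: the width of a simple binomial confidence interval on a per-PAIS error-rate estimate shrinks only with the square root of the number of attack presentations of that PAIS. The sketch below uses a normal-approximation (Wald) interval purely as an illustration; this document does not prescribe a particular interval, and the sample sizes are invented.

```python
import math

def wald_interval(errors, trials, z=1.96):
    """Approximate 95 % confidence interval for an error-rate estimate.
    Illustration only; ISO/IEC 30107-3 does not prescribe this formula."""
    p = errors / trials
    half_width = z * math.sqrt(p * (1 - p) / trials)
    return max(0.0, p - half_width), min(1.0, p + half_width)

# The same observed 10 % error rate for a single PAIS, estimated from
# increasing numbers of attack presentations:
for errors, trials in [(5, 50), (20, 200), (100, 1000)]:
    low, high = wald_interval(errors, trials)
    print(f"{errors}/{trials}: estimate 10.0 %, "
          f"95 % interval [{low:.1%}, {high:.1%}]")
```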

EXAMPLE 1 In fingerprint biometrics, many potent artefact materials are known, but any material or material mixture that can present fingerprint features to a biometric capture device is a possible candidate. Since artefact properties such as age, thickness, moisture, temperature, mixture rates and manufacturing practices can have a significant influence on the output of the PAD mechanism, it is easy to define tens of thousands of PAIS using current materials. Hundreds of thousands of presentations would be needed for a proper statistical analysis, and even then, resulting error rates cannot be transferred to the next set of new materials.
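
The scale of the PAIS space in this example follows from simple combinatorics: each combination of material and preparation properties defines a distinct species, so the count grows multiplicatively with every property considered. The property values in the sketch below are invented solely to illustrate that growth.

```python
# Invented property grid; a real evaluation would choose properties and values
# relevant to the capture device under test.
from itertools import product

properties = {
    "material":    ["gelatine", "silicone", "latex", "wood glue", "play-doh"],
    "thickness":   ["thin", "medium", "thick"],
    "moisture":    ["dry", "moist"],
    "age":         ["fresh", "1 day", "1 week"],
    "manufacture": ["cast from mould", "printed from latent print"],
}

species = list(product(*properties.values()))
print(len(species))        # 5 * 3 * 2 * 3 * 2 = 180 PAIS from a handful of options
print(len(species) * 100)  # 18 000 attack presentations at 100 per PAIS
```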


PAI presentation can also be a source of variation in a test. Variation in pressure, position or even PAI presenter characteristics can impact PAD performance.
b) Comparability of test results across systems

In biometric performance testing, application-specific error rates based on the same corpus of biometric samples can be used to compare different biometric systems or different configurations. Results can be used to unambiguously compare and assess system performance. By contrast, when using error rates to benchmark PAD mechanisms, interpreting results can be highly dependent on the intended application.

EXAMPLE 2 In a given testing scenario with 10 PAIS (presented 100 times), System 1 detects 90 % of attack presentations and System 2 detects 85 %. System 1 detects all presentations for 9 PAIS but fails to detect all presentations with the 10th PAIS. System 2 detects 85 % of all PAIS. Which is better? In a security analysis System 1 would be worse than System 2, because revealing the 10th PAIS would orient an attacker such that they could use this method to defeat the capture device all the time. However, if attackers could be prevented from using the 10th PAIS, System 1 would be better than System 2, because individual rates indicate that it is possible to overcome System 2 with all PAIS.
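
A worked version of this example shows how the pooled detection rate hides the difference that matters in a security analysis, namely the worst-case PAIS. The per-PAIS rates below are taken from the example; the code itself is illustrative.

```python
# Per-PAIS detection rates: 10 PAIS, 100 attack presentations each.
system_1 = [1.00] * 9 + [0.00]   # perfect on 9 PAIS, misses the 10th entirely
system_2 = [0.85] * 10           # 85 % detection on every PAIS

for name, rates in [("System 1", system_1), ("System 2", system_2)]:
    overall = sum(rates) / len(rates)   # pooled rate over equal-sized PAIS
    worst = min(rates)                  # the rate an informed attacker faces
    print(f"{name}: overall {overall:.0%} detected, worst PAIS {worst:.0%} detected")

# System 1: overall 90 %, worst PAIS 0 %  -> the 10th PAIS succeeds every time.
# System 2: overall 85 %, worst PAIS 85 % -> no single PAIS guarantees success.
```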
c) Cooperation

Many biometric performance tests address applications such as access control in which subjects are cooperative. Errors due to incorrect operation are an issue of a lack of knowledge, experience or guidance rather than intent. Significant uncooperative behaviour in a group is not part of the underlying “biometric model” and would render the determined error rates almost useless for biometric performance testing.

PAD tests include subjects whose behaviour is not cooperative. Attackers will try to find and exploit any weakness of the biometric system, circumventing or manipulating its intended operation. Presentation attack types, based on the experience and knowledge of the tester, can change the success rates for an attack dramatically. Hence it can be difficult to define testing procedures that measure error rates in a fashion representative of cooperative behaviour.
d) Automated testing

In biometric performance testing, it is often possible to test comparison algorithms using databases from devices or sensors of similar quality. Performance can be measured in a technology evaluation using previously collected corpora of samples as specified in ISO/IEC 19795-1.

In PAD testing, data from the biometric capture device (e.g. digitized fingerprint images) can in some cases be insufficient to conduct evaluations. Biometric systems with PAD mechanisms often contain additional sensors to detect specific properties of a biometric characteristic. Hence, a database previously collected for a specific biometric system or configuration is not necessarily suitable for another biometric system or configuration.

Even slight changes in the hardware or software could make earlier measurements useless. It is generally impractical to store multivariate synchronized PAD signals and replay them in automated testing. Therefore, automated testing is often not an option for testing and evaluating PAD mechanisms.

e) Quality and performance

In biometric performance testing, performance is usually linked directly to biometric data quality. Low-quality samples generally result in higher error rates, while a test with only high-quality samples will generally result in lower error rates. Quality metrics are therefore often used to improve performance (dependent on the application).

In PAD testing, even though low biometric quality can cause an artefact to be unsuccessful, there is no reason to assume a certain quality level from artefacts in general. Samples from artefacts can exhibit better quality than samples from human biometric characteristics. Without a model of attacker skill, it seems valid (at least in a security evaluation) to assume a “worst case” scenario where the attacker always uses the best possible quality. That way, one can at least determine a guaranteed minimal detection rate for the specific test set while reducing the number of necessary tests at the same time. It is then a matter of rating the attack potential of successful artefacts (effort and expertise for the needed quality) in order to assess the security level, as is the practice in Common Criteria evaluations.

Based on the differences in a) through e), the following general comments regarding error rates and metrics related to PAD mechanisms can be derived.

— In an evaluation, PAIS are analysed/rated separately.

— Attack presentation classification error rates other than 0 % for a PAIS only prove that the PAI can be successful. A different tester can potentially achieve a higher or lower attack presentation classification error rate. Further, training to identify the relevant material and presentation parameters could increase the attack presentation classification error rate for this PAIS. The experience and knowledge of the tester, as well as the availability of the necessary resources, are significant factors in PAD testing and are taken into account when conducting comparisons or performance analysis.

Error rates for PAD mechanisms are determined by the specific context of the given PAD mechanism, the set of PAIS, the application, the test approach, and the tester. Error rates for PAD mechanisms are not necessarily comparable across similar tests, and error rates for PAD mechanisms are not necessarily reproducible by different test laboratories.
INTERNATIONAL STANDARD ISO/IEC 30107-3:2023(E)
Information technology — Biometric presentation attack
detection —
Part 3:
Testing and reporting
1 Scope

This document establishes:
— principles and methods for the performance assessment of presentation attack detection (PAD) mechanisms;
— reporting of testing results from evaluations of PAD mechanisms; and
— a classification of known attack types (Annex A).

Outside the scope are:
— standardization of specific PAD mechanisms;
— detailed information about countermeasures (i.e. anti-spoofing techniques), algorithms or sensors; and
— overall system-level security or vulnerability assessment.

The attacks considered in this document take place at the biometric capture device during presentation. Any other attacks are considered outside the scope of this document.
2 Normative references

The following documents are referred to in the text in such a way that some or all of their content constitutes requirements of this document. For dated references, only the edition cited applies. For undated references, the latest edition of the referenced document (including any amendments) applies.

ISO/IEC 2382-37, Information technology — Vocabulary — Part 37: Biometrics

ISO/IEC 15408-1, Information security, cybersecurity and privacy protection — Evaluation criteria for IT security — Part 1: Introduction and general model

ISO/IEC 15408-2, Information security, cybersecurity and privacy protection — Evaluation criteria for IT security — Part 2: Security functional components

ISO/IEC 15408-3, Information security, cybersecurity and privacy protection — Evaluation criteria for IT security — Part 3: Security assurance components

ISO/IEC 19795-1, Information technology — Biometric performance testing and reporting — Part 1: Principles and framework

ISO/IEC 30107-1, Information technology — Biometric presentation attack detection — Part 1: Framework

3 Terms and definitions

For the purposes of this document, the terms and definitions given in ISO/IEC 19795-1, ISO/IEC 2382-37, ISO/IEC 30107-1 and the following apply.

ISO and IEC maintain terminology databases for use in standardization at the following addresses:
— ISO Online browsing platform: available at https://www.iso.org/obp
— IEC Electropedia: available at https://www.electropedia.org/

3.1 Attack elements

3.1.1
presentation attack
attack presentation
presentation to the biometric data capture subsystem with the goal of interfering with the operation of the biometric system

Note 1 to entry: An attack presentation can be a single attempt, a multi-attempt transaction, or another type of interaction with a subsystem.

3.1.2
bona fide presentation
interaction of the biometric test subject and the biometric data capture subsystem in the fashion intended by the policy of the biometric system

Note 1 to entry: Bona fide is analogous to normal or routine, when referring to a bona fide presentation.

Note 2 to entry: Bona fide presentations can include those in which the user has a low level of training or skill. Bon
...
