Space - Use of GNSS-based positioning for road Intelligent Transport Systems (ITS) - Field tests definition for basic performance

The purpose is to define the tests to be performed in order to evaluate the performance of GNSS-based positioning terminals (GBPT) for road applications. To fully define the tests, this task will address the test strategy, the facilities to be used, the test scenarios (e.g. environments and characteristics, which shall allow the comparison of different tests), and the test procedures. The defined tests and process will be validated by performing various in-field tests. The defined tests focus essentially on accuracy, integrity and availability, as required in the statement of work included in the invitation to tender.
This document will contribute to:
- The consolidation of EN 16803-1: "Definitions and system engineering procedures for the establishment and assessment of performances"
- The elaboration of EN 16803-2: "Assessment of basic performances of GNSS-based positioning terminals"
- The elaboration of EN 16803-3: "Assessment of security performances of GNSS based positioning terminals".

Definition von Feldtests für Grundleistungen

Espace - Utilisation de la localisation basée sur les GNSS pour les systèmes de transport routiers intelligents - Définition des essais terrains pour les performances générales

Vesolje - Ugotavljanje položaja z uporabo sistema globalne satelitske navigacije (GNSS) pri inteligentnih transportnih sistemih (ITS) v cestnem prometu - Opredelitev terenskih preskusov za osnovno zmogljivost

General Information

Status: Published
Public Enquiry End Date: 15-Jan-2020
Publication Date: 14-May-2020
Technical Committee:
Current Stage: 6060 - National Implementation/Publication (Adopted Project)
Start Date: 12-May-2020
Due Date: 17-Jul-2020
Completion Date: 15-May-2020

Buy Standard

Technical report: SIST-TP CEN/TR 17465:2020 - BARVE (English, 144 pages)
Technical report: kSIST-TP FprCEN/TR 17465:2020 - BARVE (English, 137 pages)

Standards Content (sample)

SLOVENSKI STANDARD
SIST-TP CEN/TR 17465:2020
01-julij-2020
Vesolje - Ugotavljanje položaja z uporabo sistema globalne satelitske navigacije
(GNSS) pri inteligentnih transportnih sistemih (ITS) v cestnem prometu -
Opredelitev terenskih preskusov za osnovno zmogljivost

Space - Use of GNSS-based positioning for road Intelligent Transport Systems (ITS) -

Field tests definition for basic performance
Definition von Feldtests für Grundleistungen
Espace - Utilisation de la localisation basée sur les GNSS pour les systèmes de

transport routiers intelligents - Définition des essais terrains pour les performances

générales
This Slovenian standard is identical to: CEN/TR 17465:2020
ICS:
33.070.40 Satellite
35.240.60 IT applications in transport

SIST-TP CEN/TR 17465:2020 en,fr,de

2003-01. Slovenski inštitut za standardizacijo (Slovenian Institute for Standardization). Reproduction of the whole or parts of this standard is not permitted.

TECHNICAL REPORT
CEN/TR 17465
RAPPORT TECHNIQUE
TECHNISCHER BERICHT
April 2020
ICS 03.220.20; 33.060.30; 35.240.60
English version
Space - Use of GNSS-based positioning for road Intelligent
Transport Systems (ITS) - Field tests definition for basic
performance

Espace - Utilisation de la localisation basée sur les GNSS pour les systèmes de transport routiers intelligents - Définition des essais terrains pour les performances générales

Definition von Feldtests für Grundleistungen

This Technical Report was approved by CEN on 23 February 2020. It has been drawn up by the Technical Committee CEN/CLC/JTC 5.

CEN and CENELEC members are the national standards bodies and national electrotechnical committees of Austria, Belgium, Bulgaria, Croatia, Cyprus, Czech Republic, Denmark, Estonia, Finland, France, Germany, Greece, Hungary, Iceland, Ireland, Italy, Latvia, Lithuania, Luxembourg, Malta, Netherlands, Norway, Poland, Portugal, Republic of North Macedonia, Romania, Serbia, Slovakia, Slovenia, Spain, Sweden, Switzerland, Turkey and United Kingdom.
CEN-CENELEC Management Centre: Rue de la Science 23, B-1040 Brussels

© 2020 CEN/CENELEC. All rights of exploitation in any form and by any means reserved worldwide for CEN national Members and for CENELEC Members. Ref. No. CEN/TR 17465:2020 E
CEN/TR 17465:2020 (E)
Contents Page

European foreword .................................................................................................................................................... 6

1 Scope ....................................................................................................................................................................... 7

2 Normative references ....................................................................................................................................... 8

3 Terms and definitions ....................................................................................................................................... 8

4 List of acronyms .................................................................................................................................................. 8

5 Definition of the general strategy: what kind of tests? ......................................................................... 9

5.1 General........................................................................................................................................................... 9

5.2 GBPT characterization .................................................................................................................................... 9

5.2.1 A hybrid and heterogeneous system ................................................................................................... 9

5.2.2 Test combinatorial explosion: an issue ............................................................................................. 10

5.2.3 Proposed approach ......................................................................................................................................... 11

5.3 Stakeholders and responsibilities ................................................................................................................. 13

5.3.1 Industry value chain ........................................................................................................................................ 13

5.3.2 Roles and responsibilities ............................................................................................................................... 14

5.4 Main criteria for testing strategy .................................................................................................................. 16

5.5 Potential test methods ................................................................................................................................. 17

5.5.1 General ........................................................................................................................................................... 17

5.5.2 Simulations ..................................................................................................................................................... 17

5.5.3 Field test ......................................................................................................................................................... 17

5.5.4 Record and Replay .......................................................................................................................................... 18

5.5.5 Verification methods face-off table with regard to main criteria for testing strategy ................................... 19

5.6 Metrics coverage .......................................................................................................................................... 21

5.6.1 Need for refinement of the metrics data sets .................................................................................. 21

5.6.2 Unique data collection for the accuracy, availability, integrity metrics ......................................................... 21

5.6.3 Same data collection for a flexible list of road applications ........................................................................... 22

5.6.4 Particular case of the integrity risk ................................................................................................................. 22

5.6.5 Particular case of the TTFF assessment .......................................................................................................... 24

5.6.6 Fit the test methodology to the metrics purposes ......................................................................................... 24

5.7 Recommendation for testing strategy .......................................................................................................... 26

5.7.1 General ........................................................................................................................................................... 26

5.7.2 Proposal for a homologation plan (case of complex hybridized GBPT systems considered) ......................... 27

6 Definition of the operational scenario: how to configure the tests? ............................................. 30

6.1 General......................................................................................................................................................... 30


6.2 Preamble ...................................................................................................................................................... 30

6.3 Status of definition of an operational scenario: discussions ......................................................................... 32

6.3.1 General ............................................................................................................................................................ 32

6.3.2 Set-up conditions ............................................................................................................................................. 32

6.3.3 Trajectory/motion ........................................................................................................................................... 34

6.3.4 Environmental conditions ................................................................................................................................ 35

6.3.5 Synthesis on the construction of the operational scenario ............................................................................. 37

6.4 Descriptions of operational scenarios expected for record and replay testing .............................................. 38

6.4.1 General ............................................................................................................................................................ 38

6.4.2 Set-up conditions ............................................................................................................................................. 38

6.4.3 Selection of roads ............................................................................................................................................ 40

6.4.4 Selection of kinds of trips ................................................................................................................................ 41

6.4.5 Crossing selected roads and trips: proposed organization of tests ................................................................. 42

6.5 Operational scenarios for TTFF field tests ..................................................................................................... 46

6.5.1 General ............................................................................................................................................................ 46

6.5.2 Set-up conditions ............................................................................................................................................. 47

6.5.3 Trajectories ...................................................................................................................................................... 47

6.5.4 Environmental conditions ................................................................................................................................ 48

6.5.5 Proposed combinations ................................................................................................................................... 48

7 Definition of the metrics and related tools: what to measure? .......................................................49

7.1 General ......................................................................................................................................................... 49

7.2 Accuracy metrics ........................................................................................................................................... 50

7.2.1 General ............................................................................................................................................................ 50

7.2.2 Integrity metrics .............................................................................................................................................. 60

7.3 Availability metrics ....................................................................................................................................... 74

7.4 Continuity metrics ........................................................................................................................................ 75

7.5 Timing metrics .............................................................................................................................................. 76

7.6 Synthesis on the receiver outputs to be collected ......................................................................................... 80

7.7 Synthesis of the metrics computation tool functions .................................................................................... 80

8 Definition of the test facilities: which equipment to use? .................................................................81

8.1 General ......................................................................................................................................................... 81

8.2 Equipment panorama and characterization .................................................................................................. 81

8.2.1 Equipment for in-field data collection ............................................................................................................. 81

8.2.2 Laboratory Test-beds ....................................................................................................................................... 86

8.2.3 Log and Replay Solutions ................................................................................................................................. 91

8.3 Equipment Justification ................................................................................................................................ 95

8.3.1 Equipment for in-field data collection ............................................................................................................. 95

8.3.2 “Log & Replay” Solutions ................................................................................................................................. 98

9 Definition of the test procedures: how to proceed to the tests?................................................... 100


9.1 General....................................................................................................................................................... 100

9.2 Field tests for recording the in-file data of the standardized operational scenario ..................................... 101

9.2.1 General ......................................................................................................................................................... 101

9.2.2 Test plan........................................................................................................................................................ 101

9.2.3 Good functioning verification ....................................................................................................................... 104

9.2.4 Field test conducting ..................................................................................................................................... 105

9.2.5 Data analysis and archiving ........................................................................................................................... 107

9.3 Characterization of environment ................................................................................................................ 108

9.4 Replay step: assessing the DUT performances ............................................................................................ 109

10 Definition of the validation procedures: how to be sure of the results? ............................... 110

10.1 General....................................................................................................................................................... 110

10.2 Presentation of a scenario: rush time in Toulouse ...................................................................................... 110

10.3 Quality of the reference trajectory ............................................................................................................. 112

10.4 Availability and regularity of the DUT’s outputs for the metrics computations ...................... 113

10.5 Statistical representativeness of the results ....................................................................................... 114

11 Definition of the synthesis report: how to report the results of the tests? .......................... 118

11.1 General....................................................................................................................................................... 118

11.2 Identification of the DUT ............................................................................................................................ 118

11.3 Identifications of test ................................................................................................................................. 118

11.4 Personnel responsible for tests ...................................................................................................... 119

11.5 Identification of the tests stimuli ................................................................................................................ 119

11.6 Report of tests conditions .......................................................................................................................... 119

11.7 Identification of the files including the test raw data ................................................................................. 119

11.8 Identification of tools used in post processing for computing the metrics .................................................. 119

11.9 Results ........................................................................................................................................................ 119

11.10 Date of report, responsible and contact, lab address and signatures ..................................................... 120

Annex A (normative) ETSI test definition for GBLS .................................................................................. 121

A.1 Synthetic reporting of the ETSI specification for the operational environment and tests

conditions ................................................................................................................................................................. 121

Annex B (normative) Detailed criteria for testing strategy ................................................................... 133


B.1 First criterion: trust in metrology ......................................................................................... 133

B.1.1 Reproducible results ........................................................................................................................... 133

B.1.2 Representative and meaningful results for the road applications ..................................... 135

B.1.3 Reliable procedure for the metric assessment .......................................................................... 135

B.2 Second criterion: reasonable cost for manufacturers ................................................... 135

B.2.1 Cost of test benches .............................................................................................................................. 135

B.2.2 Cost of the test operations ................................................................................................................. 136

B.2.3 Additional costs ..................................................................................................................................... 136

B.3 Third criterion: clear responsibility and sharing between actors ............................ 137

Annex C (normative) Size of data collection (GMV contribution) ....................................................... 139

C.1 Data campaign definition: sample size ............................................................................................. 139

C.2 Statistical significance ............................................................................................................................. 140

Bibliography ............................................................................................................................................................ 144

European foreword

This document (CEN/TR 17465:2020) has been prepared by Technical Committee CEN/TC 5 “Space”, the secretariat of which is held by DIN.

Attention is drawn to the possibility that some of the elements of this document may be the subject of patent rights. CEN shall not be held responsible for identifying any or all such patent rights.

1 Scope

This document is the output of WP1.2 “Field test definition for basic performances” of the GP-START project.

The GP-START project aims to prepare the draft standards CEN/CENELEC/TC5 16803-2 and 16803-3 for the Use of GNSS-based positioning for road Intelligent Transport Systems (ITS). Part 2: Assessment of basic performances of GNSS-based positioning terminals is the specific target of this document.

This document constitutes the field-test part of the Technical Report on the detailed definition of metrics and performance levels and on the field test definition for basic performances.

The purpose of WP1.2 is to define the field tests to be performed in order to evaluate the performance of GNSS-based positioning terminals (GBPT) for road applications. To fully define the tests, this task addresses the test strategy, the facilities to be used, the test scenarios (e.g. environments and characteristics, which should allow the comparison of different tests), and the test procedures. The defined tests and process will be validated by performing various in-field tests. The defined tests focus essentially on accuracy, integrity and availability, as required in the statement of work included in the invitation to tender.

This document will contribute to:

• the consolidation of EN 16803-1: Definitions and system engineering procedures for the establishment and assessment of performances;

• the elaboration of EN 16803-2: Assessment of basic performances of GNSS-based positioning terminals;

• the elaboration of EN 16803-3: Assessment of security performances of GNSS-based positioning terminals.
The document is structured as follows:
• Clause 1 is the present Scope;
• Clause 5 defines and justifies the global strategy for testing;
• Clause 6 defines and justifies the retained operational scenario;
• Clause 7 defines the metrics and related tools;
• Clause 8 defines the required test facilities;
• Clause 9 defines the test procedures;
• Clause 10 defines the validation procedures;
• Clause 11 defines how to report the test results.
2 Normative references

The following documents are referred to in the text in such a way that some or all of their content constitutes requirements of this document. For dated references, only the edition cited applies. For undated references, the latest edition of the referenced document (including any amendments) applies.

EN 16803-1:2016, Space — Use of GNSS-based positioning for road Intelligent Transport Systems (ITS) — Part 1: Definitions and system engineering procedures for the establishment and assessment of performances
3 Terms and definitions
No terms and definitions are listed in this document.

ISO and IEC maintain terminological databases for use in standardization at the following addresses:

• ISO Online browsing platform: available at http://www.iso.org/obp
• IEC Electropedia: available at http://www.electropedia.org/
4 List of acronyms
GNSS Global navigation satellite system
GPS Global positioning system
SBAS Satellite-based augmentation system
COTS Commercial off-the-shelf
GBPT GNSS-based positioning terminal
OTS Off-the-shelf
ITS Intelligent transport systems
ETSI European Telecommunications Standards Institute
A-GNSS Assisted GNSS
FAR False alarm rate
PFA Probability of false alarm
PMD Probability of missed detection
PPK Post-processed kinematic
AIA Accuracy, integrity, availability
SW Software
LoS Line of sight
5 Definition of the general strategy: what kind of tests?
5.1 General

The technical solutions for ITS (road environment) addressed by the targeted standard are increasingly complex.

One consequence is that their performances and behaviours no longer depend only on their design but also, and strongly, on many external situations and parameters that are not controlled by the stakeholders. Among those parameters are the status of worldwide space systems (GNSS), physical atmospheric conditions, and other environmental conditions in the proximity of the vehicle (traffic, tree foliage, nearby buildings, etc.).

As an example, this situation implies that any realization of a given field test procedure on a given product at a given date and hour will give a different result from the same test procedure on the same product in the same location at a different date and hour (the underlying stochastic process is neither ergodic nor stationary).

The obvious consequence is that, if a pure field test strategy is retained as the preferred solution for the performance assessment aiming at homologation of devices, the analysis of the test results would require specialists and could frequently result in intangible and unreliable interpretations, the opposite of metrology.
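This non-ergodic, non-stationary behaviour can be made concrete with a toy computation: evaluating the same accuracy statistic on two simulated runs of the same procedure yields noticeably different figures. The error model, seed values and magnitudes below are invented purely for illustration and are not taken from this report.

```python
import random
import statistics

def horizontal_error_series(seed, n=1000):
    """Simulate one field-test run: a horizontal position error series [m].
    The model (Gaussian noise plus sporadic multipath-like outliers) is
    purely illustrative, not a model prescribed by the report."""
    rng = random.Random(seed)
    errors = []
    for _ in range(n):
        e = abs(rng.gauss(0.0, 1.5))      # nominal, open-sky-like noise
        if rng.random() < 0.05:           # occasional multipath event
            e += rng.uniform(5.0, 30.0)
        errors.append(e)
    return errors

def p95(errors):
    """95th-percentile horizontal position error (a typical accuracy metric)."""
    return statistics.quantiles(errors, n=20)[-1]

# Same route, same receiver model, two different days (different seeds):
run_a = horizontal_error_series(seed=1)
run_b = horizontal_error_series(seed=2)
print(f"p95, run A: {p95(run_a):.2f} m")
print(f"p95, run B: {p95(run_b):.2f} m")
```

Because the 95th percentile sits near the outlier region, the two runs report different accuracy figures even though the "device" is identical, which is exactly why a single field campaign is a fragile basis for homologation.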

A solution to avoid this issue is to place total trust in simulations, where all the test conditions are controlled and perfectly repeatable. ETSI addressed a similar issue during its standardization process targeting GNSS-based Location Based Services (see ETSI TS 103 246-1, -2, -3, -4, -5). As a conclusion of its work, ETSI selected a solution exclusively based on simulations (see Annex A).

Considering that the real-life environment remains complex to simulate, the pure simulation technique leads to scenarios with a very large number of parameters to set up, inducing a risk of human manipulation errors and, in any case, a remaining lack of representativeness of reality.

New paradigms therefore have to be seriously considered, and this Clause 5 aims to open up solutions by analysing the best way to select and phase the tests to be performed in a standardized performance assessment.

5.2 GBPT characterization
5.2.1 A hybrid and heterogeneous system

According to Figure 3 of EN 16803-1, the positioning-based road ITS system is the integration of the GBPT into the road ITS application. Moreover, the GBPT is itself presented as a complex assembly of sensors with multiple interfaces to external systems.

The positioning level, on which this part of the document focuses for the definition of tests, is still an assembly of more or less complex components of which at least one is a GNSS sensor.

The generic architecture of a road ITS system (EN 16803-1, Figure 4) shows directly that the evaluation of the metrics related to positioning (accuracy, integrity, availability) will be complex, since it:

• covers intermediate outputs (position, speed) of a globally integrated system, likely not easy to capture in some future finally packaged and installed products: specific prescriptions and communication protocols should be standardized;

• depends on worldwide and independently evolving infrastructures, namely the GNSS infrastructure and telecommunication networks, interacting with each other (assisted and differential GNSS) and with the system itself, and in particular strongly influencing its performances;

• covers sensing of the external environment and consequently depends on a huge number of external environmental conditions (radio propagation for GNSS and telecommunications; light, fog and dust for camera and LIDAR; etc.);

• covers at the same time sensing of the motion of the vehicle through odometers and inertial sensors, and consequently depends on the vehicle and its driving as well as on additional external environmental conditions (e.g. meteorological conditions, or road-holding for the odometer).

In EN 16803-1:2016, Clause 8 presents a long (even if not exhaustive) list of parameters which should impact the definition of the tests.
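To make the scale of this evaluation task concrete, the sketch below shows the kind of per-epoch bookkeeping a metrics-computation tool performs when deriving accuracy, integrity and availability figures from a terminal's outputs compared against a reference trajectory. The record layout (`Epoch`), the metric formulas and the 25 m alert limit are illustrative assumptions; the authoritative definitions are those of EN 16803-1.

```python
from dataclasses import dataclass

@dataclass
class Epoch:
    """One output epoch of the terminal under test, compared against the
    reference trajectory. Field names are illustrative, not from EN 16803."""
    has_fix: bool              # did the GBPT output a position at this epoch?
    error_m: float             # horizontal position error vs. reference [m]
    protection_level_m: float  # terminal's own error bound, if it provides one

def aia_metrics(epochs, alert_limit_m=25.0):
    """Illustrative accuracy / integrity / availability figures.
    Returns (availability, 95th-percentile error, integrity risk)."""
    fixed = [e for e in epochs if e.has_fix]
    availability = len(fixed) / len(epochs)
    errors = sorted(e.error_m for e in fixed)
    p95_accuracy = errors[int(0.95 * len(errors)) - 1] if errors else None
    # "Misleading information": the terminal claimed a bound smaller than
    # its actual error while the error exceeded the alert limit.
    misleading = sum(1 for e in fixed
                     if e.error_m > e.protection_level_m
                     and e.error_m > alert_limit_m)
    integrity_risk = misleading / len(fixed) if fixed else None
    return availability, p95_accuracy, integrity_risk

# Synthetic example: 95 nominal epochs, 3 misleading epochs, 2 outages.
epochs = ([Epoch(True, 1.0, 10.0)] * 95
          + [Epoch(True, 30.0, 5.0)] * 3
          + [Epoch(False, 0.0, 0.0)] * 2)
avail, acc95, risk = aia_metrics(epochs)
print(avail, acc95, risk)   # 0.98, 1.0, ~0.031
```

The point of the sketch is that all three metrics are computed from the same epoch-wise data collection, which is why the report argues for a unique data collection covering accuracy, availability and integrity (see 5.6.2).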

Synthesizing together, namely in an integrated laboratory test facility, the effects of worldwide radio infrastructures (as they exist currently or will evolve in the future), their local radio propagation in the road environment, the motion sensing by inertial sensors, and other phenomena such as climatic effects for the odometer or imaging for computer vision, is today unfeasible.

Today, laboratory facilities for testing in the environmental conditions relevant to each sensor exist separately but are never integrated.

Figure 1 — Typical test bench in laboratory facilities

Left to right: GNSS signal generation (ESA radio navigation lab), radio oriented for radio navigat

...

SLOVENSKI STANDARD
kSIST-TP FprCEN/TR 17465:2020
01-januar-2020
Vesolje - Ugotavljanje položaja z uporabo sistema globalne satelitske navigacije
(GNSS) pri inteligentnih transportnih sistemih (ITS) v cestnem prometu -
Opredelitev terenskih preskusov za osnovno zmogljivost

Space - Use of GNSS-based positioning for road Intelligent Transport Systems (ITS) -

Field tests definition for basic performance
Definition von Feldtests für Grundleistungen
Espace - Utilisation de la localisation basée sur les GNSS pour les systèmes de

transport routiers intelligents - Définition des essais terrains pour les performances

générales
This Slovenian standard is identical to: FprCEN/TR 17465
ICS:
33.070.40 Satellite
35.240.60 IT applications in transport
kSIST-TP FprCEN/TR 17465:2020 en,fr,de

2003-01. Slovenian Institute for Standardization. Reproduction of this standard, in whole or in part, is not permitted.

TECHNICAL REPORT
FINAL DRAFT
FprCEN/TR 17465
RAPPORT TECHNIQUE
TECHNISCHER BERICHT
September 2019
ICS
English version
Space - Use of GNSS-based positioning for road Intelligent
Transport Systems (ITS) - Field tests definition for basic
performance

Espace - Utilisation de la localisation basée sur les GNSS pour les systèmes de transport routiers intelligents - Définition des essais terrains pour les performances générales

Definition von Feldtests für Grundleistungen

This draft Technical Report is submitted to CEN members for Vote. It has been drawn up by the Technical Committee

CEN/CLC/JTC 5.

CEN and CENELEC members are the national standards bodies and national electrotechnical committees of Austria, Belgium,

Bulgaria, Croatia, Cyprus, Czech Republic, Denmark, Estonia, Finland, France, Germany, Greece, Hungary, Iceland, Ireland, Italy,

Latvia, Lithuania, Luxembourg, Malta, Netherlands, Norway, Poland, Portugal, Republic of North Macedonia, Romania, Serbia,

Slovakia, Slovenia, Spain, Sweden, Switzerland, Turkey and United Kingdom.

Recipients of this draft are invited to submit, with their comments, notification of any relevant patent rights of which they are

aware and to provide supporting documentation.

Warning : This document is not a Technical Report. It is distributed for review and comments. It is subject to change without

notice and shall not be referred to as a Technical Report.
CEN-CENELEC Management Centre:
Rue de la Science 23, B-1040 Brussels

© 2019 CEN/CENELEC All rights of exploitation in any form and by any means Ref. No. FprCEN/TR 17465:2019 E

reserved worldwide for CEN national Members and for
CENELEC Members.
Contents

European foreword
1 Scope
2 Normative references
3 Terms and definitions
4 List of acronyms
5 Definition of the general strategy: what kind of tests?
5.1 General
5.2 GBPT characterization
5.2.1 A hybrid and heterogeneous system
5.2.2 Test combinatorial explosion: an issue
5.2.3 Proposed approach
5.3 Stakeholders and responsibilities
5.3.1 Industry value chain
5.3.2 Roles and responsibilities
5.4 Main criteria for testing strategy
5.5 Potential test methods
5.5.1 General
5.5.2 Simulations
5.5.3 Field test
5.5.4 Record and Replay
5.5.5 Verification methods face-off table with regard to main criteria for testing strategy
5.6 Metrics coverage
5.6.1 Need for refinement of the metrics data sets
5.6.2 Unique data collection for the accuracy, availability, integrity metrics
5.6.3 Same data collection for a flexible list of road applications
5.6.4 Particular case of the integrity risk
5.6.5 Particular case of the TTFF assessment
5.6.6 Fit the test methodology to the metrics purposes
5.7 Recommendation for testing strategy
5.7.1 General
5.7.2 Proposal for a homologation plan (case of complex hybridized GBPT systems considered)
6 Definition of the operational scenario: how to configure the tests?
6.1 General
6.2 Preamble
6.3 Status of definition of an operational scenario: discussions
6.3.1 General
6.3.2 Set-up conditions
6.3.3 Trajectory/motion
6.3.4 Environmental conditions
6.3.5 Synthesis on the construction of the operational scenario
6.4 Descriptions of operational scenarios expected for record and replay testing
6.4.1 General
6.4.2 Set-up conditions
6.4.3 Selection of roads
6.4.4 Selection of kinds of trips
6.4.5 Crossing selected roads and trips: proposed organization of tests
6.5 Operational scenarios for TTFF field tests
6.5.1 General
6.5.2 Set-up conditions
6.5.3 Trajectories
6.5.4 Environmental conditions
6.5.5 Proposed combinations
7 Definition of the metrics and related tools: what to measure?
7.1 General
7.2 Accuracy metrics
7.2.1 General
7.2.2 Integrity metrics
7.3 Availability metrics
7.4 Continuity metrics
7.5 Timing metrics
7.6 Synthesis on the receiver outputs to be collected
7.7 Synthesis of the metrics computation tool functions
8 Definition of the test facilities: which equipment to use?
8.1 General
8.2 Equipment panorama and characterization
8.2.1 Equipment for in-field data collection
8.2.2 Laboratory test-beds
8.2.3 Log and Replay solutions
8.3 Equipment justification
8.3.1 Equipment for in-field data collection
8.3.2 “Log & Replay” solutions
9 Definition of the test procedures: how to proceed to the tests?
9.1 General
9.2 Field tests for recording the in-field data of the standardized operational scenario
9.2.1 General
9.2.2 Test plan
9.2.3 Good functioning verification
9.2.4 Field test conducting
9.2.5 Data analysis and archiving
9.3 Characterization of environment
9.4 Replay step: assessing the DUT performances
10 Definition of the validation procedures: how to be sure of the results?
10.1 General
10.2 Presentation of a scenario: rush time in Toulouse
10.3 Quality of the reference trajectory
10.4 Availability and regularity of the DUT’s outputs for the metrics computations
10.5 Statistical representativeness of the results
11 Definition of the synthesis report: how to report the results of the tests?
11.1 General
11.2 Identification of the DUT
11.3 Identification of tests
11.4 Person responsible for the tests
11.5 Identification of the test stimuli
11.6 Report of test conditions
11.7 Identification of the files including the test raw data
11.8 Identification of tools used in post-processing for computing the metrics
11.9 Results
11.10 Date of report, responsible and contact, lab address and signatures
Annex A (normative) ETSI test definition for GBLS
A.1 Synthetic reporting of the ETSI specification for the operational environment and test conditions
Annex B (normative) Detailed criteria for testing strategy
B.1 First criterion: trust in metrology
B.1.1 Reproducible results
B.1.2 Representative and meaningful results for the road applications
B.1.3 Reliable procedure for the metric assessment
B.2 Second criterion: reasonable cost for manufacturers
B.2.1 Cost of test benches
B.2.2 Cost of the test operations
B.2.3 Additional costs
B.3 Third criterion: clear responsibility and sharing between actors
Annex C (normative) Size of data collection (GMV contribution)
C.1 Data campaign definition: sample size
C.2 Statistical significance
Bibliography

European foreword

This document (FprCEN/TR 17465:2019) has been prepared by Technical Committee CEN/TC 5

“Space”, the secretariat of which is held by DIN.
This document is currently submitted to the Vote on TR
1 Scope

This document is the output of WP1.2 “Field test definition for basic performances” of the GP-START project.

The GP-START project aims to prepare the draft standards CEN/CENELEC/TC5 16803-2 and 16803-3 for the use of GNSS-based positioning for road Intelligent Transport Systems (ITS). Part 2, Assessment of basic performances of GNSS-based positioning terminals, is the specific target of this document.

This document constitutes the part of the Technical Report on metrics and performance levels detailed definition and field test definition for basic performances that concerns the field test definition.

The purpose of WP1.2 is to define the field tests to be performed in order to evaluate the performances of road applications’ GNSS-based positioning terminals (GBPT). To fully define the tests, this task addresses the test strategy, the facilities to be used, the test scenarios (e.g. environments and characteristics, which should allow the comparison of different tests), and the test procedures. The defined tests and process will be validated by performing various in-field tests. The defined tests focus essentially on accuracy, integrity and availability, as required in the statement of work included in the invitation to tender.
This document will contribute to:

• the consolidation of EN 16803-1, Definitions and system engineering procedures for the establishment and assessment of performances;

• the elaboration of EN 16803-2, Assessment of basic performances of GNSS-based positioning terminals;

• the elaboration of EN 16803-3, Assessment of security performances of GNSS-based positioning terminals.
The document is structured as follows:
• Clause 1 is the present Scope;
• Clause 5 defines and justifies the global strategy for testing;
• Clause 6 defines and justifies the retained operational scenario;
• Clause 7 defines the metrics and related tools;
• Clause 8 defines the required tests facilities;
• Clause 9 defines the tests procedures;
• Clause 10 defines the validation procedures;
• Clause 11 defines how to report the tests results.
2 Normative references

The following documents are referred to in the text in such a way that some or all of their content constitutes requirements of this document. For dated references, only the edition cited applies. For undated references, the latest edition of the referenced document (including any amendments) applies.

EN 16803-1:2016, Space — Use of GNSS-based positioning for road Intelligent Transport Systems (ITS) — Part 1: Definitions and system engineering procedures for the establishment and assessment of performances
3 Terms and definitions
No terms and definitions are listed in this document.

ISO and IEC maintain terminological databases for use in standardization at the following addresses:

• ISO Online browsing platform: available at http://www.iso.org/obp
• IEC Electropedia: available at http://www.electropedia.org/
4 List of acronyms
GNSS Global navigation satellite system
GPS Global positioning system
SBAS Satellite-based augmentation system
COTS Commercial off-the-shelf
GBPT GNSS-based positioning terminal
OTS Off-the-shelf
ITS Intelligent transport systems
ETSI European Telecommunications Standards Institute
A-GNSS Assisted GNSS
FAR False alarm rate
PFA Probability of false alarm
PMD Probability of missed detection
PPK Post-processed kinematic
AIA Accuracy, integrity, availability
SW Software
LoS Line of sight
5 Definition of the general strategy: what kind of tests?
5.1 General

The technical solutions for ITS in the road environment addressed by the targeted standard are increasingly complex.

One consequence is that their performances and behaviours no longer depend only on their design but also, and strongly, on many external situations and parameters that are not controlled by the stakeholders. Among those parameters are the dependencies on the status of worldwide space systems (GNSS), on atmospheric conditions, and on other environmental conditions in the proximity of the vehicle (traffic, tree foliage, nearby buildings, etc.).

As an example, this situation implies that any execution of a field test procedure on a given product at a given date and hour will give a different result from the same test procedure applied to the same product in the same location at a different date and hour (the underlying stochastic process is neither ergodic nor stationary).
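As an illustrative sketch (not part of the standard's procedures), the following Python fragment mimics this behaviour: the same accuracy metric, computed from repeated runs of the same procedure on the same route, differs from one session to another because the error process is not stationary. All numerical values are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

def field_test_run(session_bias_m, n_epochs=600):
    """One simulated field-test run (10 min at 1 Hz): horizontal position
    error = session-dependent bias + epoch noise. The bias stands in for
    slowly varying multipath/atmospheric conditions that change with the
    date and hour of the test (illustrative values only)."""
    noise = rng.normal(0.0, 1.5, size=n_epochs)   # epoch-to-epoch noise [m]
    return np.abs(session_bias_m + noise)          # horizontal error [m]

# Same product, same route, three different dates/hours: satellite
# geometry and local propagation differ, so the session bias differs,
# and the measured accuracy metric differs with it.
hpe95_by_session = {}
for session, bias in [("day 1", 0.5), ("day 2", 2.0), ("day 3", 4.0)]:
    hpe95_by_session[session] = float(np.percentile(field_test_run(bias), 95))
    print(f"{session}: 95th-percentile horizontal error = "
          f"{hpe95_by_session[session]:.1f} m")
```

A homologation scheme built on such raw field results alone would therefore compare numbers drawn from different, uncontrolled conditions, which is the repeatability problem discussed above.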

The obvious consequence is that, if a pure field test strategy is adopted as the preferred solution for the performance assessment aiming at the homologation of devices, the analysis of the test results would require specialists, and may frequently result in intangible and unreliable interpretations, the opposite of metrology.

One solution to avoid this issue is to place total trust in simulations, where all the test conditions are controlled and can be perfectly repeated. ETSI addressed a similar issue during its standardization process targeting GNSS-based Location-Based Services (see ETSI TS 103 246-1, −2, −3, −4, −5). As a conclusion of its work, ETSI selected a solution exclusively based on simulations (see Annex A).

However, since the real-life environment remains complex to simulate, a pure simulation technique leads to scenarios with a very great number of parameters to set up, inducing a risk of human manipulation errors and, in any case, a remaining lack of representativeness of reality.

New paradigms therefore have to be seriously considered, and this Clause 5 aims to open up solutions by analysing the best way to select and phase the tests to be performed in a standardized performance assessment.

5.2 GBPT characterization
5.2.1 A hybrid and heterogeneous system

According to Figure 3 of EN 16803-1, a positioning-based road ITS system is the integration of the GBPT into the road ITS application. Moreover, the GBPT is itself presented as a complex assembly of sensors, with multiple interfaces to external systems.

The positioning level, which is the focus of this part of the document for the definition of tests, is still an assembly of more or less complex components in which at least one component is a GNSS sensor.

The generic architecture of a road ITS system (EN 16803-1, Figure 4) shows directly that the evaluation of the metrics related to the positioning (accuracy, integrity, availability) will be complex, since it:

• covers intermediate outputs (position, speed) of a global integrated system, likely not easy to capture in some future finally packaged and installed products: specific prescriptions and communication protocols should be standardized;

• depends on worldwide and independently evolving infrastructures, namely the GNSS infrastructure and telecommunication networks, which interact with each other (Assisted and Differential GNSS) and with the system itself, and in particular strongly influence its performances;

• covers sensing of the external environment and consequently depends on a huge number of external environmental conditions (radio propagation for GNSS and telecommunications; light, fog and dust for cameras and LIDAR; etc.);

• covers at the same time sensing of the motion of the vehicle through odometers and inertial sensors and consequently depends on the vehicle and its driving as well as on additional external environmental conditions (e.g. meteorological conditions, or road-holding for the odometer).
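To make the accuracy part of these metrics concrete, the sketch below (a simplified illustration, not the EN 16803-1 assessment procedure itself) computes the horizontal position error of a device under test (DUT) against an epoch-synchronized reference trajectory and summarizes it with percentiles, in the spirit of percentile-based accuracy figures. All coordinates are invented.

```python
import numpy as np

# Illustrative epoch-synchronized horizontal positions in a local
# East/North frame [m]: reference trajectory vs device under test (DUT).
ref = np.array([[0.0, 0.0], [1.0, 0.5], [2.1, 1.0], [3.0, 1.4], [4.2, 2.0]])
dut = np.array([[0.4, 0.1], [1.2, 0.2], [2.0, 1.6], [3.5, 1.2], [4.0, 2.9]])

# Horizontal position error (HPE) per epoch: Euclidean distance
# between DUT output and reference position.
hpe = np.linalg.norm(dut - ref, axis=1)

# Percentile-based accuracy summary (50th/75th/95th percentiles)
acc = {p: float(np.percentile(hpe, p)) for p in (50, 75, 95)}
print({p: round(v, 2) for p, v in acc.items()})
```

In a real assessment the two trajectories come from the DUT log and from a high-grade reference system, the epochs must be rigorously time-aligned, and the sample must span the standardized operational scenarios; the computation above only shows the final metric step.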

In EN 16803-1:2016, Clause 8 presents a long (even if not exhaustive) list of parameters which should impact the definition of the tests.

Synthesizing together, in a single integrated laboratory test facility, the effects of worldwide radio infrastructures (as they currently exist or will evolve in the future), their local radio propagation in the road environment, the motion sensing by inertial sensors, and other phenomena such as climatic effects for the odometer or imaging for computer vision, is today unfeasible.
Today, the laboratory facilities for making tests in the environmental conditions relevant to each sensor exist separately but are never integrated.

Figure 1 — Typical test bench in laboratory facilities

Left to right: GNSS signal generation (ESA radio navigation lab), radio oriented for radio navigat…
