CEN/TR 15449-5:2015
Geographic information - Spatial data infrastructures - Part 5: Validation and testing
This part of the Technical Report provides guidance for validation and testing of data, metadata and services, as the main Spatial Data Infrastructure (SDI) components defined in other parts of CEN/TR 15449.
The guidance is given by means of examples of the validation and testing process required to assure conformance with the requirements existing in the relevant standards and guidelines.
Geoinformationen - Geodateninfrastrukturen - Teil 5: Validierung und Tests
Information géographique - Infrastructure de données spatiales - Partie 5 : Validation et essais
Geografske informacije - Infrastruktura za prostorske podatke - 5. del: Validacija in preskušanje
Ta del tehničnega poročila podaja napotke za validacijo in preskušanje podatkov, metapodatkov in storitev kot glavnih komponent infrastrukture za prostorske podatke, določene v drugih delih standarda CEN/TR 15449.
Napotki so podani s primeri validacijskih in preskusnih postopkov, potrebnih za zagotavljanje skladnosti z obstoječimi zahtevami v ustreznih standardih in smernicah.
General Information
Standards Content (Sample)
SLOVENSKI STANDARD
01-december-2016
Geografske informacije - Infrastruktura za prostorske podatke - 5. del: Validacija in
preskušanje
Geographic information - Spatial data infrastructures - Part 5: Validation and testing
Geoinformationen - Geodateninfrastrukturen - Teil 5: Validierung und Tests
Information géographique - Infrastructure de données spatiales - Partie 5 : Validation et
essais
Ta slovenski standard je istoveten z: CEN/TR 15449-5:2015
ICS:
07.040 Astronomija. Geodezija. Geografija / Astronomy. Geodesy. Geography
35.240.70 Uporabniške rešitve IT v znanosti / IT applications in science
2003-01. Slovenski inštitut za standardizacijo. Razmnoževanje celote ali delov tega standarda ni dovoljeno.
TECHNICAL REPORT
CEN/TR 15449-5
RAPPORT TECHNIQUE
TECHNISCHER BERICHT
April 2015
ICS 07.040; 35.240.70
English Version
Geographic information - Spatial data infrastructures - Part 5:
Validation and testing
Information géographique - Infrastructure de données
spatiales - Partie 5 : Validation et essais
This Technical Report was approved by CEN on 21 June 2014. It has been drawn up by the Technical Committee CEN/TC 287.
CEN members are the national standards bodies of Austria, Belgium, Bulgaria, Croatia, Cyprus, Czech Republic, Denmark, Estonia,
Finland, Former Yugoslav Republic of Macedonia, France, Germany, Greece, Hungary, Iceland, Ireland, Italy, Latvia, Lithuania,
Luxembourg, Malta, Netherlands, Norway, Poland, Portugal, Romania, Slovakia, Slovenia, Spain, Sweden, Switzerland, Turkey and United
Kingdom.
EUROPEAN COMMITTEE FOR STANDARDIZATION
COMITÉ EUROPÉEN DE NORMALISATION
EUROPÄISCHES KOMITEE FÜR NORMUNG
CEN-CENELEC Management Centre: Avenue Marnix 17, B-1000 Brussels
© 2015 CEN. All rights of exploitation in any form and by any means reserved worldwide for CEN national Members. Ref. No. CEN/TR 15449-5:2015 E
Contents Page
Foreword . 4
Introduction . 5
1 Scope . 7
2 Normative references . 7
3 Terms, definitions and abbreviations . 7
3.1 Terms and definitions . 7
3.2 Abbreviations . 7
4 Conformance and testing framework . 8
4.1 General . 8
4.2 Structure of the document . 9
5 Metadata validation . 10
5.1 General . 10
5.2 Validation against XML Schema . 10
5.3 Rule-based validation with Schematron . 11
6 Data validation . 11
6.1 General . 11
6.2 Validation against XML Schema . 12
6.3 Rule-based validation with Schematron . 12
6.4 INSPIRE Abstract Test Suite for Annex I, II and III data themes . 12
7 Service validation . 14
7.1 General . 14
7.2 View services . 15
7.2.1 General . 15
7.2.2 WMS . 15
7.2.3 WMTS . 17
7.3 Download services . 17
7.4 Quality of services . 19
Annex A (informative) Examples of validation processes . 21
A.1 Introduction . 21
A.2 Example of web map service validation . 21
A.3 Example of web feature service validation . 23
Annex B (informative) Example validation tools . 45
B.1 Introduction . 45
B.2 Data and metadata validation with Oxygen . 45
B.3 Validation with XMLSpy . 53
B.4 INSPIRE Validator . 56
B.4.1 Introduction . 56
B.4.2 Supported standards and technologies used . 57
B.4.3 Resource validation process . 58
B.5 OGC CITE . 60
B.6 eENVplus validation service . 60
Annex C (informative) Validation of specification encoding . 61
C.1 Introduction . 61
C.2 Validate Schema . 62
C.3 Check transposition of specification . 62
C.4 Check validatability . 62
C.5 Example of validation of specification encoding . 63
C.5.1 General . 63
C.5.2 Validate schema . 63
C.5.3 Check transposition of specification . 64
C.5.4 Check validatability . 69
C.5.5 Example of test suite for metadata rule-based validation . 75
C.5.6 Example of result of validation of the metadata specification encoding . 81
Bibliography . 84
Foreword
This document (CEN/TR 15449-5:2015) has been prepared by Technical Committee CEN/TC 287
“Geographic information”, the secretariat of which is held by BSI.
Attention is drawn to the possibility that some of the elements of this document may be the subject of patent
rights. CEN [and/or CENELEC] shall not be held responsible for identifying any or all such patent rights.
Introduction
Spatial data infrastructure (SDI) is a general term for the computerised environment for handling data that
relates to a position on or near the surface of the earth. It may be defined in a range of ways, in different
circumstances, from the local up to the global level.
This Technical Report focuses on the technical aspects of SDIs, thereby limiting the term SDI to mean an
implementation neutral technological infrastructure for geospatial data and services, based upon standards
and specifications. It does not consider an SDI as a carefully designed and dedicated information system;
rather, it is viewed as a collaborative framework of disparate information systems that contain resources that
stakeholders desire to share. The common denominator of SDI resources, which can be data or services, is
their spatial nature. It is understood that the framework is in constant evolution, and that therefore the
requirements for standards and specifications supporting SDI implementations evolve continuously.
SDIs are becoming more and more linked and integrated with systems developed in the context of e-
Government. Important drivers for this evolution are the Digital Agenda for Europe, and related policies (see
Part 1 of this Technical Report). By sharing emerging requirements at an early stage with the standardization
bodies, users of SDIs can help influence the revision of existing or the conception of new standards.
The users of an SDI are considered to be those individuals or organisations that, in the context of their
business processes, need to share and access geo-resources in a meaningful and sustainable way. Based
on platform- and vendor-neutral standards and specifications, an SDI aims at assisting organisations and
individuals in publishing, finding, delivering, and eventually, using geographic information and services over
the internet across borders of information communities in a more cost-effective manner.
Considering the complexity of the subject and the need to capture and formalize different conceptual and
modelling views, CEN/TR 15449 comprises multiple parts. The other parts, published previously, are:
— Part 1: Reference model: This provides a general context model for the other Parts, applying general IT architecture standards;
— Part 2: Best Practice: This provides best-practice guidance for implementing SDI, through the evaluation of projects in the frame of the European Union funding programmes;
— Part 3: Data centric view: This addresses the data, which includes application schemas and metadata;
— Part 4: Service centric view: This addresses the concepts of service specifications, the methodology for developing service specifications through the application of the relevant International Standards, and the content of such service specifications.
Further parts may be created in the future.
One of the major challenges in the implementation of an SDI is to ensure the conformity of its components
with the requirements specified in the relevant standards and guidelines. This applies to the data
specifications, the derived schemas, the spatial data sets and metadata and the network services. Only if
conformance is ensured, can true interoperability of the harmonized metadata and data by means of network
services be guaranteed. This Part (5) provides guidance for validation and testing of data, metadata and
services, as the main Spatial Data Infrastructure (SDI) components defined in other parts of this Technical
Report.
The intended readers of this document belong to a range of categories:
— technicians engaged in validation and testing of SDI components, who need to find reference material to
use within the validation and testing processes;
— managers who need to assess the complexity of the processes of validation and testing of SDI
components;
— data, metadata and network service providers, aiming at self-validating their own data sets, metadata or
services, who wish to implement validation and testing processes within their organizations;
— designers of data and metadata models, who need to validate their schemas;
— data users interested in acquiring a deeper knowledge about validation and testing processes of SDI
components.
Because the operation of SDIs in Europe is governed by the INSPIRE Directive 2007/2/EC and its relevant
legal and technical documents, this report takes INSPIRE as the reference context, even
though some concepts, wherever possible, are generalized beyond INSPIRE.
Because Validation and Testing is a subject in continuous and rapid evolution, and many different
implementations could exist based on different technical solutions, the topics covered in this report, as well
as the relevant examples provided in the Annexes, cannot be considered complete nor exhaustively
presented. In order to keep updated on the subject, the readers of this report are recommended to follow the
activities and outcomes of the Working Group 5 “Validation and Conformity Testing”, set up within the
INSPIRE MIG (Maintenance and Implementation Group) and MIF (Maintenance and Implementation Framework) 1).
1) http://inspire.ec.europa.eu/index.cfm/pageid/5160
1 Scope
This part of the Technical Report provides guidance for validation and testing of data, metadata and
services, as the main Spatial Data Infrastructure (SDI) components defined in other parts of the
CEN/TR 15449.
The guidance is given by means of examples of the validation and testing process required to ensure
conformance with the requirements existing in the relevant standards and guidelines.
National validation and testing contexts are out of the scope of this report.
2 Normative references
The following documents, in whole or in part, are normatively referenced in this document and are
indispensable for its application. For dated references, only the edition cited applies. For undated references,
the latest edition of the referenced document (including any amendments) applies.
EN ISO 19105:2005, Geographic information - Conformance and testing (ISO 19105:2000)
3 Terms, definitions and abbreviations
3.1 Terms and definitions
For the purposes of this document, the terms and definitions of EN ISO 19105:2005 shall apply.
3.2 Abbreviations
ATS: Abstract Test Suite
CRS: Coordinate Reference System
DS: Data Specifications
ESDIN: European Spatial Data Infrastructure with a Best Practice Network — a project supported by
eContent+ programme
ETF: ESDIN Testing Framework
ETS: Executable Test Suite
FE: Filter Encoding
GI: Geographic Information
GML: Geography Markup Language
ISO: International Organization for Standardization
IR: Implementing Rule
MD: Metadata
NA: Not Applicable
NS: Network Services
OGC: Open Geospatial Consortium
PS: Protected Sites
SLA: Service Level Agreement
SOAP: Simple Object Access Protocol
XML: eXtensible Markup Language
xsd: XML Schema Definition
XSLT: Extensible Stylesheet Language Transformations
W3C: World Wide Web Consortium
WFS: Web Feature Service
WMS: Web Map Service
WMTS: Web Map Tile Service
4 Conformance and testing framework
4.1 General
The implementation of rule-based validation requires the translation of rules, which are sometimes defined
only textually, into a machine-readable format; in other words, a formal rules language is needed. The rules
language within a testing environment offers a mechanism for recording conceptual-level data management
logic. The required characteristics are given in Table 1.
Table 1 — Characteristics of a rules language for data quality evaluation
• Intuitive: Language is naturalistic and easy to learn and use (by data experts).
• Compact: Concise grammar for manageability and ease of comprehension.
• Unambiguous: Domain constraints can be expressed mathematically.
• Quantitative: Reports formal metrics (measures) to assert the evaluated level of compliance to data quality rules.
• Portable: Logical separation between application domain and physical implementation models; allows rules to be expressed in terms of the application schema, which is easier for data experts to relate to and is likely to have a longer lifetime than specific technology implementations.
• Web Enabled: The language and the environment in which it is used shall be compatible with distributed data and web-based interfaces.
• Extensible: Continuous improvement cycles require rules to evolve with time; it is also unusual to be able to capture all data requirements up front, and over time these requirements are likely to be extended or subject to change.
There are several rules languages which can be used for this purpose. The choice of which to use often
depends on the chosen test engine within the testing environment. Since the data tests are usually executed
on an XML file, two main options are considered:
• an XML database (e.g. BaseX) and an XML-based query language (e.g. XPath/XQuery) are used to run
the translated test criteria and perform comprehensive analyses; to develop or extend existing test
criteria, an interactive graphical user interface is useful;
• the Schematron language is used to translate the necessary formal constraints.
Testing of geometrical criteria requires additional software components (e.g. JTS - Java Topology Suite, Vivid
Solutions).
In any case it is essential to know which criteria have been tested and which criteria shall be successfully
tested to consider the data set as conformant. Therefore a central repository might be useful to control the
content, the versions and changes of the test criteria.
The following sections and examples of this document relevant to data and metadata validation and testing
refer to the Schematron option mentioned above. More details about Schematron can be found at:
• ISO/IEC 19757-3:2006, Information technology - Document Schema Definition Languages (DSDL) –
Part 3: Rule-based validation – Schematron
• www.schematron.com
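As an illustration of the Schematron option, the following is a minimal, hypothetical Schematron schema. The element names mirror the ISO 19139 metadata encoding (gmd:MD_Metadata, gmd:fileIdentifier), but the rule itself is an invented example, not one of the normative INSPIRE constraints:

```xml
<sch:schema xmlns:sch="http://purl.oclc.org/dsdl/schematron">
  <sch:ns prefix="gmd" uri="http://www.isotc211.org/2005/gmd"/>
  <sch:pattern id="metadata.fileIdentifier">
    <sch:rule context="gmd:MD_Metadata">
      <!-- An assert expresses a constraint that plain XML Schema cannot -->
      <sch:assert test="gmd:fileIdentifier">
        A metadata record shall contain a file identifier.
      </sch:assert>
    </sch:rule>
  </sch:pattern>
</sch:schema>
```

A Schematron processor reports the message text for every context node for which the test expression evaluates to false, which is the mechanism referred to throughout Clauses 5 and 6.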
4.2 Structure of the document
In the following subsections a step by step process is described covering the validation process for:
• data/metadata encoding (according to the specification),
• network services.
For the validation of data/metadata encoding, the following steps are covered:
• Schema Validation: validation of the metadata or data documents against the corresponding XML
schema;
• Schematron Validation: the validation of non-syntactic requirements using semantic rules defined in
Schematron.
Validation approaches alternative to implementations based on XML schema and/or Schematron are
provided as well.
For the validation of network services (view, download, coverage), the following types of services are
covered:
• Functional requirements,
• Quality (performance) aspects.
Examples of validation processes are provided in Annex A, while examples of the use of validation tools are
provided in Annex B.
In Annex C a validation process is provided for metadata and data specification encoding, covering the
following steps:
• Schema Validation: checks that the schema that has been derived from the data model is a valid
schema according to the W3C XML Schema Recommendation 2);
2) The phrase ‘schema validation’ is generally used to describe validating an XML instance against its schema (from the
W3C, http://www.w3.org/TR/xmlschema11-1/). In that document, it is clearly distinct from assuring that the schema is itself
valid; that is described in a section headed ‘Errors in Schema Construction and Structure’. In
http://www.w3.org/TR/xmlschema11-1/, any software that goes on to check an XML instance against a schema is
allowed to declare an error if the schema itself is invalid.
• Validation of Transposition: checks that all elements from the Data Specification have been properly
transposed to XML Schema;
• Validatability: checks that all elements from the Data Specification have been transposed to the XML
Schema in a manner that allows for correct semantic interpretation and validation.
The validation process described in Annex C may be useful for those involved in the development of new or
extension of existing Data Specifications.
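The Schema Validation step above can, at its most basic level, be partially automated by parsing the XSD itself. The following Python sketch checks only well-formedness and the root element; full conformance to the W3C Recommendation requires a dedicated schema processor, so this is a pre-check, not a complete implementation:

```python
import xml.etree.ElementTree as ET

XS = "http://www.w3.org/2001/XMLSchema"

def basic_schema_check(xsd_text: str) -> list:
    """Cheap sanity checks on an XSD: well-formed XML, xs:schema root element."""
    try:
        root = ET.fromstring(xsd_text)
    except ET.ParseError as exc:
        return [f"not well-formed XML: {exc}"]
    if root.tag != f"{{{XS}}}schema":
        return ["root element is not xs:schema"]
    return []

# Invented minimal XSD used only to exercise the check.
xsd = ('<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">'
       '<xs:element name="MD_Metadata"/></xs:schema>')
print(basic_schema_check(xsd))  # -> []
```

A real validation chain would hand a document passing this pre-check to a conforming schema processor for the remaining construction-and-structure errors.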
5 Metadata validation
5.1 General
The metadata validation process encompasses several steps, shown in Figure 1. As a first step, the
metadata shall be validated against the metadata schema provided. As a second step, the metadata shall be
validated against formal constraints from the metadata specification using Schematron. In Figure 1, which is
relevant to an INSPIRE metadata validation, this second step has been split into two sub-steps: first,
validation with respect to the constraints related to the core metadata common to all the INSPIRE themes,
and then with respect to the theme-specific constraints.
Figure 1 — Metadata validation process
If an error is found and rectified, all validation steps shall be repeated, as the rectification process may have
introduced new errors.
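The requirement that all steps be repeated after any rectification amounts to a simple control loop. The following Python sketch illustrates the principle; the two validator functions are placeholders standing in for real XML Schema and Schematron validators:

```python
from typing import Callable, List, Tuple

def run_validation(document: str,
                   steps: List[Tuple[str, Callable[[str], List[str]]]]) -> List[str]:
    """Run every validation step in order and collect all error messages.

    Each step is a (name, validator) pair; a validator returns a list of
    error strings (empty when the document passes that step).
    """
    errors = []
    for name, validator in steps:
        errors.extend(f"{name}: {msg}" for msg in validator(document))
    return errors

# Placeholder validators standing in for XSD and Schematron validation.
def schema_check(doc: str) -> list:
    return [] if doc.startswith("<") else ["document is not XML"]

def rule_check(doc: str) -> list:
    return [] if "fileIdentifier" in doc else ["missing file identifier"]

steps = [("schema", schema_check), ("schematron", rule_check)]

document = "<MD_Metadata/>"
while True:
    errors = run_validation(document, steps)
    if not errors:
        break
    # A real workflow would rectify the document here; after any fix,
    # ALL steps are re-run, since a fix may introduce new errors.
    document = document.replace("<MD_Metadata/>",
                                "<MD_Metadata><fileIdentifier/></MD_Metadata>")
print(errors)  # -> []
```

The essential point is that the loop re-runs every step after each rectification rather than resuming from the failed step.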
5.2 Validation against XML Schema
Before using an XML document, it shall first be validated against the relevant schema using an XML
validation tool. Any errors identified during schema validation shall be rectified before the data can be used.
The tester shall ensure that the XML document refers to the proper schema; just because a data file is valid
XML does not mean that it is valid according to a specific schema.
As many tools used for the creation and validation of XML metadata may use different approaches, it is
advisable to validate the generated metadata using different validation tools. In this way, further errors in the
metadata can be found and rectified, reducing problems encountered when users attempt to work with faulty
metadata.
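The check that a document actually refers to the intended schema can itself be automated by inspecting the xsi:schemaLocation attribute. A minimal sketch using only the Python standard library (the target namespace and schema URL are invented examples):

```python
import xml.etree.ElementTree as ET

XSI = "http://www.w3.org/2001/XMLSchema-instance"

def declared_schema_locations(xml_text: str) -> dict:
    """Return the namespace -> schema URL pairs declared in xsi:schemaLocation."""
    root = ET.fromstring(xml_text)
    value = root.get(f"{{{XSI}}}schemaLocation", "")
    tokens = value.split()
    # xsi:schemaLocation is a whitespace-separated list of (namespace, URL) pairs
    return dict(zip(tokens[0::2], tokens[1::2]))

sample = (
    '<MD_Metadata xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" '
    'xsi:schemaLocation="http://example.org/md http://example.org/md/metadata.xsd"/>'
)
locations = declared_schema_locations(sample)
print(locations.get("http://example.org/md"))
# -> http://example.org/md/metadata.xsd
```

A tester can compare the returned pairs against the schema locations mandated by the specification before running the schema validator proper.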
5.3 Rule-based validation with Schematron
Once the metadata file has been shown to be formally valid against the schema provided, compliance with
further constraints arising from the metadata specification shall be checked. This should be done using
Schematron validation; a Schematron rule file shall be provided for each metadata specification. All
metadata provided in the form of XML documents shall pass validation according to the corresponding
Schematron file before being considered valid. While in most cases the Schematron rules are configured so
that they only display message texts for errors, some also provide informative texts.
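The effect of a Schematron assert can be approximated with XPath checks even without a Schematron processor. The following sketch uses the limited XPath support in the Python standard library; the element names mirror the ISO 19139 metadata encoding, but the rule is an illustrative invention:

```python
import xml.etree.ElementTree as ET

NS = {"gmd": "http://www.isotc211.org/2005/gmd"}

def check_rules(xml_text: str) -> list:
    """Evaluate Schematron-like asserts: (context path, test path, message)."""
    rules = [
        (".//gmd:MD_Metadata", "gmd:fileIdentifier",
         "A metadata record shall contain a file identifier."),
    ]
    root = ET.fromstring(xml_text)
    failures = []
    for context, test, message in rules:
        for node in root.findall(context, NS):
            # The assert fails when the test expression matches nothing
            if node.find(test, NS) is None:
                failures.append(message)
    return failures

good = ('<root xmlns:gmd="http://www.isotc211.org/2005/gmd">'
        '<gmd:MD_Metadata><gmd:fileIdentifier/></gmd:MD_Metadata></root>')
bad = ('<root xmlns:gmd="http://www.isotc211.org/2005/gmd">'
       '<gmd:MD_Metadata/></root>')
print(check_rules(good))  # -> []
print(check_rules(bad))   # -> one failure message
```

In production, a real Schematron processor should be preferred, since it supports the full XPath expression language and reporting model.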
6 Data validation
6.1 General
The data validation process encompasses several steps, as shown in Figure 2, relevant to an INSPIRE data
validation. As a first step, the data should be validated against the data schema provided. As a second step,
the data should be validated against formal constraints from the data specification using Schematron, first
with respect to the constraints related to the GML encoding and then with respect to the data-specific
constraints.
Figure 2 — Data validation process
If an error is found and rectified, all validation steps shall be repeated, as the rectification process may have
introduced new errors.
6.2 Validation against XML Schema
Before using an XML document, it shall first be validated against the relevant schema for the data file
using an XML validation tool. Any errors identified during schema validation shall be rectified before the data
can be used. The tester shall ensure that the XML document refers to the proper schema; just because a
data file is valid XML does not mean that it is valid according to a specific schema.
As many tools used for the creation and validation of XML data may use different approaches, it is advisable
to validate the generated data using different validation tools. In this way, further errors in the data may be found
and rectified, reducing problems encountered when users attempt to work with faulty data.
6.3 Rule-based validation with Schematron
Once the data file has been shown to be formally valid against the schema provided, compliance with further
constraints arising from the data specification shall be checked. This should be done using Schematron
validation; a Schematron rule file shall be provided for each data specification. All data
provided in the form of XML documents shall pass validation according to the corresponding Schematron file
before being considered valid. While in most cases the Schematron rules are configured so that they only
display message texts for errors, some also provide informative texts.
6.4 INSPIRE Abstract Test Suite for Annex I, II and III data themes
The INSPIRE Data Specifications - Technical Guidelines for the Annex I, II and III data themes have an
annex, “Annex A - Abstract Test Suite”, which contains a set of tests to help the conformance testing
process.
The new and updated structure of the Data Specifications contains two different types of requirements:
• the requirements present in the Regulation (Implementing Rules - IR) on interoperability of spatial data
sets and services (IR Requirements)
• the requirements for a specific technical solution proposed in the Technical Guidance for an IR
requirement (TG requirements).
For this reason the Abstract Test Suite (ATS) is composed of two parts:
• Part 1 (normative) - Conformity with Commission Regulation No 1089/2010 (IR Requirements)
• Part 2 (informative) - Conformity with the technical guideline (TG) Requirements
In each part, the requirements to be tested are grouped in several conformance classes and each of these
classes covers a specific aspect.
The ATS contains a detailed list of abstract tests, but for their physical implementation an Executable Test
Suite (ETS) is required. The ETS shall contain operative instructions on how to execute the relevant abstract
test.
Some tests, such as those in the Application Schema conformance class, may be automated by using XML
schema validation tools. Conversely, other tests may require manual execution.
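The relation between abstract tests and their executable counterparts can be organized as a registry mapping each abstract test identifier to an executable function, or to a marker when manual execution is required. The sketch below is hypothetical: the identifiers are modelled on the ATS numbering (e.g. A.1.1 for Application Schema, A.5.1 for Metadata IR), and the function is a placeholder, not a real ETS:

```python
from typing import Callable, Dict, Optional

def ets_schema_validation(dataset: dict) -> bool:
    # Placeholder for the XML Schema validation described in 6.2.
    return dataset.get("schema_valid", False)

# Registry: abstract test id -> executable test, or None when the test
# must be carried out manually (e.g. inspecting a Discovery Service).
REGISTRY: Dict[str, Optional[Callable[[dict], bool]]] = {
    "A.1.1": ets_schema_validation,  # Application Schema: automatable
    "A.5.1": None,                   # Metadata IR: manual inspection
}

def run_ats(dataset: dict) -> Dict[str, str]:
    """Run every registered test; report PASS/FAIL/MANUAL per abstract test."""
    results = {}
    for test_id, ets in REGISTRY.items():
        if ets is None:
            results[test_id] = "MANUAL"
        else:
            results[test_id] = "PASS" if ets(dataset) else "FAIL"
    return results

print(run_ats({"schema_valid": True}))
# -> {'A.1.1': 'PASS', 'A.5.1': 'MANUAL'}
```

Keeping the mapping explicit also documents which ATS tests still lack an executable implementation.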
An overview of the INSPIRE ATS for Data Specifications is provided in Figure 3 3).
3) Abstract Test Suite for INSPIRE Data Specifications, Vlado Cetl, Katalin Tóth, Tomas Reznik and Robert Tomas,
INSPIRE conference 2012
Figure 3 — Overview of the INSPIRE ATS for Data Specifications
An example of a test (A.1.1) of the Application Schema Conformance Class (A.1) for the INSPIRE
Environmental Monitoring Facilities data theme is shown in Figure 4.
Figure 4 — Example Test for Application Schema
A possible ETS for the test A.1.1 is the validation described in 6.2 "Validation against XML Schema” in this
document.
Another example of a test (A.5.1) of the Metadata IR Conformance Class (A.5) for the INSPIRE
Environmental Monitoring Facilities data theme is shown in Figure 5.
Figure 5 — Example Test for Metadata
A possible ETS for the test A.5.1 shown in Figure 5 could be a manual test, for example the inspection of a
Discovery Service to search for the metadata related to the data set under test.
With reference to the INSPIRE Data Specifications ATS, Annex C covers the schema validation process (not
covered by the INSPIRE ATS). This represents an important step which data providers may wish to perform in
order to validate the XSD before using it as the reference schema for validating data sets or metadata outside
an INSPIRE context.
7 Service validation
7.1 General
Many types of services can be distinguished in an SDI. Part 1 of this Technical Report describes the service
taxonomy including Interaction services, Composition and Orchestration, Processing Services, Data and
Things Management services, Communication Services, System and Security Management Services. All
these services provide different types of interaction between users, applications, portals and data, metadata
and registers. Specific examples are services for catalogues, registers, gazetteer, feature transformation,
feature access, and viewing. Although all of these services are more or less relevant for validation, this
clause focuses on viewing and download services. Validation of services can be based on functional
requirements and on quality or performance requirements. The first subclauses address the functional
requirements of validation; 7.4 addresses quality aspects. The specific subclauses provide examples of
operational test suites. One that is relevant overall is the OGC Compliance Testing Program (CITE) 4), an
OGC program which provides the resources, procedures and policies to assess compliance with OGC
standards, several of which concern service validation.
4) http://cite.opengeospatial.org/cite
7.2 View services
7.2.1 General
View services enable a user to display, navigate, zoom in and out, pan and overlay viewable spatial data
sets, and to display legend information and any relevant content of metadata.
7.2.2 WMS
A Web Map Service (WMS) produces maps of spatially referenced data dynamically from geographic
information. The data itself is not exchanged but is usually rendered in an image format such as PNG, GIF,
JPEG, or SVG. The relevant standard for viewing services is EN ISO 19128, Geographic information - Web
Map Server Interface. EN ISO 19128 divides WMS into a Basic WMS and a queryable WMS; the former
only supports portrayal, while the latter also includes information queries.
A WMS establishes communication between a server containing the data and a client, which includes a user
interface. The communication basically consists of request and response statements. These statements
provide the language that is exchanged and that is subject to standardization. As such, both sides of the
communication channel may be subject to validation against a standard.
In this clause, testing of WMS client statements is not considered, as they are not part of the controlled
components of an SDI. For the server side, EN ISO 19128 presents several abstract testing rules, which are
copied below.
WMS Abstract test rules include:
• Version negotiation: Verify that a basic WMS server satisfies the requirements for version negotiation.
• Request parameter rules: Verify that a basic WMS server satisfies the requirements for request
parameter rules.
• GetCapabilities response: Verify that a basic WMS server satisfies all requirements of the
GetCapabilities operation.
• GetMap response: Verify that a basic WMS server satisfies all requirements of the GetMap operation.
• GetFeatureInfo response: Verify that a Web Map Service interface satisfies all requirements for the
operation.
These abstract test rules are made operational in several tools that can be applied in executable test suites.
Examples are in the ESDIN Testing Framework (ETF) that was developed for testing services within the
INSPIRE program. A.2 shows the executable test rules for WMS 1.3.0 in the ETF. From that example only
the GetCapabilities response from the ATS is presented below.
EXAMPLE: Executable test rule for INSPIRE GetCapabilities Parameters:
Test Mandatory INSPIRE GetCapabilities Parameters:
1. Response SLA: Capabilities response in time (within 5000 ms)
2. Capabilities validate to INSPIRE Schema View Services (the XSD). Note that INSPIRE elements from INSPIRE
schemas are checked by schema validation to the XSD:
3. INSPIRE MandatoryKeyword infoMapAccessService is present
4. INSPIRE GetMap Supports PNG or GIF
5. INSPIRE Resource Title: all Layers with a Name have a Title
6. INSPIRE Resource Abstract: all Layers with a Name have an Abstract
7. INSPIRE Resource Keyword: all Layers with a Name have at least one Keyword
8. INSPIRE Ex_GeographicBoundingBox: all Layers with a Name have a EX_GeographicBoundingBox
9. INSPIRE BoundingBox: all Layers with a Name have BoundingBoxes for all advertised CRSes
10. INSPIRE Resource Identifier: all Layers with a Name have an Identifier and a declared Authority for that
Identifier
11. INSPIRE Styles: all Styles have a Name and Title
12. INSPIRE ResponseLanguage present
13. INSPIRE DefaultLanguage present
Test Optional GetCapabilities Parameters:
1. Response SLA: Capabilities response in time (within 5000 ms)
2. INSPIRE there is a harmonized Layer Name available
3. INSPIRE Coordinate Reference System 4258 in Layer or group Layer
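Checks such as numbers 5 to 7 in the mandatory list above (every named Layer has a Title, an Abstract and at least one Keyword) reduce to simple traversals of the Capabilities document. A sketch operating on an inline sample; the WMS 1.3.0 Capabilities namespace is the real one, but the sample content is invented:

```python
import xml.etree.ElementTree as ET

WMS = {"wms": "http://www.opengis.net/wms"}

def check_named_layers(capabilities_xml: str) -> list:
    """For every Layer that has a Name, require Title, Abstract and a Keyword."""
    root = ET.fromstring(capabilities_xml)
    errors = []
    for layer in root.findall(".//wms:Layer", WMS):
        name = layer.find("wms:Name", WMS)
        if name is None:
            continue  # unnamed (grouping) layers are not checked
        for child in ("Title", "Abstract"):
            if layer.find(f"wms:{child}", WMS) is None:
                errors.append(f"Layer {name.text}: missing {child}")
        if layer.find("wms:KeywordList/wms:Keyword", WMS) is None:
            errors.append(f"Layer {name.text}: missing Keyword")
    return errors

sample = """<WMS_Capabilities xmlns="http://www.opengis.net/wms">
  <Capability><Layer>
    <Name>PS.ProtectedSite</Name>
    <Title>Protected sites</Title>
    <KeywordList><Keyword>infoMapAccessService</Keyword></KeywordList>
  </Layer></Capability>
</WMS_Capabilities>"""

print(check_named_layers(sample))
# -> ['Layer PS.ProtectedSite: missing Abstract']
```

A full ETS such as the ETF additionally fetches the document over HTTP, measures the response time against the SLA, and validates it against the INSPIRE View Service XSD.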
An example of a WMS validator for testing conformity against a national profile of the OGC WMS standard
can be found at http://services.geonovum.nl/check-wms/. This WMS validator by Geonovum is not a full test
suite, but focuses on the requirements of the Dutch profile for OGC WMS 1.3.0. A screenshot is shown in Figure 6.
Figure 6 — Screen shot of WMS validation report
7.2.3 WMTS
Web Map Tile Service (WMTS) is an OGC specification for web map services that serve map tiles.
Validation of Web Map Tile Services is similar to that of WMS, with the GetTile request replacing the GetMap
request. As with WMS, Geonovum has extended the ETF with tests for WMTS implementations of
INSPIRE View Services. The ETF is able to test the INSPIRE requirements laid down for Capabilities
documents and the GetTile request of WMTS.
7.3 Download services
Download services enable copies of spatial data sets, or parts of such sets, to be downloaded and, where
practicable, accessed directly. This includes spatial analysis and optional modification at feature and
attribute level. The access protocol is therefore more extensive than that of WMS; consequently, testing is more
elaborate. Several types of WFS implementation are considered, each corresponding to a specific
conformance class.
The WFS abstract test rules related to the conformance classes are the following (from OGC WFS 2.0,
EN ISO 19142):
Simple WFS: Includes: GetCapabilities, DescribeFeatureType, ListStoredQueries,
DescribeStoredQueries, GetFeature operation with only the StoredQuery action. One stored query, that
fetches a feature using its id, shall be available but the server may also offer additional stored queries.
Additionally the server shall conform to at least one of the HTTP GET, HTTP POST or SOAP
conformance classes.
This conformance class already includes many test rules, which are not reproduced in this Technical Report.
An example is the test for unique identification of feature types, using the following method: create a
new feature instance and verify that an identifier is assigned to that feature in the transaction response.
Using that identifier, query the server to retrieve the feature. Verify that only a single feature corresponding
to the queried identifier is returned in the response. Verify that the identifier is encoded in the
response using the gml:id attribute.
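The final verifications in that method (exactly one feature is returned, and its identifier is encoded using the gml:id attribute) can be sketched as follows, operating on an inline response fragment; the feature type name is invented:

```python
import xml.etree.ElementTree as ET

GML = "http://www.opengis.net/gml/3.2"

def features_with_id(response_xml: str, wanted_id: str) -> list:
    """Return every element in the response whose gml:id equals wanted_id."""
    root = ET.fromstring(response_xml)
    return [el for el in root.iter() if el.get(f"{{{GML}}}id") == wanted_id]

# Invented minimal GetFeature response containing a single feature.
response = (
    '<FeatureCollection xmlns:gml="http://www.opengis.net/gml/3.2">'
    '<member><Road gml:id="road.42"/></member>'
    '</FeatureCollection>'
)
matches = features_with_id(response, "road.42")
# Verify that exactly one feature carries the queried identifier.
print(len(matches))  # -> 1
```

A complete executable test would precede this with the Transaction and GetFeature requests described in the method; only the response inspection is sketched here.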
Basic WFS: Includes Simple WFS conformance plus GetFeature operation with the Query action and
the GetPropertyValue operation.
Transactional WFS: Includes: Basic WFS plus the Transaction operation.
Locking WFS: Includes Transactional WFS plus at least one of the GetFeatureWithLock or
LockFeature operations.
The extended list of requests includes: HTTP GET, HTTP POST, SOAP, Inheritance, Remote resolve,
Response paging, Standard joins, Spatial joins, Temporal joins, Feature versions, and Manage stored
queries. Each of these has specific test rules.
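A test suite typically begins by reading which conformance classes the server itself claims: in WFS 2.0, these are advertised as ows:Constraint declarations (e.g. ImplementsBasicWFS) in the Capabilities document, each carrying a TRUE/FALSE default value. A minimal sketch using Python's standard library (the sample document used to exercise it is illustrative):

```python
import xml.etree.ElementTree as ET

OWS_NS = "http://www.opengis.net/ows/1.1"

def advertised_conformance(capabilities_xml: str) -> dict:
    """Read the ows:Constraint declarations from a WFS 2.0 Capabilities
    document and report which conformance classes the server claims."""
    root = ET.fromstring(capabilities_xml)
    claims = {}
    for constraint in root.findall(f".//{{{OWS_NS}}}Constraint"):
        name = constraint.get("name")
        value = constraint.findtext(f"{{{OWS_NS}}}DefaultValue")
        if name and name.startswith("Implements"):
            claims[name] = (value == "TRUE")
    return claims
```

A harness can then skip the test rules for conformance classes the server does not claim, and fail early if a mandatory class (e.g. Simple WFS) is declared FALSE.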
An example of an executable test suite for a Direct Access WFS Download Service is provided in A.3. From
that example, three test lines are presented in Table 2: retrieval of a spatial data set, describing a spatial
data set, and describing a spatial object type, from INSPIRE Network Services.
Table 2 — Example of three WFS Operations and relevant ISO implementations

Get Spatial Data Set (Mandatory)
Description in IR: The Get Spatial Data Set operation allows the retrieval of a Spatial Data Set.
  Request parameters: Language; Spatial Data Set Identifier; Coordinate Reference System.
  Response parameters: Requested Spatial Data Set in the requested language and CRS.
Recommended WFS-based implementation:
  Request: Pre-defined spatial data sets in different CRS/DataSetIdCode/DataSetIdNamespace/language
  combinations can be retrieved using Stored Queries as described in Section 6.4. A GetFeature request
  shall be made to a WFS that uses a StoredQuery for the pre-defined data set.
  Response: The WFS shall return a set of features corresponding to the predefined data set in the
  requested language and CRS.
WFS/FE Conformance Classes: EN ISO 19142: Simple WFS, HTTP Get; EN ISO 19143: Query.

Describe Spatial Data Set (Mandatory)
Description in IR: This operation returns the description of all the types of Spatial Objects contained in
the Spatial Data Set.
  Request parameters: Language; Spatial Data Set Identifier.
  Response parameters: Description of the Spatial Objects in the requested Spatial Data Set and
  requested language.
Recommended WFS-based implementation:
  Request: The spatial object types are described in the GetCapabilities response of the WFS. A
  GetCapabilities request is made to a WFS.
  Response: The WFS shall return a valid Capabilities document in the requested language, which
  identifies the Spatial Object types available.
WFS/FE Conformance Classes: EN ISO 19142: Simple WFS, HTTP Get.

Describe Spatial Object Type (Optional; Direct access download only)
Description in IR: This operation returns the description of the specified Spatial Object types.
  Request parameters: Language; Spatial Object Type.
  Response parameters: Description of the Spatial Object Type in conformity with Regulation (EU)
  No 1089/2010.
Recommended WFS-based implementation:
  Request: A DescribeFeatureType request is made to the WFS.
  Response: The WFS responds with the XML schema for the requested Spatial Object types.
WFS/FE Conformance Classes: EN ISO 19142: Simple WFS, HTTP Get.
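The three operations in Table 2 map onto KVP requests against a WFS 2.0 endpoint. The following sketch builds the request URLs; the base endpoint follows the placeholder pattern used in Table 3, and the stored-query identifier shown is an assumption made for illustration, not a value taken from this report:

```python
from urllib.parse import urlencode

# Placeholder endpoint (same pattern as Table 3) and an illustrative
# stored-query id -- both are assumptions, not normative values.
BASE = "http://hostname:port/path"
STORED_QUERY_ID = "http://inspire.ec.europa.eu/operation/download/GetSpatialDataSet"

def get_spatial_data_set(code, namespace, crs, language="eng"):
    """Get Spatial Data Set: a GetFeature request invoking the
    pre-defined stored query (EN ISO 19142: Simple WFS, HTTP Get)."""
    return BASE + "?" + urlencode({
        "service": "WFS", "version": "2.0.0", "request": "GetFeature",
        "STOREDQUERY_ID": STORED_QUERY_ID,
        "DataSetIdCode": code, "DataSetIdNamespace": namespace,
        "CRS": crs, "Language": language})

def describe_spatial_data_set(language="eng"):
    """Describe Spatial Data Set: served by the GetCapabilities
    response, which lists the available spatial object types."""
    return BASE + "?" + urlencode({
        "service": "WFS", "version": "2.0.0",
        "request": "GetCapabilities", "Language": language})

def describe_spatial_object_type(type_name):
    """Describe Spatial Object Type: a DescribeFeatureType request
    returning the XML schema for the requested type."""
    return BASE + "?" + urlencode({
        "service": "WFS", "version": "2.0.0",
        "request": "DescribeFeatureType", "typeNames": type_name})
```

Each generated URL corresponds to one row of Table 2; a test line then submits the URL and validates the response against the relevant conformance class.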
An example of an implementation of the Describe Spatial Object Type executed on a NATURE2000 WFS is
given in Table 3.
Table 3 — Example of an implementation of the Describe Spatial Object Type
Operation under test: DescribeFeatureType
Test procedure: Submit to the server the following request, and verify that the response is correct:
http://hostname:port/path?service=wfs&acceptversions=2.0.0&request=DescribeFeatureType
Example of test result: DescribeFeatureType.jpg (see A.3)
This results in a WFS response describing the name and definition of ProtectedSite, FundingSourceType,
projectName and fundingType:
…./lines excluded/….
namespace="urn:x-inspire:specification:gmlas:BiogeographicalRegions:0.0"/>
name="ProtectedSite">
-- Definition --An area designated or managed within a framework of international, Community and Member States' legislation to
achieve specific conservation objectives.-- Description --Each protected site has a boundary defined through formal, legal or
administrative agreements or decisions. The establishment of a protected site is normally underpinned by legislation and thus
given weight in decisions about land use change and spatial planning. Each Site is normally selected as a representative example
of a wider resource and selected through a formal criterion based approach. A protected site can be a contiguous extent of
land/sea or a collection of discrete areas that together represent a single formal Protected Site. This class has the attributes,
constraints and associations that are part of the Full application schema.
name="FundingSourceType">
-- Definition --The source(s) of financial support that are being used to implement the management plan on a protected site.--
Description --NOTE Funding of management on protected sites is critical to securing desired conservation status. The resources
are supplied from a variety of sources, ranging from private land owners to European fund
...
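The excerpt above can be checked mechanically: a DescribeFeatureType response is an XML Schema document, so the declared names and the embedded "-- Definition --" texts can be pulled from the xsd:annotation elements. A minimal sketch using Python's standard library (the sample schema used to exercise it is a hypothetical fragment):

```python
import xml.etree.ElementTree as ET

XSD_NS = "http://www.w3.org/2001/XMLSchema"

def element_definitions(schema_xml: str) -> dict:
    """Map each named element/type in a DescribeFeatureType response
    to the documentation text carried in its xsd:annotation."""
    root = ET.fromstring(schema_xml)
    defs = {}
    for tag in ("element", "complexType", "simpleType"):
        for node in root.iter(f"{{{XSD_NS}}}{tag}"):
            name = node.get("name")
            doc = node.findtext(
                f"{{{XSD_NS}}}annotation/{{{XSD_NS}}}documentation")
            if name and doc:
                defs[name] = doc.strip()
    return defs
```

A test line would assert that the expected names (e.g. ProtectedSite, FundingSourceType) are present and that their definitions are non-empty.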