ETSI GR MEC-DEC 025 V2.1.1 (2019-06)
Multi-access Edge Computing (MEC); MEC Testing Framework
DGR/MEC-DEC25TestingFramework
GROUP REPORT
Multi-access Edge Computing (MEC);
MEC Testing Framework
Disclaimer
The present document has been produced and approved by the Multi-access Edge Computing (MEC) ETSI Industry
Specification Group (ISG) and represents the views of those members who participated in this ISG.
It does not necessarily represent the views of the entire ETSI membership.
Reference
DGR/MEC-DEC25TestingFramework
Keywords
MEC, testing
ETSI
650 Route des Lucioles
F-06921 Sophia Antipolis Cedex - FRANCE
Tel.: +33 4 92 94 42 00 Fax: +33 4 93 65 47 16
Siret N° 348 623 562 00017 - NAF 742 C
Association à but non lucratif enregistrée à la
Sous-Préfecture de Grasse (06) N° 7803/88
Important notice
The present document can be downloaded from:
http://www.etsi.org/standards-search
The present document may be made available in electronic versions and/or in print. The content of any electronic and/or
print versions of the present document shall not be modified without the prior written authorization of ETSI. In case of any
existing or perceived difference in contents between such versions and/or in print, the prevailing version of an ETSI
deliverable is the one made publicly available in PDF format at www.etsi.org/deliver.
Users of the present document should be aware that the document may be subject to revision or change of status.
Information on the current status of this and other ETSI documents is available at
https://portal.etsi.org/TB/ETSIDeliverableStatus.aspx
If you find errors in the present document, please send your comment to one of the following services:
https://portal.etsi.org/People/CommiteeSupportStaff.aspx
Copyright Notification
No part may be reproduced or utilized in any form or by any means, electronic or mechanical, including photocopying
and microfilm except as authorized by written permission of ETSI.
The content of the PDF version shall not be modified without the written authorization of ETSI.
The copyright and the foregoing restriction extend to reproduction in all media.
© ETSI 2019.
All rights reserved.
DECT™, PLUGTESTS™, UMTS™ and the ETSI logo are trademarks of ETSI registered for the benefit of its Members.
3GPP™ and LTE™ are trademarks of ETSI registered for the benefit of its Members and of the 3GPP Organizational Partners.
oneM2M™ logo is a trademark of ETSI registered for the benefit of its Members and of the oneM2M Partners.
GSM® and the GSM logo are trademarks registered and owned by the GSM Association.
Contents
Intellectual Property Rights
Foreword
Modal verbs terminology
1 Scope
2 References
2.1 Normative references
2.2 Informative references
3 Definition of terms, symbols and abbreviations
3.1 Terms
3.2 Symbols
3.3 Abbreviations
4 Testing Methodology Guidelines for MEC
4.1 Introduction
4.2 Basic concepts for conformance and interoperability testing
4.3 Conformance Test Specifications
4.3.1 Introduction
4.3.2 Test architecture
4.3.2.1 Selection of Implementation Under Test
4.3.2.1.1 Definition
4.3.2.1.2 MEC IUTs and Reference Points
4.3.3 Development of Conformance Test Specifications
4.3.3.1 Implementation Conformance Statement (ICS)
4.3.3.2 Test Suite Structure & Test Purposes (TSS&TP)
4.3.3.2.1 Introduction
4.3.3.2.2 Test Suite Structure
4.3.3.2.3 Test Purpose
4.3.3.2.4 Conventions for the expected behaviour in Test Purposes
4.3.3.3 Abstract Test Method (ATM)
4.3.3.3.1 Methodology
4.3.3.3.2 Abstract PDU Transport Protocol
4.4 Interoperability Test Specifications
4.4.1 Introduction
4.4.2 Basic concepts for interoperability testing
4.4.3 Interoperability Test Specifications
4.4.4 Interoperability Testing Process
4.5 Interoperability Test Process
5 Requirement assessment
5.1 Introduction
5.2 Generic requirements
5.2.1 Framework Requirements
5.2.2 Application lifecycle
5.2.3 Applications environment
5.2.4 Support of mobility
5.3 Services requirements
5.3.1 Platform essential functionality
5.3.1.1 Mobile edge services
5.3.1.2 Connectivity
5.3.1.3 Storage
5.3.1.4 Traffic routing
5.3.1.5 DNS support
5.3.1.6 Timing
5.3.2 Features
5.3.2.1 Feature UserApps
5.3.2.2 Feature SmartRelocation
5.3.2.3 Feature RadioNetworkInformation
5.3.2.4 Feature LocationService
5.3.2.5 Feature BandwidthManager
5.3.2.6 Feature UEIdentity
5.4 Operation and management requirements
5.5 Security, regulation, charging requirements
6 Architecture assessment
6.1 MEC components compliance
6.1.1 Mobile Edge Platform Manager
6.1.2 Mobile Edge Orchestrator
6.1.3 MEC Platform
6.1.4 MEC Apps
Annex A: Conformance Test Purposes examples
A.1 Examples of test purposes
A.1.1 Introduction
A.1.2 Example of test purpose for querying DNS rules
A.1.3 Example of test purpose for notification of an App Package enablement
A.1.4 Example of test purpose for malformed App Package creation
A.1.5 Example of test purpose for unauthorized creation of App Package
Annex B: Interoperability Test Descriptions examples
B.1 Examples of test descriptions for interoperability
B.1.1 Introduction
B.1.2 Example of test description for Mobile radio network information (TD_MEC002_A2)
B.1.3 Example of test description for active device location tracking (TD_MEC002_A7)
B.1.4 Example of test description for Edge data orchestration (TD_MEC002_A10)
Annex C: Change History
Annex D: Bibliography
History
Intellectual Property Rights
Essential patents
IPRs essential or potentially essential to normative deliverables may have been declared to ETSI. The information
pertaining to these essential IPRs, if any, is publicly available for ETSI members and non-members, and can be found
in ETSI SR 000 314: "Intellectual Property Rights (IPRs); Essential, or potentially Essential, IPRs notified to ETSI in
respect of ETSI standards", which is available from the ETSI Secretariat. Latest updates are available on the ETSI Web
server (https://ipr.etsi.org/).
Pursuant to the ETSI IPR Policy, no investigation, including IPR searches, has been carried out by ETSI. No guarantee
can be given as to the existence of other IPRs not referenced in ETSI SR 000 314 (or the updates on the ETSI Web
server) which are, or may be, or may become, essential to the present document.
Trademarks
The present document may include trademarks and/or tradenames which are asserted and/or registered by their owners.
ETSI claims no ownership of these except for any which are indicated as being the property of ETSI, and conveys no
right to use or reproduce any trademark and/or tradename. Mention of those trademarks in the present document does
not constitute an endorsement by ETSI of products, services or organizations associated with those trademarks.
Foreword
This Group Report (GR) has been produced by ETSI Industry Specification Group (ISG) Multi-access Edge Computing
(MEC).
Modal verbs terminology
In the present document "should", "should not", "may", "need not", "will", "will not", "can" and "cannot" are to be
interpreted as described in clause 3.2 of the ETSI Drafting Rules (Verbal forms for the expression of provisions).
"must" and "must not" are NOT allowed in ETSI deliverables except when used in direct citation.
1 Scope
The present document lists the functionalities and capabilities required by a MEC compliant implementation. In
addition, the present document specifies a testing framework defining a methodology for development of
interoperability and/or conformance test strategies, test systems and the resulting test specifications for MEC standards.
Furthermore, the testable requirements are listed and prioritized.
2 References
2.1 Normative references
Normative references are not applicable in the present document.
2.2 Informative references
References are either specific (identified by date of publication and/or edition number or version number) or
non-specific. For specific references, only the cited version applies. For non-specific references, the latest version of the
referenced document (including any amendments) applies.
NOTE: While any hyperlinks included in this clause were valid at the time of publication, ETSI cannot guarantee
their long-term validity.
The following referenced documents are not necessary for the application of the present document, but they assist the
user with regard to a particular subject area.
[i.1] ETSI GS MEC 012: "Multi-access Edge Computing (MEC); Radio Network Information API".
[i.2] ETSI GS NFV-TST 002: "Network Functions Virtualisation (NFV); Testing Methodology; Report
on NFV Interoperability Testing Methodology".
[i.3] ETSI GS MEC 003: "Multi-access Edge Computing (MEC); Framework and Reference
Architecture".
[i.4] ISO/IEC 9646-7:1995: "Information technology -- Open Systems Interconnection -- Conformance
testing methodology and framework -- Part 7: Implementation Conformance Statements".
NOTE: Available at https://www.iso.org/standard/3084.html.
[i.5] ISO/IEC 9646-1:1994: "Information technology -- Open Systems Interconnection -- Conformance
testing methodology and framework -- Part 1: General concepts".
NOTE: Available at https://www.iso.org/standard/17473.html.
[i.6] TTCN-3 abstract test language.
NOTE: Available at http://www.ttcn-3.org/index.php/downloads/standards.
[i.7] ETSI GS MEC 002: "Multi-access Edge Computing (MEC); Phase 2: Use Cases and
Requirements".
[i.8] ETSI GS MEC 010-1: "Mobile Edge Computing (MEC); Mobile Edge Management;
Part 1: System, host and platform management".
[i.9] ETSI GS MEC 010-2: "Multi-access Edge Computing (MEC); MEC Management;
Part 2: Application lifecycle, rules and requirements management".
[i.10] ETSI GS MEC 011: "Multi-access Edge Computing (MEC); Edge Platform Application
Enablement".
[i.11] ETSI GS MEC 013: "Multi-access Edge Computing (MEC); Location API".
[i.12] ETSI GS MEC 014: "Mobile Edge Computing (MEC); UE Identity API".
[i.13] ETSI GS MEC 015: "Mobile Edge Computing (MEC); Bandwidth Management API".
[i.14] ETSI GS MEC 016: "Multi-access Edge Computing (MEC); UE application interface".
[i.15] ETSI Test Description Language.
NOTE: Available at https://tdl.etsi.org/index.php/downloads.
[i.16] ETSI GS MEC 001: "Multi-access Edge Computing (MEC); Terminology".
3 Definition of terms, symbols and abbreviations
3.1 Terms
For the purposes of the present document, the terms given in ETSI GS MEC 001 [i.16] and the following apply:
certification/compliance assessment: major goal of a compliance assessment is to ensure the interoperability of
implementations, and the conformance of implementations to the standard
conformance testing: purpose of conformance testing is to determine to what extent a single implementation of a
particular standard conforms to the individual requirements of that standard
interoperability testing: purpose of interoperability testing is to prove that end-to-end functionality between (at least)
two communicating systems is as required by the standard(s) on which those systems are based
Test Case (TC): complete and independent specification of the actions required to achieve a specific Test Purpose
NOTE: TCs are written in testing languages, e.g. TTCN-3.
Test Descriptions (TD): specify the sequence of actions required to realize the verdict identified in the TP and are
primarily intended for use in interoperability test specifications
NOTE: However, in some instances, particularly where there is a considerable difference in complexity between
the TPs and the TCs, it is worthwhile adding TDs as an extra design stage in a conformance test
specification.
Test Purpose (TP): should be written for each potential test of each identified requirement
NOTE: A TP defines in broad terms what the goal of a particular test should be. A TP is defined in prose.
test suite: collection of Test Cases
testing framework: provides guidance for development of conformance and interoperability test strategies, test
systems and the resulting test specifications
3.2 Symbols
Void.
3.3 Abbreviations
For the purposes of the present document, the abbreviations given in ETSI GS MEC 001 [i.16] and the following apply:
API Application Programming Interface
ATM Abstract Test Method
ATS Abstract Test Suite
BWMS BandWidth Management Service
CON CONformance
CRS Conformance Requirement Statements
DUT Device Under Test
FUT Function Under Test
HTTP HyperText Transfer Protocol
HTTPS HyperText Transfer Protocol Secure
ICS Implementation Conformance Statement
IFS Interoperability Feature Statement
IOP InterOPerability
IUT Implementation Under Test
MEH MEC Host
OAM Operations And Maintenance
PDU Protocol Data Unit
PICS Protocol Implementation Conformance Statement
PLMN Public Land Mobile Network
RAB Radio Access Bearer
RNI Radio Network Information
RNIS RNI Service
RP Reference Point
SAQ Service Availability Query
SUT System Under Test
TC Test Case
TCP Transmission Control Protocol
TDL Test Description Language
TP Test Purpose
TSS Test Suite Structure
TTCN Testing and Test Control Notation
URI Uniform Resource Identifier
4 Testing Methodology Guidelines for MEC
4.1 Introduction
Clause 4 provides:
• Identification of the implementations under test (IUTs) for conformance testing and the devices under test
(DUTs) for interoperability testing, i.e. answering the question "what is to be tested".
• Definition of the applicable test procedures, i.e. answering the question "how is it to be tested".
• Definition of the procedure for development of test specifications and deliverables (for instance: Test Purposes
(TP) in case of conformance testing and Test Descriptions (TD) in case of interoperability testing,
documentation, etc.).
The MEC testing framework contains:
• a documentation structure:
- catalogue of capabilities/features/functions (PICS or IFS);
- Test Suite Structure (TSS);
- individual tests in the form of TPs (Conformance) or TDs (Interoperability);
• a methodology linking the individual elements of a test specification together:
- style guidelines and examples;
- naming conventions;
- a structured notation for TPs or TDs.
4.2 Basic concepts for conformance and interoperability testing
Conformance Testing and Interoperability Testing are the two main and complementary testing methodologies to test
devices implementing standardized services [i.2]. These two testing methodologies also apply to MEC.
The basic concepts for Conformance Testing and Interoperability Testing are defined as follows:
• Conformance Testing can show that a product correctly implements and meets the requirements in the ETSI
ISG MEC standards, which will include testing protocol message contents and formats as well as the permitted
sequence of messages for the interfaces defined by ETSI ISG MEC standards.
• Interoperability Testing can demonstrate that a product will work with other, similar products. It proves that
end-to-end functionality between (at least) two functional elements is as required by the ETSI MEC
standards on which those functions are based.
For more details about the basic concepts for conformance and interoperability testing, please refer to clause 4.1 of
ETSI GS NFV-TST 002 [i.2].
4.3 Conformance Test Specifications
4.3.1 Introduction
Clause 4.3 explains how to apply the MEC conformance testing methodology in order to properly produce MEC
conformance test specifications.
Conformance testing can show that a product correctly implements a particular standardized protocol; that is, it
establishes whether or not the implementation under test meets the requirements specified for the protocol itself.
EXAMPLE: The scope of the testing is on protocol message content and format, as well as on the permitted
sequences of messages. In that context, tests are performed at open standardized interfaces that are not
(usually) accessible to an end user, and are executed by a dedicated test system that has full control of
the system under test and the ability to observe all incoming and outgoing communications; the
high degree of control of the test system over the sequence and contents of the protocol messages
makes it possible to test both valid and invalid behaviour.
Figure 4.3.1-1: Conformance testing
Conformance test specifications should be produced following the methodology described in ISO/IEC 9646-1 [i.5]. In
summary, this methodology begins with the collation and categorization of the features and options to be tested into a
tabular form which is normally referred to as the "Implementation Conformance Statement" (ICS). All implemented
capabilities supported by the Implementation Under Test (IUT) are listed by the implementer in the ICS, so that the
tester knows which options have to be tested. This ensures that complete coverage is obtained.
The next step is to collect the requirements from the specification that is tested. For each requirement, one or more tests
should be identified and classified into a number of groups, which provide the structure of the overall test suite (the Test
Suite Structure, TSS). A brief Test Purpose (TP) should then be written for each identified test; it should make clear what
is to be tested, but not how this should be done. Finally, a detailed Test Case (TC) is written for each TP. In the interests
of test automation, TCs are usually combined into an Abstract Test Suite (ATS) using a specific testing language such as
TTCN-3. The TCs in the ATS are then verified against a number of IUTs for correct operation according to
agreed procedures, before being released for use by the industry.
In summary, the MEC Conformance Testing methodology consists of:
• Selection of Implementations Under Test (IUT).
• Identification of reference points.
• Development of test specifications, which includes:
- Development of "Implementation Conformance Statements" (ICS).
- Development of "Test Suite Structure and Test Purposes" (TSS&TP).
- Development of "Abstract Test Suite" (ATS).
4.3.2 Test architecture
4.3.2.1 Selection of Implementation Under Test
4.3.2.1.1 Definition
The "Implementation Under Test" (IUT) is a protocol implementation considered as an object for testing. This means
that the test process will focus on verifying the compliance of this protocol implementation (IUT) with requirements set
up in the related base standard. An IUT is normally implemented in a "System Under Test" (SUT). For testing, an SUT is
connected to a test system over at least one interface. Such an interface is identified as a "Reference Point" (RP) in
the present document. Further details on RPs are presented in clause 6.
NOTE: Other interfaces between the test system and the IUT may be used to control the behaviour of the IUT
during the test process.
Figure 4.3.2.1.1-1 shows the multi-access edge system reference architecture, see also clause 6 of ETSI
GS MEC 003 [i.3].
Figure 4.3.2.1.1-1: Multi-access edge system reference architecture
4.3.2.1.2 MEC IUTs and Reference Points
MEC IUTs and Reference Points are collected in tables as shown in the example below.
Table 4.3.2.1.2-1: Example of MEC IUT assessment

IUT                                        Reference                           Reference Points   Notes
Multi-access edge application (MEC app)    Clause 6 of ETSI GS MEC 003 [i.3]   Mp1
Multi-access platform (MEC plat)           Clause 6 of ETSI GS MEC 003 [i.3]   Mp1, Mp2, Mm5      Mp2 and Mm5 out of scope of testing
These tables need to be amended in the following cases:
• A new node or entity is defined in the base specifications.
• A new interface is defined in the base specifications between any of the existing nodes or entities.
4.3.3 Development of Conformance Test Specifications
4.3.3.1 Implementation Conformance Statement (ICS)
The purpose of an ICS is to identify those standardized functions which an IUT is required to support, those which are
optional and those which are conditional on the presence of other functions. It helps to provide a means for selection of
the suite of tests which will subsequently be developed.
In addition, the ICS can be used as a proforma for identifying which functions an IUT will support when performing
conformance testing. The purpose of this ICS proforma is to provide a mechanism whereby a MEC implementation
supplier may provide information about the implementation in a standardized manner. The information in an ICS is
usually presented in tabular form as recommended in ISO/IEC 9646-7 [i.4].
The ICS can be considered as a set of "switches" which specify the capability of supporting the requirements in the base
standards to be tested. It is possible that, with different choices in an ICS proforma, several different sets of TPs will be
necessary.
In clauses 5 "Requirement assessment" and 6 "Architecture assessment" assessments are made on whether
requirements, features, components and other capabilities are required according to a referenced GS, or in order to
achieve compliance. This assessment provides the following options:
m mandatory - the capability is required to be supported.
o optional - the capability may, or may not, be supported.
c.i conditional - the requirement on the capability ("m", "o", "x" or "n/a") depends on the support of
other optional or conditional items. "i" is an integer identifying a unique conditional status
expression which is defined immediately following the table.
n/a not applicable - in the given context, it is not possible to use the capability.
x prohibited (excluded) - there is a requirement not to use this capability in the given context.
o.i qualified optional - for mutually exclusive or selectable options from a set: "i" is an integer which
identifies a unique group of related optional items and the logic of their selection which is defined
immediately following the table.
An example is shown in Table 4.3.3.1-1.
Table 4.3.3.1-1: Roles

Item   Role      Reference   Status
1      User      1.3         o.1
2      Network   1.2         o.1

o.1: At least one item should be supported.
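
NOTE: The following Python fragment is an informative sketch, not part of the referenced methodology documents, showing how ICS entries carrying the status values above could act as "switches" for test selection. All identifiers (IcsEntry, applicable) are hypothetical.

    # Informative sketch (hypothetical names): ICS entries as "switches" driving
    # the selection of applicable tests. Status values: m, o, x, n/a, c.i, o.i.
    from dataclasses import dataclass

    @dataclass
    class IcsEntry:
        item: str        # e.g. "Role: User"
        status: str      # "m", "o", "x", "n/a", "c.1", "o.1", ...
        supported: bool  # declared by the implementation supplier in the ICS proforma

    def applicable(entry: IcsEntry, conditions: dict) -> bool:
        """Return True if tests referencing this ICS item are to be selected."""
        status = entry.status
        if status.startswith("c."):
            # Conditional: resolve to "m", "o", "x" or "n/a" via its condition expression.
            status = conditions[status](entry)
        if status == "m":
            return True                # mandatory capability: always tested
        if status == "o" or status.startswith("o."):
            return entry.supported     # (qualified) optional: tested only if declared supported
        return False                   # "x" and "n/a": not selected in this context

    # Example reflecting Table 4.3.3.1-1 (o.1: at least one of the two items supported):
    ics = [IcsEntry("Role: User", "o.1", True), IcsEntry("Role: Network", "o.1", False)]
    assert any(e.supported for e in ics if e.status == "o.1")
    print([e.item for e in ics if applicable(e, {})])   # ['Role: User']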
4.3.3.2 Test Suite Structure & Test Purposes (TSS&TP)
4.3.3.2.1 Introduction
A test purpose is a prose description of a well-defined objective of testing. When applied to conformance testing, it focuses
on a single conformance requirement or a set of related conformance requirements from the base standards.
Several types of presentation of test purposes exist. These presentations combine text with graphical
elements, mainly tables, and sometimes include message sequence charts. The present document presents a
proposed table template for writing test purposes, with recommendations concerning the wording and the organization of
the test purposes.
There are usually numerous test purposes, which need to be organized in structured groups. The organization of the test
purposes in groups is named "Test Suite Structure".
The development of the test purposes follows the analysis of the conformance requirements, clearly expressed in the
base standards. Furthermore, the analysis of a base standard leads to the identification of different groups of
functionalities, which are used to define the first levels of the test suite structure.
4.3.3.2.2 Test Suite Structure
Defining the test suite structure consists of grouping the test purposes according to different criteria like for instance:
• The functional groups and sub-groups of procedures in the base standard, from which the requirement of the
test purpose is derived.
• The category of test applying to the test purposes, for instance:
- valid behaviour test;
- invalid behaviour test;
- timer test;
- etc.
Usually the identification of the different functional groups of procedures leads to the definition of the top levels of the
TSS. Further levels at the bottom of the TSS are then used to group test purposes belonging to the same type of test.
Table 4.3.3.2.2-1 shows an example of a two-level TSS used in the TSS&TP for the MEC system.
Table 4.3.3.2.2-1: Example of test suite structure

TP identifier structure: TP_<root>_<group>_<sub-group>_<feature>_<type of testing>_<sequential number>

root                  MEC   MEC
group                 App   MEC application
                      Plat  MEC platform
sub-group             Mp1   Reference point Multi-access platform 1
                      Mp2   Reference point Multi-access platform 2
feature               SAQ   Service Availability Query
type of testing       BI    Invalid Behaviour tests
                      BO    Inopportune Behaviour tests
                      BV    Valid Behaviour tests
sequential number     001 to 999
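
NOTE: As an informative illustration of this structure, the following Python sketch assembles a TP identifier from the fields above. The concrete field values are examples only, and the number width (two or three digits) is kept as a parameter because the present document uses both forms.

    # Informative sketch: assembling a TP identifier from the TSS fields above.
    # Structure: TP_<root>_<group>_<sub-group>_<feature>_<type of testing>_<number>
    def tp_id(root, group, subgroup, feature, test_type, number, width=3):
        if test_type not in ("BI", "BO", "BV"):
            raise ValueError("type of testing shall be BI, BO or BV")
        if not 1 <= number <= 999:
            raise ValueError("sequential number shall be in the range 001 to 999")
        return "_".join(["TP", root, group, subgroup, feature, test_type, f"{number:0{width}d}"])

    print(tp_id("MEC", "PLAT", "Mp1", "SAQ", "BV", 1))   # TP_MEC_PLAT_Mp1_SAQ_BV_001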
Each feature is characterized by a number of functional requirements, some of them explicitly identified in the existing
MEC specifications. Additionally, MEC 025 has identified further requirements, derived from the specification of the
MEC services' REST APIs. These requirements are identified in the rest of the document using a specific numbering
schema of the form MEC025..., which is described in Table 4.3.3.2.2-2.
Table 4.3.3.2.2-2: Example of functional requirement identifier

Requirement identifier structure: MEC025...

interface-name        Mm1       Reference point
feature               AppPkgm   Application Package Management
sequential number     001 to 999
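
NOTE: Similarly, the requirement identifier schema of Table 4.3.3.2.2-2 could be assembled as sketched below. The underscore separator is an assumption made for illustration only, as the table above only fixes the field order.

    # Informative sketch: assembling a functional requirement identifier.
    # Assumed form: MEC025_<interface-name>_<feature>_<sequential number>
    def requirement_id(interface_name, feature, number):
        if not 1 <= number <= 999:
            raise ValueError("sequential number shall be in the range 001 to 999")
        return f"MEC025_{interface_name}_{feature}_{number:03d}"

    print(requirement_id("Mm1", "AppPkgm", 1))   # MEC025_Mm1_AppPkgm_001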
4.3.3.2.3 Test Purpose
A test purpose is an informal description of the expected test behaviour. As such it is written in prose.
When needed to clarify the TP, it is helpful to add graphical presentations, mainly tables, and to include message
sequence charts.
In order to increase the readability of the TP, the following recommendations should be followed:
• Each TP should contain:
- The TP header, which contains the TP identifier, the TP objective and the external references (ICS and
base standard).
- The behaviour part, which contains the test behaviour description. This part can optionally be divided into
the following three parts, in order to increase readability:
the initial conditions;
the expected behaviour;
the final conditions.
• Each TP should be written according to TDL-TO [i.15].
Table 4.3.3.2.3-1: Description of the fields of the TP

TP Header

TP ID: The TP identifier uniquely identifies the test purpose. In order to ensure its uniqueness, the TP identifier follows a naming convention. The most useful and straightforward naming convention consists of using the test suite structure to form the first part of the TP identifier; the final part is a number identifying the TP order within a TP group. The TP identifier is thus formed by the abbreviation "TP", followed by the abbreviations representing the successive TSS levels, and ending with a number representing the TP order; the fields of the TP identifier are separated by "_". A TP identifier following this naming convention could be, for instance, TP_MEC_MEApp_MP1_BV_001. The TP numbering uses two digits for presentation and starts with 01 rather than 00. Exceeding 99 TPs per group is not recommended; in such a case it is recommended to create sub-groups instead, in order to keep the Test Suite Structure clear.

Test objective: The test objective clearly indicates which requirement is intended to be tested in the test purpose. This eases the understanding of the TP behaviour and the identification of the requirements that were used as a basis for the test purpose. It is recommended to limit the length of the test objective to one sentence.

Reference: In the reference row, the TP writer indicates in which clauses of the protocol standards the requirement is expressed. This information is critical, because it justifies the existence and the behaviour of the TP. The reference row may refer to several clauses. When the clause containing the requirement is long (for instance, more than half a page), it is recommended to indicate the paragraph of the clause where the requirement was identified. The reference to the base standard should be precise enough to enable the TP reader to identify the requirement quickly and precisely.

Config Id: The pointer to the applicable test configuration. A test configuration defines how the test system connects to the SUT.

ICS selection: The ICS selection row contains a Boolean expression made of ICS parameters. It is recommended to use ICS acronyms which clearly identify the role of the ICS.

TP Behaviour

Initial conditions: The initial conditions define the initial state the IUT has to be in for the TP to apply. In the corresponding Test Case, when the execution of the initial conditions does not succeed, an Inconclusive verdict is assigned.

Expected behaviour (TP body): Definition of the events which are part of the TP objective and which the IUT is expected to perform in order to conform to the base specification. In the corresponding Test Case, Pass or Fail verdicts can be assigned here.

Final conditions (optional): Definition of the events that the IUT is expected to perform, or mandated not to perform, according to the base standard and following the correct execution of the actions in the expected behaviour above. In the corresponding Test Case, the execution of the final conditions is evaluated for the assignment of the final verdict.
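
NOTE: The verdict semantics attached to the behaviour fields above (Inconclusive when the initial conditions cannot be established, Pass or Fail from the expected behaviour, and the final conditions contributing to the final verdict) can be pictured with the following informative Python skeleton. It is a generic sketch, not TTCN-3, and all names are placeholders.

    # Informative sketch: verdict assignment in a Test Case derived from a TP.
    from enum import Enum

    class Verdict(Enum):
        PASS = "pass"
        FAIL = "fail"
        INCONC = "inconclusive"

    def run_test_case(initial_conditions, expected_behaviour, final_conditions=None):
        """Each argument is a callable returning True on success; final_conditions is optional."""
        if not initial_conditions():
            return Verdict.INCONC      # initial state could not be reached: no pass/fail statement
        verdict = Verdict.PASS if expected_behaviour() else Verdict.FAIL
        if verdict is Verdict.PASS and final_conditions is not None:
            # The final conditions are evaluated for the assignment of the final verdict.
            verdict = Verdict.PASS if final_conditions() else Verdict.FAIL
        return verdict

    print(run_test_case(lambda: True, lambda: True))   # Verdict.PASS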
4.3.3.2.4 Conventions for the expected behaviour in Test Purposes
In order to increase the comprehension of the test purposes, a number of conventions regarding data have been defined.
The data exchanged in the test execution is expressed (within the Expected Behaviour field of the TP) in three different
ways, depending on its type:
• Fixed values.
• Configurable values.
• Irrelevant values.
Fixed values, by definition, are values that never change within or among any executions of a test. In this case a literal
value should be used and wrapped in quotes (e.g. "401 Unauthorized").
Configurable values may change among different executions of the tests. In this case, the value should be defined in
capital letters, separating all the words by the underscore symbol (e.g. SUBSCRIPTION_HREF_URI).
Irrelevant values may be required in the exchange of the messages but have no impact on the test outcome and verdict.
In this case, if the attribute name is relevant to the reader, the keyword "attribute" should be used (e.g. attribute ID) or
the keyword "set" (e.g. ID set to "any value").
When a message is meant not to have a specific field, the name of the field is prepended with "not". In this case the Test
System should send a message without that specific field.
Datatype naming is done according to the OpenAPI definitions (where possible), and references are added. When a
collection of objects is specified without an explicit name in the OpenAPI definitions, the TP Expected Behaviour should
create a name in the form of "List" according to the datatype of the objects in the collection.
In every data, object or message instance, the test should present only those fields on which requirements or provisions
are set.
In the TPs, the general rules of HTTP message conformance are not explicitly reported each time, but are abstracted by
the use of identifiers such as vGET, vPOST, etc. Each of these identifiers refers to an HTTP request in which the
following headers are conformant to the relevant MEC specification (an illustrative sketch follows the list below):
• Accept.
• Authorization.
• Content-Type.
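
NOTE: As an informative illustration, a test system could realize such identifiers as small helpers that always emit the conformant headers, as sketched below in Python. The concrete header values (JSON media type, bearer token) and the example URL are assumptions, not values mandated by the MEC specifications.

    # Informative sketch: "vGET"/"vPOST" helpers whose Accept, Authorization and
    # Content-Type headers are set once, conformant to the relevant MEC specification.
    # The concrete values below are illustrative assumptions.
    import json
    import urllib.request

    def _conformant_headers(token):
        return {
            "Accept": "application/json",
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        }

    def vGET(url, token):
        req = urllib.request.Request(url, headers=_conformant_headers(token), method="GET")
        return urllib.request.urlopen(req)

    def vPOST(url, body, token):
        data = json.dumps(body).encode("utf-8")
        req = urllib.request.Request(url, data=data,
                                     headers=_conformant_headers(token), method="POST")
        return urllib.request.urlopen(req)

    # Usage (hypothetical endpoint):
    # vGET("http://mec-platform.example/mp1/v1/dns_rules", "SOME_ACCESS_TOKEN")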
4.3.3.3 Abstract Test Method (ATM)
4.3.3.3.1 Methodology
The Abstract Test Method (ATM) is defined in ISO/IEC 9646-1 [i.5] and supports a wide range of approaches for
testing including e.g. the TTCN-3 abstract test language [i.6].
An abstract protocol tester, presented in Figure 4.3.3.3.1-1, emulates a peer of the IUT at the same layer/in the same
entity. This type of test architecture provides a communication situation equivalent to real operation between MEC
systems. The MEC test system simulates valid and invalid protocol behaviour and analyses the reaction of the
IUT. The test verdict, e.g. pass or fail, then depends on the result of this analysis. Thus, this type of test architecture
makes it possible to focus the test objective on the IUT behaviour only.
In order to access an IUT, the corresponding abstract protocol tester needs to use lower layers to establish a proper
connection to the system under test (SUT) over a physical link (Lower layers link).
Figure 4.3.3.3.1-1: Generic abstract protocol tester
The Protocol Data Units (PDUs) are the messages exchanged between the IUT and the abstract protocol tester as
specified in the base standard of the IUT. These PDUs are used to trigger the IUT and to analyse the IUT's reaction to a
trigger. Comparison of the result of this analysis with the requirements specified in the base standard allows the test
verdict to be assigned.
For instance, to test the MEC app, the abstract protocol tester emulates the MEC platform and uses HTTP PDUs,
TCP and IPv4/IPv6 at the transport and network layers, and Ethernet at the access layer.
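
NOTE: The following informative Python sketch gives a simplified picture of such an abstract protocol tester: it exchanges an HTTP PDU with an IUT that exposes an HTTP-based reference point, compares the observed reaction with an assumed expected behaviour, and assigns a verdict. The resource path and the expected status code are illustrative assumptions only.

    # Informative sketch: a minimal abstract protocol tester exchanging HTTP PDUs
    # with the IUT over TCP/IP and assigning a verdict from the observed reaction.
    import http.client

    def stimulate_and_check(iut_host, iut_port=80):
        conn = http.client.HTTPConnection(iut_host, iut_port, timeout=5)
        try:
            # Stimulus PDU on an assumed, illustrative resource path.
            conn.request("GET", "/example/resource", headers={"Accept": "application/json"})
            reaction = conn.getresponse()
            # Compare the reaction against the (assumed) requirement from the base standard.
            return "PASS" if reaction.status == 200 else "FAIL"
        except OSError:
            return "INCONC"   # no reaction could be observed: no basis for a pass/fail verdict
        finally:
            conn.close()

    # Usage: print(stimulate_and_check("iut.example.local"))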
4.3.3.3.2 Abstract PDU Transport Protocol
The application services described in the MEC specifications and addressed by this testing framework specify RESTful
services over HTTPS as the PDU transport protocol. Some specifications briefly mention that other protocols may also
be used when concerns such as low latency or high-volume data transfers are a factor to consider.
To foster test re-use and accommodate future evolutions of the MEC specifications as regards PDU transport,
conformance test suites should be designed to be able to use different transport protocols; the default implementation
should support the protocol required ma
...







