Information technology — Internet of media things — Part 1: Architecture

This document describes the architecture of systems for the internet of media things.

Technologies de l'information — Internet des objets media — Partie 1: L’architecture IoMT

General Information

Status: Withdrawn
Publication date: 02-Mar-2020
Current stage: 95.99 - Withdrawal of International Standard
Completion date: 30-Mar-2022
Standards Content (Sample)

INTERNATIONAL STANDARD ISO/IEC 23093-1
First edition
2020-02
Information technology — Internet of
media things —
Part 1:
Architecture
Technologies de l'information — Internet des objets media —
Partie 1: L’architecture IoMT
Reference number
ISO/IEC 23093-1:2020(E)
© ISO/IEC 2020


COPYRIGHT PROTECTED DOCUMENT
© ISO/IEC 2020
All rights reserved. Unless otherwise specified, or required in the context of its implementation, no part of this publication may
be reproduced or utilized otherwise in any form or by any means, electronic or mechanical, including photocopying, or posting
on the internet or an intranet, without prior written permission. Permission can be requested from either ISO at the address
below or ISO’s member body in the country of the requester.
ISO copyright office
CP 401 • Ch. de Blandonnet 8
CH-1214 Vernier, Geneva
Phone: +41 22 749 01 11
Fax: +41 22 749 09 47
Email: copyright@iso.org
Website: www.iso.org
Published in Switzerland

Contents

Foreword
Introduction
1 Scope
2 Normative references
3 Terms and definitions
3.1 Internet of media things terms
3.2 Internet of things terms
4 Architecture
5 Use cases
5.1 General
5.2 Smart spaces: Monitoring and control with network of audio-video cameras
5.2.1 General
5.2.2 Human tracking with multiple network cameras
5.2.3 Automatic title generation
5.2.4 Intelligent firefighting with IP surveillance cameras
5.2.5 Networked digital signs for customized advertisement
5.2.6 Digital signage and second screen use
5.2.7 Self-adaptive quality of experience for multimedia applications
5.2.8 Ultra-wide viewing video composition
5.2.9 Face recognition to evoke sensorial actuations
5.2.10 Automatic video clip generation by detecting event information
5.2.11 Temporal synchronization of multiple videos for creating 360° or multiple view video
5.2.12 Intelligent similar content recommendations using information from IoMT devices
5.3 Smart spaces: Multi-modal guided navigation
5.3.1 General
5.3.2 Blind person assistant system
5.3.3 Personalized navigation by visual communication
5.3.4 Personalized tourist navigation with natural language functionalities
5.3.5 Smart identifier: Face recognition on smart glasses
5.3.6 Smart advertisement: QR code recognition on smart glasses
5.4 Smart audio/video environments in smart cities
5.4.1 General
5.4.2 Smart factory: Car maintenance assistance A/V system using smart glasses
5.4.3 Smart museum: Augmented visit using smart glasses
5.4.4 Smart house: Light control, vibrating subtitle, olfaction media content consumption, odour image recognizer
5.4.5 Smart car: Head-light adjustment and speed monitoring to provide automatic volume control
5.5 Smart multi-modal collaborative health
5.5.1 General
5.5.2 Increasing patient autonomy by remote control of left-ventricular assisted devices
5.5.3 Diabetic coma prevention by monitoring networks of in-body/near body sensors
5.5.4 Enhanced physical activity with smart fabrics networks
5.5.5 Medical assistance with smart glasses
5.5.6 Managing healthcare information for smart glasses
5.6 Blockchain usage for IoMT transactions authentication and monetizing
5.6.1 General
5.6.2 Reward function in IoMT people counting by using blockchains
5.6.3 Content authentication with blockchains
Annex A (informative) Mapping of the components between IoMT and IoT reference architectures
Bibliography

Foreword
ISO (the International Organization for Standardization) and IEC (the International Electrotechnical
Commission) form the specialized system for worldwide standardization. National bodies that
are members of ISO or IEC participate in the development of International Standards through
technical committees established by the respective organization to deal with particular fields of
technical activity. ISO and IEC technical committees collaborate in fields of mutual interest. Other
international organizations, governmental and non-governmental, in liaison with ISO and IEC, also
take part in the work.
The procedures used to develop this document and those intended for its further maintenance are
described in the ISO/IEC Directives, Part 1. In particular, the different approval criteria needed for
the different types of document should be noted. This document was drafted in accordance with the
editorial rules of the ISO/IEC Directives, Part 2 (see www.iso.org/directives).
Attention is drawn to the possibility that some of the elements of this document may be the subject
of patent rights. ISO and IEC shall not be held responsible for identifying any or all such patent
rights. Details of any patent rights identified during the development of the document will be in the
Introduction and/or on the ISO list of patent declarations received (see www.iso.org/patents) or the IEC list of patent declarations received (see http://patents.iec.ch).
Any trade name used in this document is information given for the convenience of users and does not
constitute an endorsement.
For an explanation of the voluntary nature of standards, the meaning of ISO specific terms and
expressions related to conformity assessment, as well as information about ISO's adherence to
the World Trade Organization (WTO) principles in the Technical Barriers to Trade (TBT), see www.iso.org/iso/foreword.html.
This document was prepared by Joint Technical Committee ISO/IEC JTC 1, Information technology,
Subcommittee SC 29, Coding of audio, picture, multimedia and hypermedia information.
A list of all parts in the ISO/IEC 23093 series can be found on the ISO website.
Any feedback or questions on this document should be directed to the user’s national standards body. A
complete listing of these bodies can be found at www.iso.org/members.html.

Introduction
The ISO/IEC 23093 series provides an architecture and specifies application programming interfaces
(APIs) and compressed representation of data flowing between media things.
The APIs for the media things facilitate discovering other media things in the network, connecting
and efficiently exchanging data between media things. The APIs also provide means for supporting
transaction tokens in order to access valuable functionalities, resources, and data from media things.
Media things related information consists of characteristics and discovery data, setup information
from a system designer, raw and processed sensed data, and actuation information. The ISO/IEC 23093
series specifies data formats of input and output for media sensors, media actuators, media storages,
media analysers, etc. Sensed data from media sensors can be processed by media analysers to produce
analysed data, and the media analysers can be cascaded in order to extract semantic information.
This document does not specify how the process of sensing and analysing is carried out but specifies
the interfaces between the media things. This document describes the architecture of systems for the
internet of media things.
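As a non-normative illustration of this data flow, the following Python sketch models sensed data passing through a cascade of media analysers, with a simple media-token check gating access to each analyser; all class, function and token names are assumptions made for this sketch and are not defined by the ISO/IEC 23093 series.

# Hypothetical, non-normative sketch of the IoMT data flow described above.
# None of these names come from ISO/IEC 23093; they only illustrate the idea
# of discovery, token-gated access and cascaded media analysers.
from dataclasses import dataclass, field
from typing import Callable, List


@dataclass
class MediaData:
    """Raw or processed sensed data flowing between media things."""
    payload: bytes
    descriptors: dict = field(default_factory=dict)  # extracted semantic information


@dataclass
class MThing:
    """A media thing exposing its characteristics for discovery."""
    name: str
    characteristics: dict


class MediaAnalyser(MThing):
    def __init__(self, name: str, analyse: Callable[[MediaData], MediaData]):
        super().__init__(name, {"type": "analyser"})
        self.analyse = analyse


def discover(registry: List[MThing], wanted_type: str) -> List[MThing]:
    """Toy discovery: find media things whose characteristics match a rough query."""
    return [t for t in registry if t.characteristics.get("type") == wanted_type]


def access(token: str, thing: MThing) -> bool:
    """Toy media-token check before using a thing's functionality (placeholder logic)."""
    return token == "valid-media-token"


def run_cascade(data: MediaData, analysers: List[MediaAnalyser], token: str) -> MediaData:
    """Cascade media analysers to progressively extract semantic information."""
    for analyser in analysers:
        if access(token, analyser):
            data = analyser.analyse(data)
    return data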
The International Organization for Standardization (ISO) and International Electrotechnical
Commission (IEC) draw attention to the fact that it is claimed that compliance with this document may
involve the use of a patent.
ISO and IEC take no position concerning the evidence, validity and scope of this patent right. The holder
of this patent right has assured ISO and IEC that he/she is willing to negotiate licences under reasonable
and non-discriminatory terms and conditions with applicants throughout the world. In this respect, the
statement of the holder of this patent right is registered with ISO and IEC. Information may be obtained
from the patent database available at www.iso.org/patents.
Attention is drawn to the possibility that some of the elements of this document may be the subject
of patent rights other than those in the patent database. ISO and IEC shall not be held responsible for
identifying any or all such patent rights.
INTERNATIONAL STANDARD ISO/IEC 23093-1:2020(E)
Information technology — Internet of media things —
Part 1:
Architecture
1 Scope
This document describes the architecture of systems for the internet of media things.
2 Normative references
There are no normative references in this document.
3 Terms and definitions
For the purposes of this document, the following terms and definitions apply.
ISO and IEC maintain terminological databases for use in standardization at the following addresses:
— ISO Online browsing platform: available at https://www.iso.org/obp
— IEC Electropedia: available at http://www.electropedia.org/
3.1 Internet of media things terms
3.1.1
audio
anything related to sound in terms of receiving, transmitting or reproducing it or of its specific
frequency
3.1.2
camera
special form of an image capture device that senses and captures photo-optical signals
3.1.3
display
visual representation of the output of an electronic device or the portion of an electronic device that
shows this representation, as a screen, lens or reticle
3.1.4
gesture
movement or position of the hand, arm, body, head or face that is expressive of an idea, opinion,
emotion, etc.
3.1.5
haptics
input or output device that senses the body's movements by means of physical contact with the user
3.1.6
image capture device
device which is capable of sensing and capturing acoustic, electrical or photo-optical signals of a
physical entity that can be converted into an image

3.1.7
internet of media things
IoMT
special subset of IoT (3.2.9) whose main functionalities are related to media processing
3.1.8
IoMT device
IoT (3.2.9) device that contains more than one MThing (3.1.12)
3.1.9
IoMT system
MSystem
IoT (3.2.9) system whose main functionality is related to media processing
3.1.10
loudspeaker
electroacoustic device, connected as a component in an audio system, generating audible acoustic waves
3.1.11
media
data that can be rendered, including audio, video, text, graphics, images, haptic and tactile information
Note 1 to entry: These data can be timed or non-timed.
3.1.12
media thing
MThing
thing (3.2.20) capable of sensing, acquiring, actuating, or processing of media or metadata
3.1.13
media token
virtual token for accessing functionalities, resources and data of media things
3.1.14
microphone
entity capable of capturing and transforming acoustic waves into changes in electric currents or voltage,
used in recording or transmitting sound
3.1.15
media wearable
MWearable
MThing (3.1.12) intended to be located near, on or in an organism
3.1.16
motion
action or process of changing place or position
3.1.17
natural user interface
NUI
system for human-computer interaction that the user operates through intuitive actions related to
natural, everyday human behaviour
3.1.18
presentation
act of producing human recognizable output of rendered media

3.2 Internet of things terms
3.2.1
actuator
component which conveys digital information to effect a change of some property of a physical entity
3.2.2
capability
characteristic or property of an entity that can be used to describe its state, appearance or other aspects
EXAMPLE An entity type, address information, a telephone number, a privilege, a MAC address and a domain name are possible attributes; see Reference [1].
3.2.3
component
modular, deployable and replaceable part of a system that encapsulates implementations
Note 1 to entry: A component may expose or use interfaces (local or on a network) to interact with other entities,
see Reference [2]. A component which exposes or uses network interfaces is called an endpoint.
3.2.4
digital entity
any computational or data element of an IT-based system
Note 1 to entry: It may exist as a service based in a data centre or cloud, or a network element or a gateway.
3.2.5
discovery
service to find unknown resources/entities/services based on a rough specification of the desired result
Note 1 to entry: It may be utilized by a human or another service; credentials for authorization are considered
when executing the discovery, see Reference [4].
3.2.6
entity
anything (physical or non-physical) having a distinct existence
3.2.7
identifier
information that unambiguously distinguishes one entity (3.2.6) from another one in a given
identity context
3.2.8
identity
characteristics determining who or what a person or thing is
3.2.9
internet of things
IoT
infrastructure of interconnected objects, people, systems and information resources together with
intelligent services to allow them to process information of the physical and the virtual world and to react
3.2.10
interface
shared boundary between two functional components, defined by various characteristics pertaining
to the functions, physical interconnections, signal exchanges, and other characteristics, as appropriate
Note 1 to entry: See Reference [5].

3.2.11
IoT system
system that comprises functions providing the capabilities for identification, sensing, actuation, communication and management, and applications and services to a user
Note 1 to entry: See Reference [7].
3.2.12
network
entity that connects endpoints, sources to destinations, and may itself act as a value-added element in
the IoT system or services
3.2.13
process
procedure to carry out operations on data
3.2.14
physical entity
thing (3.2.20) that is discrete, identifiable and observable, and that has material existence in the real world
3.2.15
reference architecture
description of common features, common vocabulary, guidelines, interrelations and interactions among
the entities, and a template for an IoT architecture
3.2.16
resource
any element of a data processing system needed to perform required operations
Note 1 to entry: See Reference [8].
3.2.17
sensor
device that observes and measures a physical property of a natural phenomenon or man-made process
and converts that measurement into a signal
Note 1 to entry: A signal can be electrical, chemical, etc., see Reference [9].
3.2.18
service
distinct part of the functionality that is provided by an entity through interfaces
Note 1 to entry: See Reference [10].
3.2.19
storage
capacity of a digital entity to store information subject to recall or the components of a digital entity in
which such information is stored
3.2.20
thing
any entity that can communicate with other entities
3.2.21
user
human or any digital entity that is interested in interacting with a particular physical object
3.2.23
visual
any object perceptible by the sense of sight

4 Architecture
The global IoMT architecture is presented in Figure 1, which identifies a set of interfaces, protocols and
associated media-related information representations related to:
— user commands (setup information) between a system manager and an MThing, with reference to
interface 1.
— user commands (setup information) forwarded by an MThing to another MThing, possibly in a
modified form (e.g., subset of 1), with reference to interface 1’.
— sensed data (raw or processed, compressed or semantically extracted) and actuation information,
with reference to interface 2.
— wrapped interface 2 (e.g., for transmission), with reference to interface 2’.
— MThing characteristics, discovery, with reference to interface 3.
Figure 1 — IoMT architecture
This IoMT architecture can be mapped to the IoT reference architecture (Reference [4]) as shown in
Annex A.
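As a rough, non-normative illustration of the five interfaces listed above, the Python sketch below defines one toy message type per interface; the type names and fields are assumptions made for illustration and are not specified by this document.

# Non-normative sketch: one toy message type per IoMT interface of Figure 1.
# All names and fields are illustrative assumptions, not part of the standard.
from dataclasses import dataclass
import json


@dataclass
class SetupCommand:            # interface 1: system manager -> MThing
    target: str
    parameters: dict


@dataclass
class ForwardedSetupCommand:   # interface 1': MThing -> MThing (possibly a subset of 1)
    origin: str
    parameters: dict


@dataclass
class MediaMessage:            # interface 2: sensed data (raw/processed) or actuation information
    kind: str                  # e.g. "raw", "compressed", "semantic", "actuation"
    payload: bytes


def wrap_for_transmission(msg: MediaMessage) -> bytes:
    """Interface 2': interface 2 data wrapped for transmission (here simply serialized)."""
    return json.dumps({"kind": msg.kind, "payload": msg.payload.hex()}).encode()


@dataclass
class MThingDescription:       # interface 3: MThing characteristics and discovery data
    name: str
    capabilities: list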
5 Use cases
5.1 General
MPEG identified 27 use-cases for IoMT; they are structured in the following five main categories:
— Smart spaces: Monitoring and control with network of audio-video cameras (see 5.2)
— human tracking with multiple network cameras
— automatic title generation
— intelligent firefighting with IP surveillance cameras
— networked digital signs for customized advertisement
— digital signage and second screen use
— self-adaptive quality of experience for multimedia applications
— ultra-wide viewing video composition
— face recognition to evoke sensorial actuations
— automatic video clip generation by detecting event information
— temporal synchronization of multiple videos for creating 360° or multiple view video
— intelligent similar content recommendations using information from IoMT devices
— Smart spaces: Multi-modal guided navigation (see 5.3)
— blind person assistant system
— personalized navigation by visual communication
— personalized tourist navigation with natural language functionalities
— smart identifier: face recognition on smart glasses
— smart advertisement: QR code recognition on smart glasses
— Smart audio/video environments in smart cities (see 5.4)
— smart factory: car maintenance assistance A/V system using smart glasses
— smart museum: augmented visit using smart glasses
— smart house: light control, vibrating subtitle, olfaction media content consumption
— smart car: head-light adjustment and speed monitoring to provide automatic volume control
— Smart multi-modal collaborative health (see 5.5)
— increasing patient autonomy by remote control of left-ventricular assisted devices
— diabetic coma prevention by monitoring networks of in-body/near body sensors
— enhanced physical activity with smart fabrics networks
— medical assistance with smart glasses
— managing healthcare information for smart glasses
— Blockchain usage for IoMT transactions authentication and monetizing (see 5.6)
— reward function in IoMT by using blockchains
— content authentication with blockchains
5.2 Smart spaces: Monitoring and control with network of audio-video cameras
5.2.1 General
The large variety of sensors, actuators, displays and computational elements acting in our day-to-day professional and private spaces to provide us with better and more easily accessible services leads to 11 use cases of interest for IoMT, mainly related to the processing of video information.
5.2.2 Human tracking with multiple network cameras
Because urban growth is today accompanied by an increase in crime rates (e.g., theft, vandalism), many
local authorities consider surveillance systems as a possible tool to fight this phenomenon. A city video
surveillance system is an IoMT system that includes a set of IP surveillance cameras, a storage unit and
a human tracker unit.
A particular IP surveillance camera captures audio-video data and sends them to both the storage and the human tracker unit. When the human tracker detects a person in the visible area, it tracks the person and extracts the moving trajectory.

If the person gets out of the visual scope of the first IP camera but stays in the area protected by the city video surveillance system, another IP camera from this system can take over control and keep capturing A/V data of the corresponding person.
If the person gets out of the protected area, for example by entering a commercial centre, the city system checks whether this commercial centre is also equipped with a video surveillance system. Should this be the case, the city video surveillance system sets up communication with the commercial centre video surveillance system in order to allow another IP camera from the commercial centre system to keep capturing A/V data of the corresponding person.
In both cases, the specific descriptors (e.g., moving trajectory information, appearance information,
media locations of detected moments) can be extracted and sent to the storage.
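A minimal, non-normative sketch of the camera hand-over logic described above is given below; the camera registry, the one-dimensional coverage model and the stored descriptor fields are hypothetical simplifications rather than anything specified by this document.

# Hypothetical sketch of the camera hand-over logic in this use case.
# Coverage is reduced to 1-D intervals purely for illustration.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Camera:
    camera_id: str
    system: str                    # "city" or, e.g., "commercial-centre"
    coverage: tuple                # (start, end) of the area the camera can see


def covering_camera(position: float, cameras: List[Camera]) -> Optional[Camera]:
    """Return a camera (city system first) whose coverage contains the person's position."""
    for cam in sorted(cameras, key=lambda c: c.system != "city"):
        start, end = cam.coverage
        if start <= position <= end:
            return cam
    return None


def track(person_positions: List[float], cameras: List[Camera], storage: list) -> None:
    """Follow a person across cameras, storing a trajectory descriptor at each hand-over."""
    active: Optional[Camera] = None
    for pos in person_positions:
        cam = covering_camera(pos, cameras)
        if cam is not active:
            active = cam
            storage.append({"camera": cam.camera_id if cam else None, "position": pos})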
5.2.3 Automatic title generation
In the sustainable smart city of Seoul, IoMT cameras (smart CCTV) are deployed around the city. These
cameras continuously capture video (24 hours a day, 7 days a week). When an unusual event such as a violent scene, crowd scene, theft scene or busking scene occurs, the title generator (event description generator) generates a title for the video clip with time and place information in real time. The generated title
is stored with the video clip in MStorage. As an example scenario, consider a CCTV capturing videos
(visual data), with time and GPS information. The title generator analyses the video stream, selects a
keyframe and combines time, GPS and keyframe to generate a formatted title. The captured video with
the generated title is sent to storage.
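A non-normative sketch of such a title generator is shown below; the keyframe selection rule and the title format are assumptions made for illustration only and are not specified by this document.

# Hypothetical sketch of the title generator: pick a keyframe and combine it
# with time and GPS information into a formatted title for the video clip.
from dataclasses import dataclass
from datetime import datetime
from typing import List


@dataclass
class Frame:
    timestamp: datetime
    gps: tuple            # (latitude, longitude)
    motion_score: float   # assumed output of an upstream analyser


def select_keyframe(frames: List[Frame]) -> Frame:
    """Toy keyframe selection: the frame with the strongest detected activity."""
    return max(frames, key=lambda f: f.motion_score)


def generate_title(event_label: str, frames: List[Frame]) -> str:
    """Combine event, time and place information into a formatted title."""
    key = select_keyframe(frames)
    lat, lon = key.gps
    return f"{event_label} at ({lat:.4f}, {lon:.4f}) on {key.timestamp:%Y-%m-%d %H:%M}"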
5.2.4 Intelligent firefighting with IP surveillance cameras
Figure 2 illustrates an example use-case of intelligent firefighting with IP surveillance cameras. In
this case, the fire station and the security manager can rapidly receive the fire/smoke detection alert,
thereby averting a potential fire hazard. Unlike conventional security systems, the outdoor scene
captured by intelligent IP surveillance cameras is immediately analysed, and an alert for the fire/smoke incident is automatically sent to the fire station based on the analysed results of the captured scene.
Figure 2 — Example use-case of intelligent firefighting
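A minimal, non-normative sketch of the automatic alerting step is given below; the detection labels, confidence threshold and notification callback are assumptions made for illustration.

# Hypothetical sketch: a scene analyser raises a fire/smoke alert to the fire
# station and the security manager; the detection itself is out of scope here.
from dataclasses import dataclass
from typing import List


@dataclass
class Detection:
    label: str          # e.g. "fire", "smoke", "none"
    confidence: float


def alert_if_needed(detections: List[Detection], notify) -> None:
    """Send an alert for every sufficiently confident fire/smoke detection."""
    for d in detections:
        if d.label in ("fire", "smoke") and d.confidence >= 0.8:
            notify(f"{d.label} detected (confidence {d.confidence:.2f})")


# Example usage: alert_if_needed(results, notify=print)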
5.2.5 Networked digital signs for customized advertisement
A camera can be either attached to or embedded in a digital screen displaying advertising content, so as
to be able to capture A/V data and send them to both a storage unit and a gaze tracking/ROI analysing
unit. When the gaze tracking/ROI analyser detects a person in front of the corresponding digital sign, it
starts to trace the eye position, calculates the corresponding region of interest on the currently played
advertisement, and deduces the person's current interest (e.g., goods) in the advertisement. When the person moves to another digital sign, that new sign starts playing a relevant advertisement according to the person's estimated interest data.
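The following non-normative Python sketch illustrates that flow, assuming a simple mapping from the gazed region of interest to an advertised product category; the class and function names are hypothetical.

# Hypothetical sketch: map the gazed region of interest on the current
# advertisement to an interest, then pick the next sign's content.
from dataclasses import dataclass
from typing import Dict


@dataclass
class Advertisement:
    ad_id: str
    regions: Dict[str, str]    # region name -> product category shown there


def estimate_interest(ad: Advertisement, gazed_region: str) -> str:
    """Deduce the person's current interest from the region they looked at."""
    return ad.regions.get(gazed_region, "generic")


def pick_next_ad(interest: str, catalogue: Dict[str, Advertisement]) -> Advertisement:
    """The next digital sign plays an advertisement matching the estimated interest."""
    for ad in catalogue.values():
        if interest in ad.regions.values():
            return ad
    return next(iter(catalogue.values()))   # fall back to any advertisement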

5.2.6 Digital signage and second screen use
This use case addresses pedestrians who want to get additional information (e.g., product information, characters, places) about the content displayed on digital signs on their mobile phones (i.e., second screens), as illustrated in Figure 3.
Figure 3 — Digital signage and second screen use-case
5.2.7 Self-adaptive quality of experience for multimedia applications
The self-adaptive multimedia application is an application running on a wearable device with middleware providing optimal quality of service (QoS) performance for each application, according to the static/dynamic status of the application and/or system resources.
The user initi
...
