Information technology — User interface — Gesture-based interfaces across devices and methods — Part 1: Framework

ISO/IEC 30113-1:2015 defines a framework and guidelines for gesture-based interfaces across devices and methods to support interoperability.

NOTE Some of these devices include mice, touch screens, touch pads, 3D mice, joysticks, game controllers, wired gloves, depth-aware cameras, stereo cameras and Web cameras.

ISO/IEC 30113-1:2015 does not define or require specific technology for recognizing the gestures of users. It focuses on the description of a gesture and its functions for utilizing ICT systems.

NOTE Operation of a physical keyboard is not addressed in this part of ISO/IEC 30113.

Technologies de l'information — Interface utilisateur — Interfaces fondées sur la gestuelle entre dispositifs et méthodes — Partie 1: Cadre

General Information

Status
Published
Publication Date
15-Apr-2015
Current Stage
9093 - International Standard confirmed
Completion Date
05-Nov-2020

Buy Standard

Standard
ISO/IEC 30113-1:2015 - Information technology — User interface — Gesture-based interfaces across devices and methods
English language
13 pages

Standards Content (Sample)

INTERNATIONAL STANDARD ISO/IEC 30113-1
First edition
2015-04-15
Information technology — User
interface — Gesture-based interfaces
across devices and methods —
Part 1:
Framework
Technologies de l’information — Interface utilisateur — Interfaces
fondées sur la gestuelle entre dispositifs et méthodes —
Partie 1: Cadre
Reference number
ISO/IEC 30113-1:2015(E)
© ISO/IEC 2015

COPYRIGHT PROTECTED DOCUMENT
© ISO/IEC 2015
All rights reserved. Unless otherwise specified, no part of this publication may be reproduced or utilized otherwise in any form
or by any means, electronic or mechanical, including photocopying, or posting on the internet or an intranet, without prior
written permission. Permission can be requested from either ISO at the address below or ISO’s member body in the country of
the requester.
ISO copyright office
Case postale 56 • CH-1211 Geneva 20
Tel. + 41 22 749 01 11
Fax + 41 22 749 09 47
E-mail copyright@iso.org
Web www.iso.org
Published in Switzerland

Contents

Foreword
Introduction
1 Scope
2 Conformance
3 Terms and definitions
4 Overview of gesture-based interface
4.1 General
4.2 User’s actions for gesture input
4.3 Gesture input device
4.4 ICT system
4.5 Cultural adaptability
4.6 Accessibility
5 Requirements and recommendations
5.1 Activating/finishing a gesture
5.2 Performing a gesture
5.3 Feedback for confirming a gesture
5.4 Feed forward
5.5 Cancelling a gesture
5.6 Criteria of gesture size
5.7 Controlling the criteria
5.8 Changing correspondence of a gesture to a gesture command
5.9 Descriptions of individual gestures within the part
Annex A (informative) Outline for describing the ISO/IEC 30113 series
Bibliography

Foreword
ISO (the International Organization for Standardization) and IEC (the International Electrotechnical
Commission) form the specialized system for worldwide standardization. National bodies that are
members of ISO or IEC participate in the development of International Standards through technical
committees established by the respective organization to deal with particular fields of technical
activity. ISO and IEC technical committees collaborate in fields of mutual interest. Other international
organizations, governmental and non-governmental, in liaison with ISO and IEC, also take part in the
work. In the field of information technology, ISO and IEC have established a joint technical committee,
ISO/IEC JTC 1.
The procedures used to develop this document and those intended for its further maintenance are
described in the ISO/IEC Directives, Part 1. In particular the different approval criteria needed for
the different types of document should be noted. This document was drafted in accordance with the
editorial rules of the ISO/IEC Directives, Part 2 (see www.iso.org/directives).
Attention is drawn to the possibility that some of the elements of this document may be the subject
of patent rights. ISO and IEC shall not be held responsible for identifying any or all such patent rights.
Details of any patent rights identified during the development of the document will be in the Introduction
and/or on the ISO list of patent declarations received (see www.iso.org/patents).
Any trade name used in this document is information given for the convenience of users and does not
constitute an endorsement.
For an explanation on the meaning of ISO specific terms and expressions related to conformity
assessment, as well as information about ISO’s adherence to the WTO principles in the Technical Barriers
to Trade (TBT), see the following URL: Foreword — Supplementary information.
The committee responsible for this document is ISO/IEC JTC 1, Information technology, Subcommittee
SC 35, User interfaces.
ISO/IEC 30113 consists of the following parts, under the general title Information technology — User
interfaces — Gesture-based interfaces across devices and methods:
— Part 1: Framework
— Part 11: Single-point gestures for common system actions

Introduction
Gestures are used for performing a variety of commands (such as scrolling a Web page up) as an
alternative input method (to typing or using a mouse to select objects).
Given the limited number of basic gestures, the same gesture is often used for a variety of different
commands in different situations. It is important that wherever possible, these different commands are
similar to one another (i.e. by having a similar effect on different objects) so that users are not confused
about what a gesture will do in a given situation.
Standardized gesture descriptions and commands minimize user confusion when interacting with
various software systems and applications on various ICT devices. This International Standard is aimed
at designers and developers of software applications.
This International Standard is intended to help users to more easily navigate and control application
software on various ICT devices by standardizing gestures and gesture commands.
This part of ISO/IEC 30113 defines a framework of gesture-based interfaces to support interoperability
among gesture-based interfaces with various input devices and methods.
Subclause A.1 gives an informative description of the structure of ISO/IEC 30113 in detail.

Information technology — User interface — Gesture-based
interfaces across devices and methods —
Part 1:
Framework
1 Scope
This part of ISO/IEC 30113 defines a framework and guidelines for gesture-based interfaces across
devices and methods in supporting interoperability.
NOTE Some of these devices include mice, touch screens, touch pads, 3D mice, joysticks, game controllers,
wired gloves, depth-aware cameras, stereo cameras, Web cameras.
This part of ISO/IEC 30113 does not define or require specific technology for recognizing the gestures of
users. It focuses on the description of a gesture and its functions for utilizing ICT systems.
NOTE Operation of a physical keyboard is not addressed in this part of ISO/IEC 30113.
2 Conformance
A gesture-based interface is conformant to this part of ISO/IEC 30113 if it meets all requirements of Clause 5.
3 Terms and definitions
For the purposes of this document, the following terms and definitions apply.
3.1
gesture
movement or posture of the whole body or parts of the body
3.2
gesture-based interface
gesture interface
user interface that provides information and controls for a user to accomplish specific tasks with the
interactive system by his/her gestures
[SOURCE: ISO 9241-171, 3.29]
3.3
gesture command
instruction to the system resulting from a gesture input by the user, e.g. select, move, delete
[SOURCE: ISO/IEC 14754:1999, 4.5]
3.4
gesture software
software for implementing gesture-based interface functionality including gesture recognition,
command processing, and feedback generation
Note 1 to entry: Gesture recognition software is usually contained within the operating system and specific
device drivers. Information on gestures that are recognized is made available to the operating system and/or the
application software, so that the intended command(s) are performed in response to the gesture.

4 Overview of gesture-based interface
4.1 General
Users can use gestures to interact with interface objects. Interface objects have representational
properties (e.g. how they are rendered to the user) and operational properties (e.g. what they do) that
can be affected by gestures.
Human-machine interaction involves a loop of execution and evaluation. The machine offers feed forward
and the user manipulates interface objects (execution). The machine displays feedback and new feed
forward (evaluation), the user adjusts the manipulation, and so on. The user produces gestures and the
machine understands them based on the properties of the gestures that it recognizes.
For a successful interaction, the machine needs an input device in order to collect gesture properties.
These properties are analysed by the gesture software, which compares them to pre-defined gesture
command properties and then decides whether to invoke the associated functions.
Figure 1 illustrates a model of human-machine interaction based on a gesture-based interface. It presents
a schematic diagram of relationships among the user, gesture command, input device and machine
(ICT system) when the user utilizes a gesture-based interface during human-machine interaction. The
gesture-based interface includes hardware (physical) and software (logical) components. The input
device is the hardware which recognizes the gesture and sends its associated input signal to the ICT
system. The gesture software finds a command which is pre-defined and mapped to the input signal.
The application software generates its feedback to the user using the output device.
[Figure 1 depicts the loop: the user acts on the input device, which sends an input signal to the gesture
software inside the ICT device; the gesture software passes a command to the application software, whose
feedback reaches the user through the output device.]
Figure 1 — Loop of human-machine interaction with a gesture-based interface
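
To make the loop concrete, the following is a minimal, non-normative Python sketch of the components in Figure 1. All identifiers and signal names (GESTURE_COMMANDS, gesture_software, "swipe_up" and so on) are illustrative assumptions; this part of ISO/IEC 30113 does not define a programming interface.

```python
# Non-normative sketch of the Figure 1 loop. All identifiers and signal
# names are illustrative assumptions; the standard does not define an API.

# Pre-defined mapping from recognized gestures to gesture commands.
GESTURE_COMMANDS = {
    "swipe_up": "scroll_up",
    "draw_L": "lock_screen",
}

def gesture_software(input_signal):
    """Find a pre-defined command mapped to the input signal."""
    return GESTURE_COMMANDS.get(input_signal)

def application_software(command):
    """Execute the command and generate feedback for the output device."""
    return f"feedback: executed '{command}'"

def interaction_loop(input_signals):
    """User acts via the input device; the machine answers with feedback."""
    for signal in input_signals:
        command = gesture_software(signal)
        if command is None:
            yield "feedback: gesture not recognized"
        else:
            yield application_software(command)

for feedback in interaction_loop(["swipe_up", "wiggle", "draw_L"]):
    print(feedback)
```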
4.2 User’s actions for gesture input
A user generates actions for gesture input: two-dimensional motions of a device relative to its supporting
surface, two-dimensional or three-dimensional finger/hand/body postures or motions in space,
postures or motions of fingers on a surface, and so on. A gesture can also be generated by a tool acting as
an extension of the body (such as a wand, a pen, a mouse, a remote control or a glove).
Some gestures are controlled by a discrete body part such as one finger, several fingers, hand movement
or fingers associated with hand movement. Facial expression, eye gaze and eyelid blinking can also provide
a user’s action for gesture input. Other gestures might be generated with the whole body or with the
coordination of several body parts, such as arms, hands and fingers.

Physiological constraints that apply to gesture generation are important to take into account before
defining gestures. For example, some gestures are difficult to produce with a mouse in the hand on
a 2D surface, but easy to produce with a finger on a 2D surface.
All gestures involve a clear and identifiable start, one or more action(s) and a clear and identifiable end
(as further discussed in A.3.4.2). Before performing a gesture, the user can initiate a gesture recognition
(where required to do so) by doing some action such as holding down a specific button on a device. Gesture
recognition might be automatically supported by the system without the need for any action beyond the
start of the gesture. The user generates a specific gesture (such as drawing an ‘L’ by moving a mouse) by
motions between a start and end state. The user ends the gesture input by arriving at some state that is
recognized by the system as indicating the end of the gesture. This end state might be included within
the gesture or might be presented with another input modality (such as a voice command).
The gesture generated by the user is then interpreted as a command by the operating system or a
specific software application when the ICT system recognizes the gesture correctly.
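
The start/action(s)/end lifecycle described above can be modelled as a small state machine. The sketch below is a non-normative illustration; the state and event names are assumptions.

```python
# Non-normative sketch of the gesture lifecycle: a clear start, one or
# more actions, and a clear end. States and events are assumed names.
from enum import Enum, auto

class GestureState(Enum):
    IDLE = auto()    # no gesture in progress
    ACTIVE = auto()  # start detected; collecting the user's motions
    ENDED = auto()   # end state reached; ready for interpretation

class GestureLifecycle:
    def __init__(self):
        self.state = GestureState.IDLE
        self.motions = []

    def on_event(self, event, data=None):
        if self.state is GestureState.IDLE and event == "start":
            # e.g. the user holds down a specific button on a device
            self.state = GestureState.ACTIVE
        elif self.state is GestureState.ACTIVE and event == "motion":
            self.motions.append(data)
        elif self.state is GestureState.ACTIVE and event == "end":
            # the end state can come from the gesture itself or from
            # another input modality, such as a voice command
            self.state = GestureState.ENDED
        return self.state

g = GestureLifecycle()
g.on_event("start")
g.on_event("motion", "down")
g.on_event("motion", "right")
print(g.on_event("end"), g.motions)  # GestureState.ENDED ['down', 'right']
```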
4.3 Gesture input device
A gesture input device receives the interactions provided by a user and generates input signals to
be interpreted by the gesture software. Examples of gesture input devices include mice, touch
screens, touch pads, 3D mice, joysticks, game controllers, wired gloves, depth-aware cameras, stereo
cameras, Web cameras and so on.
4.4 ICT system
Gesture software analyses the signals received from gesture input devices. The functions of the gesture
software include gesture recognition, command assignment and gesture feedback.
The gesture software recognizes pre-defined gestures from actions exercised by a user with a gesture
input device. Then the gesture software sends the associated gesture command to application software.
While the user generates a gesture, the gesture software might invoke a feedback signal via the ICT
system to the user. The feedback helps the user to notice whether the gesture command is properly
activated or not. The feedback might be rendered using sound, visual display and/or tactile display.
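
The three functions named above (gesture recognition, command assignment and gesture feedback) might be organized as in the following non-normative sketch; the template-matching scheme and all names are assumptions.

```python
# Non-normative sketch of the three gesture-software functions in 4.4.
# The template matching and all names here are illustrative assumptions.

def recognize(motions, templates):
    """Gesture recognition: match actions against pre-defined gestures."""
    for name, template in templates.items():
        if motions == template:
            return name
    return None

def assign_command(gesture, command_map):
    """Command assignment: map the recognized gesture to its command."""
    return command_map.get(gesture)

def feedback(message, modalities=("visual",)):
    """Gesture feedback: render via sound, visual and/or tactile display."""
    for modality in modalities:
        print(f"[{modality}] {message}")

templates = {"L_shape": ["down", "down", "right"]}
commands = {"L_shape": "lock_screen"}

gesture = recognize(["down", "down", "right"], templates)
command = assign_command(gesture, commands)
feedback(f"'{gesture}' recognized; sending '{command}'", ("visual", "audio"))
```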
4.5 Cultural adaptability
Since gestures are an input mechanism, like a keyboard or voice commands, they are subject
to internationalization/localization. Some gestures might be culturally dependent.
EXAMPLE Bulgarians nod to say “no” and shake their head for “yes”, while Americans nod to say “yes” and
shake their head for “no”.
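
One way to accommodate such differences is a locale-keyed gesture-to-meaning table, as in the non-normative sketch below; the locale codes and the mapping are illustrative only.

```python
# Non-normative sketch: locale-dependent meaning of the same head
# gestures, following the example above. Locale codes are assumptions.
HEAD_GESTURE_MEANING = {
    "en-US": {"nod": "yes", "shake": "no"},
    "bg-BG": {"nod": "no", "shake": "yes"},
}

def interpret(gesture, locale):
    table = HEAD_GESTURE_MEANING.get(locale, HEAD_GESTURE_MEANING["en-US"])
    return table[gesture]

print(interpret("nod", "bg-BG"))  # "no"
print(interpret("nod", "en-US"))  # "yes"
```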
4.6 Accessibility
Due to their complexity, some gestures might not be properly and/or completely performed by users
with disabilities and/or elderly users. When gestures are defined for an ICT system, consideration of
accessibility for all users (including users with disabilities and/or elderly users) is important.
5 Requirements and recommendations
5.1 Activating/finishing a gesture
A gesture-based interface shall provide one (or more) method(s) for activating and finishing a gesture.
EXAMPLE A mouse with two buttons is used as a gesture input device and holding down the secondary
button of the mouse activates a gesture. By releasing the button, the user’s action for gesture input is finished.
These methods may be managed by the user or automatically managed by the system.
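
The mouse-button example above might be realized as in the following non-normative sketch; the event names and the hand-off to recognition are assumptions.

```python
# Non-normative sketch of the 5.1 example: holding the secondary mouse
# button activates gesture input, releasing it finishes the gesture.
# Event names and the recognition hand-off are assumptions.

class GestureActivation:
    def __init__(self):
        self.active = False
        self.points = []

    def on_button(self, button, pressed):
        if button != "secondary":
            return None
        if pressed:                    # activate the gesture
            self.active, self.points = True, []
            return None
        if self.active:                # finish the user's gesture input
            self.active = False
            return list(self.points)   # hand off points for recognition
        return None

    def on_move(self, x, y):
        if self.active:
            self.points.append((x, y))

ga = GestureActivation()
ga.on_button("secondary", True)
ga.on_move(0, 0)
ga.on_move(40, 2)
print(ga.on_button("secondary", False))  # [(0, 0), (40, 2)]
```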

5.2 Performing a gesture
A gesture-based interface shall provide one (or more) method(s) for making a gesture.
NOTE Making a gesture is valid only when the system is actively receiving gesture input. In some systems,
the receipt of gesture input by a system can be activated and deactivated by the user.
EXAMPLE In a specific mouse gesture, the method for gesture formation is to move the mouse horizontally
or vertically within one stroke.
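
For the mouse example above, classifying a stroke could look like the non-normative sketch below; the pixel threshold is an arbitrary assumption.

```python
# Non-normative sketch of the 5.2 example: classifying a single mouse
# stroke as horizontal or vertical. The threshold is an assumption.

def stroke_direction(points, threshold=10):
    """Return 'horizontal', 'vertical' or None for an ambiguous stroke."""
    (x0, y0), (x1, y1) = points[0], points[-1]
    dx, dy = abs(x1 - x0), abs(y1 - y0)
    if dx >= threshold and dx > dy:
        return "horizontal"
    if dy >= threshold and dy > dx:
        return "vertical"
    return None

print(stroke_direction([(0, 0), (40, 3)]))  # horizontal
print(stroke_direction([(0, 0), (2, 35)]))  # vertical
```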
5.3 Feedback for confirming a gesture
A gesture-based interface should provide one (or more) feedback signal(s) to notify the user of the
current state of performing the gesture.
NOTE 1 Feedback can represent several states such as interface object selection, interface object activation,
interface object manipulation, gesture command initialisation state, gesture command performing state, gesture
command ending state and feedback about function execution.
Feedback should be expressed through one or more of the visual, tactile or audible modalities.
NOTE 2 When focus indicates an object is selected, the gesture command will apply to that specific object.
EXAMPLE 1 A visual trail line showing the movement of the pointer (mouse pointer) is displayed on a screen
when a gesture is performed.
EXAMPLE 2 An ICT system makes a sound as a signal announcing that the gesture command is recognized.
EXAMPLE 3 Changes to an object’s state are displayed after they are made by a gesture.
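
State-dependent feedback across modalities might be organized as in the following non-normative sketch; the states follow NOTE 1, while the messages and modality choices are assumptions.

```python
# Non-normative sketch of state-dependent feedback (5.3). The states
# follow NOTE 1; messages and modality choices are assumptions.
FEEDBACK = {
    "initialised": ("visual", "gesture input started"),
    "performing":  ("visual", "trail line follows the pointer"),
    "ended":       ("audio",  "chime: gesture command recognized"),
    "executed":    ("visual", "object state change displayed"),
}

def notify(state):
    modality, message = FEEDBACK[state]
    print(f"[{modality}] {message}")

for state in ("initialised", "performing", "ended", "executed"):
    notify(state)
```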
5.4 Feed forward
A gesture-based interface should provide clear feed forward signals to notify the user of what kinds of
gestures can be performed and when they can be performed.
NOTE As gestures are dynamic, dynamic feed forward is more effective.
EXAMPLE A visual clue helps the user to identify that a certain interface object can respond to some
gestural shortcuts.
5.5 Cancelling a gesture
The gesture-based interface should provide at least one cancellation method that can be used during the
input of a gesture.
EXAMPLE If gestural input exceeds a specified time limit, the gesture command is cancelled.
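
The time-limit example could be realized as in the non-normative sketch below; the 2 s limit is an arbitrary assumption.

```python
# Non-normative sketch of the 5.5 example: cancel the gesture when input
# exceeds a specified time limit. The 2 s limit is an assumption.
import time

class CancellableGesture:
    TIME_LIMIT_S = 2.0

    def __init__(self):
        self.started_at = time.monotonic()
        self.cancelled = False

    def on_input(self):
        """Return False once the time limit has been exceeded."""
        if time.monotonic() - self.started_at > self.TIME_LIMIT_S:
            self.cancelled = True  # the gesture command is cancelled
        return not self.cancelled
```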
5.6 Criteria of gesture size
To minimize misunderstanding of gesture input, the gesture-bas
...
