EN ISO 9241-960:2017
Ergonomics of human-system interaction - Part 960: Framework and guidance for gesture interactions (ISO 9241-960:2017)
ISO 9241-960:2017 gives guidance on the selection or creation of the gestures to be used in a gesture interface. It addresses the usability of gestures and provides information on their design, the design process and relevant parameters that are to be considered. In addition, it provides guidance on how gestures should be documented. This document is concerned with gestures expressed by a human and not with the system response generated when users are performing these gestures.
NOTE 1 Specific gestures are standardized within ISO/IEC 14754 and the ISO/IEC 30113 series.
NOTE 2 Input devices such as tablets or spatial gesture recognition devices can capture gestures in 2D or 3D. All human gestures are 3D.
Ergonomie der Mensch-System-Interaktion - Teil 960: Rahmen und Anleitung zur Gestensteuerung (ISO 9241-960:2017)
Ergonomie de l'interaction homme-système - Partie 960: Cadre et lignes directrices relatives aux interactions gestuelles (ISO 9241-960:2017)
Ergonomija medsebojnega vpliva človek-sistem - 960. del: Okvir in navodila za interakcijo kretenj (ISO 9241-960:2017)
Standards Content (Sample)
SLOVENSKI STANDARD
01 February 2018
Ergonomija medsebojnega vpliva človek-sistem - 960. del: Okvir in navodila za interakcijo kretenj (ISO 9241-960:2017)
Ergonomics of human-system interaction - Part 960: Framework and guidance for
gesture interactions (ISO 9241-960:2017)
Ergonomie der Mensch-System-Interaktion - Teil 960: Rahmen und Anleitung zur
Gestensteuerung (ISO 9241-960:2017)
Ergonomie de l'interaction homme-système - Partie 960: Cadre et lignes directrices
relatives aux interactions gestuelles (ISO 9241-960:2017)
This Slovenian standard is identical to: EN ISO 9241-960:2017
ICS:
13.180 Ergonomija / Ergonomics
35.180 Terminalska in druga periferna oprema IT / Terminal and other IT peripheral equipment
Slovenski inštitut za standardizacijo. Reproduction of the whole or parts of this standard is not permitted.
EN ISO 9241-960
EUROPEAN STANDARD
NORME EUROPÉENNE
October 2017
EUROPÄISCHE NORM
ICS 13.180; 35.180
English Version
Ergonomics of human-system interaction - Part 960:
Framework and guidance for gesture interactions (ISO
9241-960:2017)
Ergonomie de l'interaction homme-système - Partie 960: Cadre et lignes directrices relatives aux interactions gestuelles (ISO 9241-960:2017)
Ergonomie der Mensch-System-Interaktion - Teil 960: Rahmen und Anleitung zur Gestensteuerung (ISO 9241-960:2017)
This European Standard was approved by CEN on 6 July 2017.
CEN members are bound to comply with the CEN/CENELEC Internal Regulations which stipulate the conditions for giving this
European Standard the status of a national standard without any alteration. Up-to-date lists and bibliographical references
concerning such national standards may be obtained on application to the CEN-CENELEC Management Centre or to any CEN
member.
This European Standard exists in three official versions (English, French, German). A version in any other language made by
translation under the responsibility of a CEN member into its own language and notified to the CEN-CENELEC Management
Centre has the same status as the official versions.
CEN members are the national standards bodies of Austria, Belgium, Bulgaria, Croatia, Cyprus, Czech Republic, Denmark, Estonia,
Finland, Former Yugoslav Republic of Macedonia, France, Germany, Greece, Hungary, Iceland, Ireland, Italy, Latvia, Lithuania,
Luxembourg, Malta, Netherlands, Norway, Poland, Portugal, Romania, Serbia, Slovakia, Slovenia, Spain, Sweden, Switzerland,
Turkey and United Kingdom.
EUROPEAN COMMITTEE FOR STANDARDIZATION
COMITÉ EUROPÉEN DE NORMALISATION
EUROPÄISCHES KOMITEE FÜR NORMUNG
CEN-CENELEC Management Centre: Avenue Marnix 17, B-1000 Brussels
© 2017 CEN. All rights of exploitation in any form and by any means reserved worldwide for CEN national Members. Ref. No. EN ISO 9241-960:2017 E
Contents
European foreword
European foreword
This document (EN ISO 9241-960:2017) has been prepared by Technical Committee ISO/TC 159
“Ergonomics” in collaboration with Technical Committee CEN/TC 122 “Ergonomics”, the secretariat of
which is held by DIN.
This European Standard shall be given the status of a national standard, either by publication of an
identical text or by endorsement, at the latest by April 2018, and conflicting national standards shall be
withdrawn at the latest by April 2018.
Attention is drawn to the possibility that some of the elements of this document may be the subject of
patent rights. CEN shall not be held responsible for identifying any or all such patent rights.
According to the CEN-CENELEC Internal Regulations, the national standards organizations of the
following countries are bound to implement this European Standard: Austria, Belgium, Bulgaria,
Croatia, Cyprus, Czech Republic, Denmark, Estonia, Finland, Former Yugoslav Republic of Macedonia,
France, Germany, Greece, Hungary, Iceland, Ireland, Italy, Latvia, Lithuania, Luxembourg, Malta,
Netherlands, Norway, Poland, Portugal, Romania, Serbia, Slovakia, Slovenia, Spain, Sweden, Switzerland,
Turkey and the United Kingdom.
Endorsement notice
The text of ISO 9241-960:2017 has been approved by CEN as EN ISO 9241-960:2017 without any
modification.
INTERNATIONAL STANDARD ISO 9241-960
First edition
2017-09
Ergonomics of human-system
interaction —
Part 960:
Framework and guidance for gesture
interactions
Ergonomie de l'interaction homme-système —
Partie 960: Cadre et lignes directrices relatives aux interactions
gestuelles
Reference number: ISO 9241-960:2017(E)
© ISO 2017
© ISO 2017, Published in Switzerland
All rights reserved. Unless otherwise specified, no part of this publication may be reproduced or utilized otherwise in any form
or by any means, electronic or mechanical, including photocopying, or posting on the internet or an intranet, without prior
written permission. Permission can be requested from either ISO at the address below or ISO’s member body in the country of
the requester.
ISO copyright office
Ch. de Blandonnet 8 • CP 401
CH-1214 Vernier, Geneva, Switzerland
Tel. +41 22 749 01 11
Fax +41 22 749 09 47
copyright@iso.org
www.iso.org
Contents

Foreword
Introduction
1 Scope
2 Normative references
3 Terms and definitions
4 General
4.1 Need for a standard on gesture usability
4.2 Usage
4.3 Intentional and unintentional gestures
4.4 Matching gestures and functionality
5 Ergonomics of gestures
5.1 Ergonomic constraints and features
5.2 Device capabilities
5.3 Device constraints
6 Guidance in defining gestures
6.1 Process for gesture definition
6.1.1 General
6.1.2 Exploring the design space
6.1.3 Identifying purposes
6.1.4 Designing gestures and gesture commands
6.1.5 Organizing gesture sets
6.1.6 Evaluating gestures
6.1.7 Iterating the gesture interface
6.1.8 Documenting gestures
6.1.9 Explaining gestures
6.2 Features of gestures
6.2.1 Mapping of gesture commands to functions
6.2.2 Nested gestures
6.2.3 Feedback for stroke gestures
6.2.4 Continuous feedback for gesture commands
6.2.5 Use of feedforward information for stroke gestures
6.2.6 Parameters of gesture commands
6.3 Timing and speed
6.3.1 Recognition of a gesture at different speeds
6.3.2 Use of the speed of a gesture
6.4 Tolerance of gesture interface
6.5 Sequences of gestures
6.5.1 Beginning a gesture
6.5.2 Feedback on gesture initiation
6.5.3 Completing the purpose of a gesture
6.5.4 Feedback on gesture completion
6.5.5 The need for transition between gestures
6.5.6 The effect of transitions between gestures
6.5.7 Overlapping gestures
6.5.8 State changes
6.6 Gesture sets
6.6.1 General
6.6.2 Purpose of a set of gestures
6.6.3 Consistency among gestures
6.6.4 Discriminability of gestures
6.6.5 Subsets within a gesture set
6.6.6 Alternative subsets within a gesture set
6.7 Documentation of gestures
6.7.1 Documentation
6.7.2 Naming a gesture
6.7.3 Visualization of gestures
6.7.4 Textual documentation of a gesture
6.7.5 Describing the purpose of the gesture
6.7.6 Documenting a gesture set
6.7.7 Documenting gestures with common movements
Annex A (informative) When to use applications of gestures and gesture commands
Annex B (informative) Taxonomies for documentation of gestures
Bibliography
Foreword
ISO (the International Organization for Standardization) is a worldwide federation of national standards
bodies (ISO member bodies). The work of preparing International Standards is normally carried out
through ISO technical committees. Each member body interested in a subject for which a technical
committee has been established has the right to be represented on that committee. International
organizations, governmental and non-governmental, in liaison with ISO, also take part in the work.
ISO collaborates closely with the International Electrotechnical Commission (IEC) on all matters of
electrotechnical standardization.
The procedures used to develop this document and those intended for its further maintenance are
described in the ISO/IEC Directives, Part 1. In particular the different approval criteria needed for the
different types of ISO documents should be noted. This document was drafted in accordance with the
editorial rules of the ISO/IEC Directives, Part 2 (see www.iso.org/directives).
Attention is drawn to the possibility that some of the elements of this document may be the subject of
patent rights. ISO shall not be held responsible for identifying any or all such patent rights. Details of
any patent rights identified during the development of the document will be in the Introduction and/or
on the ISO list of patent declarations received (see www.iso.org/patents).
Any trade name used in this document is information given for the convenience of users and does not
constitute an endorsement.
For an explanation on the voluntary nature of standards, the meaning of ISO specific terms and
expressions related to conformity assessment, as well as information about ISO's adherence to the
World Trade Organization (WTO) principles in the Technical Barriers to Trade (TBT) see the following
URL: www.iso.org/iso/foreword.html.
This document was prepared by Technical Committee ISO/TC 159, Ergonomics, Subcommittee SC 4,
Ergonomics of human-system interaction.
A list of all parts in the ISO 9241 series can be found on the ISO website.
Introduction
Tactile and haptic interactions are becoming increasingly important as candidate interaction modalities
in computer systems such as special purpose computing environments (e.g. tablets), wearable
technology (e.g. tactile arrays, instrumented gloves), and assistive technologies.
Tactile and haptic devices are being developed in university and industrial laboratories in many
countries. Both the developer and the prospective purchaser of such devices need a means of making
comparisons between competing devices and common design of interactions.
This document focuses on gestures and identification of gesture sets as a specific type of tactile/haptic
interaction. It explains how to describe their features, and what factors to take into account when
defining gestures.
ISO 9241-910 provides a common set of terms, definitions and descriptions of the various concepts
central to designing and using tactile/haptic interactions. It also provides an overview of the range of
tactile/haptic applications, objects, attributes, and interactions.
ISO 9241-920 provides basic guidance (including references to related standards) in the design of
tactile/haptic interactions.
ISO 9241-940 (under preparation) is to provide ways of evaluating tactile/haptic interactions for various
aspects of interaction quality (such as haptic device attributes, logical space design and usability).
INTERNATIONAL STANDARD ISO 9241-960:2017(E)
Ergonomics of human-system interaction —
Part 960:
Framework and guidance for gesture interactions
1 Scope
This document gives guidance on the selection or creation of the gestures to be used in a gesture
interface. It addresses the usability of gestures and provides information on their design, the design
process and relevant parameters that are to be considered. In addition, it provides guidance on how
gestures should be documented. This document is concerned with gestures expressed by a human and
not with the system response generated when users are performing these gestures.
NOTE 1 Specific gestures are standardized within ISO/IEC 14754 and the ISO/IEC 30113 series.
NOTE 2 Input devices such as tablets or spatial gesture recognition devices can capture gestures in 2D or 3D.
All human gestures are 3D.
2 Normative references
The following documents are referred to in the text in such a way that some or all of their content
constitutes requirements of this document. For dated references, only the edition cited applies. For
undated references, the latest edition of the referenced document (including any amendments) applies.
ISO 9241-910, Ergonomics of human-system interaction — Part 910: Framework for tactile and haptic
interaction
3 Terms and definitions
For the purposes of this document, the terms and definitions given in ISO 9241-910 and the
following apply.
ISO and IEC maintain terminological databases for use in standardization at the following addresses:
— ISO Online browsing platform: available at http://www.iso.org/obp
— IEC Electropedia: available at http://www.electropedia.org/
3.1
feedforward gesture information
information provided by the gesture interface (3.4) to maintain consistency of a body part’s movement
with predicted single or multiple gesture trajectories
EXAMPLE A gesture might be visualized through inking the trajectory on the display. Several choices of
possible future trajectories can be inked, thereby helping the user to complete the gesture.
Note 1 to entry: Feedforward gestural information improves self-explanation of the gestural interface.
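By way of illustration only, a minimal sketch of the inking approach described in the example (the template names, coordinate space and matching threshold are hypothetical; practical recognizers resample and normalize strokes before matching):

    import math

    # Hypothetical gesture templates: each name maps to an ordered list of (x, y)
    # points in a normalized coordinate space.
    TEMPLATES = {
        "circle": [(math.cos(2 * math.pi * i / 16), math.sin(2 * math.pi * i / 16)) for i in range(17)],
        "swipe_right": [(i / 16, 0.0) for i in range(17)],
    }

    def prefix_distance(partial_stroke, template):
        # Mean distance between the partial stroke and the same-length template prefix.
        n = min(len(partial_stroke), len(template))
        if n == 0:
            return 0.0
        return sum(math.dist(partial_stroke[i], template[i]) for i in range(n)) / n

    def feedforward_candidates(partial_stroke, max_distance=0.5):
        # Return (gesture name, remaining trajectory) pairs; the remaining
        # trajectories can be inked on the display as feedforward information.
        return [(name, template[len(partial_stroke):])
                for name, template in TEMPLATES.items()
                if prefix_distance(partial_stroke, template) <= max_distance]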
3.2
gesture
movement or posture, of the whole body or parts of the body
Note 1 to entry: Operation of a physical keyboard is not addressed in this document.
[SOURCE: ISO/IEC 30113-1, 3.1]
3.3
gesture command
instruction to the system resulting from a gesture input by the user, e.g. select, move, delete
[SOURCE: ISO/IEC 14754:1999, 4.5]
3.4
gesture interface
user interface that provides information and controls for a user to accomplish specific tasks with the
interactive system by his/her gestures (3.2)
[SOURCE: ISO 9241-171:2008, 3.29 — Modified]
3.5
gesture set
grouping of gestures and their mapping to gesture commands (3.3)
EXAMPLE The conductor of a virtual orchestra uses a gesture set for a music performance.
3.6
intentional gesture
movement of the body or parts of the body to achieve a purpose
3.7
stroke gesture
intentional gesture (3.6) consisting of a movement trajectory of any part of the body
Note 1 to entry: As with other gestures, the definition refers to the movement itself, rather than its effect.
Different gesture commands, including direct manipulation, could be defined for a stroke gesture.
Note 2 to entry: The gesture command is not dependent on the extent of the movement trajectory.
Note 3 to entry: Pressure can be used as a parameter of the gesture.
3.8
direct manipulation
dialogue technique by which the user has the impression of acting directly on objects on the screen; for
example by pointing at them, moving them and/or changing their physical characteristics (or values)
via the use of an input device
[SOURCE: ISO 9241-16:1999, 3.6]
4 General
4.1 Need for a standard on gesture usability
When pointing devices such as the mouse were developed in the 1960s, movement of the human hand
became part of interactive systems. It took until the mid-1980s for the mouse to become standard in the
office context. With the advent of multi-touch displays and 3D cameras, gestures appear to be a highly
usable alternative to a tiny keyboard on a mobile device. The wide use of gestural interfaces makes it
important to consider their usability.
4.2 Usage
Gestures may accompany language in order to strengthen what has been said. Such gestures are
described in linguistics as “deixis” (pronounced “dīk-sis” or “dāk-sis”). The term “deixis” refers to
words such as in “Put that there” which require contextual information provided by pointing in order
to be fully understood. Gestures may convey their own meaning inherent to the actual movement of
some body part and independent of some tangible physical object such as a pen or mouse. When using
a pointing device while gesturing, the information and communication technology (ICT) system often
restricts the movements because of limitations in the ability of the movement-tracking device. Gestures,
like language, are culture-specific and misunderstandings may arise from inappropriate use of them.
4.3 Intentional and unintentional gestures
In designing gesture sets, emphasis is often placed on adopting gestures that are intentional or
unintentional with respect to the system. A typical example of an intentional gesture is pointing at
an object in order to select it, or waving your hand in front of a door to open it. Unintentional gestures
in this context are gestures made for some other purpose (e.g. walking towards an automatic door,
sitting down in the driver seat of a car), or gestures made subconsciously (e.g. body language). Such
unintentional learnable gestures are particularly suited to general situations where the user might not
be trained, when the user must learn the system quickly, or when the user must use the system under
conditions of stress (e.g. time pressure).
Intentionality in gestures could also enable increased discriminability between them, thereby reducing
inadvertent activation. For example, when it is desired not to activate an automatic door, many people
stand still and avoid gesturing in front of the doors, knowing they are prone to open unintentionally.
4.4 Matching gestures and functionality
A gesture is the result of the user’s intention to create a message for a recipient or computer while
mapping it to the movement of the body or parts of the body, typically the upper limbs. Figure 1
illustrates variations of the intention applicable when gestures are expressed to an ICT system. The
user on the left is interacting with a gesture interface on the right, using a selection of gestures from
a gesture set. The user has an intention to transmit, and can make use of posture and movement.
His choice of gestures may be intentional, or unintentional, depending on the situation. The gesture
interface could provide feedback on the system's interpretation of the gesture, or even feedforward
information to aid the user in completing the gesture (see 6.2 for further guidance on gesture features).
There is a continuum between interpreting gestures when controlling physical artefacts, such as
directly manipulating a slider, and interpreting a gesture as some abstract symbol. Another continuum
of mappings exists between matching gesture sets with the functionality of an interactive system
overall and its context of use.
Identification of unintentional gestures is often avoided by requiring the user to signal the start and
end of a gesture explicitly through some technical approach such as touching/releasing a screen with
the fingers. All such touches will be interpreted as intentional gestures.
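As an illustration only, a minimal sketch of such explicit delimiting, assuming hypothetical touch-event callbacks and a recognizer supplied by the application:

    # Touch-down and touch-up explicitly delimit a gesture, so every captured
    # stroke is treated as intentional.
    class TouchDelimitedCapture:
        def __init__(self, recognize):
            self.recognize = recognize   # callback: list of (x, y, t) -> gesture command or None
            self.points = None           # None while no gesture is in progress

        def on_touch_down(self, x, y, t):
            self.points = [(x, y, t)]    # explicit start signal

        def on_touch_move(self, x, y, t):
            if self.points is not None:
                self.points.append((x, y, t))

        def on_touch_up(self, x, y, t):
            if self.points is None:
                return None
            self.points.append((x, y, t))
            stroke, self.points = self.points, None   # explicit end signal
            return self.recognize(stroke)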
Mappings should take existing manual operations such as handwriting into account. Simple handwriting
might be applicable to gestural interpretation but, typically, handwritten language is far more complex
than a gesture vocabulary.
The matching process is applicable to user-centred design principles and, therefore, evaluation
methods can be applied. ISO 9241-940 provides guidelines on how to evaluate gestures to be used with
tactile/haptic devices. Some user groups can have special needs. In addressing them, a special set of
gestures might be required, or completely different input alternatives might be needed.
EXAMPLE 1 A multi-touch gesture consisting of circulating thumb and forefinger around each other while
touching a screen can be interpreted as a gesture command to change the orientation of an image. However, it
can also be seen as the direct manipulation of the image's orientation if its presentation is updated continuously.
EXAMPLE 2 Switching between intentional and unintentional gestures occurs commonly on haptic devices.
Blind people read braille with a finger while touching a braille display. At the same time such finger movements
can express some intentional or unintentional gesture, if the braille display is touch sensitive. Technically,
disambiguation can be based on the position or speed of movement of the finger over the tactile display. On the one hand, the reader can read braille without being aware of any such monitoring; on the other hand, the reader can intentionally turn reading movements into gestural input.
NOTE The overall gestural interaction between the user and the ICT system is not discussed here and
requires further guidance.
Figure 1 — Overview of gestures made by a user for a gesture interface
5 Ergonomics of gestures
5.1 Ergonomic constraints and features
Gestures that are performed repetitively should not create unnecessary fatigue in the body parts which
are to be posed or moved during the gesture.
a) Users should be involved in determining the need for such repetitions.
b) If repetitive gesturing is unavoidable, hazard identification, risk estimation, risk evaluation and
risk reduction should be performed in order to avoid musculoskeletal disorders.
NOTE 1 On vertically mounted touch-screens, the "gorilla-arm syndrome" can be observed after long periods of gestural input; it refers to fatigue caused by placing and moving the unsupported arms in front of the body.
NOTE 2 Children and those with reduced dexterity and joint mobility might produce less pronounced gestures.
5.2 Device capabilities
A device for receiving gestures should have the capability of detecting the trajectory of a stroke or a pose
within all conditions imposed by the environment. The gesture set is defined for the entire context of use.
NOTE A single touch or multiple touches at the same time are examples of poses to be recognized by all
devices capable of recognizing gestures.
5.3 Device constraints
A device for receiving gestures can restrict the trajectory or pose that a human intends to form. The
user should be made aware of these restrictions.
NOTE 1 Digitizing pens can be designed to write when the tip approaches a surface and delete if the opposite
end of the pen is used (eraser end). The reference point for pen gestures is either the tip of the pen or the eraser.
The user can be made aware of the spatial volume within which the gesture could be performed by designing the
grip of a pen symmetrically or asymmetrically.
NOTE 2 When using a camera based system for 3D gesture recognition, the user benefits from being made
aware of the area the camera is able to cover.
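For illustration only, a minimal sketch of making the user aware of the covered area, assuming a hypothetical camera working volume expressed in metres in camera coordinates:

    # Bounds of the volume the camera is able to cover (hypothetical values).
    WORKING_VOLUME = {"x": (-0.8, 0.8), "y": (-0.5, 0.5), "z": (0.5, 2.0)}

    def inside_working_volume(position):
        # position: (x, y, z) of the tracked body part in camera coordinates.
        return all(lo <= p <= hi for p, (lo, hi) in zip(position, WORKING_VOLUME.values()))

    def coverage_feedback(position):
        # Returns a prompt when the tracked body part leaves the covered volume.
        if inside_working_volume(position):
            return None
        return "Move back into the sensing area to continue gesturing."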
6 Guidance in defining gestures
6.1 Process for gesture definition
6.1.1 General
The process for gesture definition shall follow the guidance provided by ISO 9241-210 and should
consider the principles outlined in ISO 9241-110 where appropriate.
Gestures have particular benefits and drawbacks with respect to the seven ergonomic principles expressed in ISO 9241-110, as illustrated in the following.
a) Gestural input could be the only style of interaction with a system suitable for completing tasks
(the primary task).
b) Gestural interfaces can support self-descriptiveness by, for example, feedforward of gestural
information, but they often require memorization.
c) Gestures should be suitable for learning and may become procedural knowledge. However, gestures
tend to be forgotten if not used regularly. This might be addressed by documentation of gesture
sets, or by a training sequence whereby gestures might be learned with the system in “safe” mode.
d) Controllability of gestural interaction is often limited, since an aborted gesture is an incomplete
gesture and hence no gesture command can be determined. A gesture set may be combined with
accompanying feedback supporting gesture formation in order to improve the usability of the
interaction.
e) Consistency of gestures could depend on the context of use and the device when gestures are
being used. Consistency can be improved if pre-existing gestures can be utilized in designing an
interactive system.
f) Gestures are suitable for individualization, for example, by accepting user-defined gestures
and by provision of mechanisms to change the mapping of gestures and gesture commands.
Individualization can also be achieved by designing several gestures for the same purpose.
g) In order to achieve error tolerance, users should be made aware of the device's ability to process
intentional gestures.
6.1.2 Exploring the design space
6.1.2.1 Explore design space generally
The potentially available design alternatives, including the design rationale, should be explored for the
intended users and contexts of use.
6.1.2.2 Widely explore human movements
a) The investigation should include not only the hands, but also limbs, and full body movements as
well as head and eye movements and other facial expressions.
b) The gesture interface typically needs to be useable for a wide range of users. Alternative body parts
for gestures, range of motion, tolerance of tremors, ability for simultaneous action, and ability to
walk should be considered for better accessibility.
EXAMPLE 1 Often, the forefinger of the dominant hand is considered for pointing. However, the thumb or forefinger of the non-dominant hand might be used for pointing equally well. In some cultures, pointing with the chin is common and natural.
EXAMPLE 2 To repeat sound information, a pointing gesture (click on a button) can be applied, but shaking a
hand-held device might be used instead.
EXAMPLE 3 To get information about a location, you might click on a point of interest in a map, while an
alternative way to request the same information is to walk to the location in question.
EXAMPLE 4 A drawing can be generated with a hand-held pen (hand gestures) while a person without hands
may use their foot to accomplish the same task.
EXAMPLE 5 On-screen gestures designed for two-handed use may be performed in a one-handed manner by a
person holding an object in the other hand.
6.1.2.3 Explore single and synchronized simultaneous movements
Exploration of gestures should consider not only movement of single body parts but also synchronized
coordinated movements of multiple body parts.
EXAMPLE 1 A multitouch gesture, such as dragging fingers together, may be more suitable for grouping than
dragging items individually using single touch.
EXAMPLE 2 A multitouch gesture such as pointing by the forefinger and tapping by another finger can allow
blind people to explore a mobile device by spoken feedback and to subsequently select an item.
EXAMPLE 3 Using two hands (e.g. clapping) can be an intuitive gesture of command (e.g. attention).
6.1.2.4 Explore simultaneous and sequential movements
Exploration of gestures should consider both simultaneous and sequential movements.
EXAMPLE A user communicating with an assistive robot may first point to an object of interest, then gesture
for the robot to “fetch” the object.
6.1.2.5 Explore movements made by multiple users
a) The design should include consideration of gestures performed by multiple users independently as
well as gestures formed collaboratively by two or more users.
b) Social acceptance of gestures should be considered. Gestures performed in the personal space
might be considered inappropriate.
EXAMPLE 1 A handshake can be used in a gestural interface between a user and a system that is aware of
social signs or formalities.
EXAMPLE 2 Social robots can be programmed to perceive, interpret, and return a head-bowing gesture.
6.1.3 Identifying purposes
Developers shall identify the purposes for which humans need to express gestures in relation to the
ICT system.
EXAMPLE The volume of a TV might be changed by a gesture; other functions such as channel changing,
muting or initiating recording could be considered.
6.1.4 Designing gestures and gesture commands
6.1.4.1 Developer-defined gestures
Developers should identify at least one gesture for each gesture command based on the following
sources of gestures, in order of descending priority:
a) pre-existing gesture in the culture;
b) internationally standardized gestures;
c) gestures suggested by one or many users;
d) gestures suggested by the context of use;
e) gestures from other contexts of use;
f) gestures typical to the devices being used;
g) gestures suggested within a design team.
People sharing the same culture or social setting often share the same gestures. When designing a
gesture set, these pre-existing gestures should be identified and considered for inclusion in a gesture set.
NOTE Gestures may serve different purposes in completing a task depending on different social or cultural
contexts. Most gestures do not have invariable or universal intention.
6.1.4.2 User-defined gestures
User-defined gestures can be used as alternatives for pre-defined gesture commands or as additional
gestures for other outcomes/actions/functions.
User-defined gesture sets should be available over a wide range of contexts of use and across devices
when appropriate.
NOTE 1 User-defined gestures support accessibility in that they allow users to tailor the gestures to their
abilities and needs.
NOTE 2 Some users might want to modify the range of movements to be considered within a gesture set.
6.1.4.3 Identify variability of gestures
The error tolerance of the gesture movement should be determined and shall be commensurate with 6.6.4.
EXAMPLE A database of gestures recorded from users in a particular context might be analysed to identify
the similarity of each gesture. Variability is often related to differences in speed of movement, size, orientation,
and the body parts involved.
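As an illustration of such an analysis, a minimal sketch (the data format is hypothetical: each recorded sample is a list of (x, y, t) points, and at least two samples per gesture are assumed) that summarizes variability in speed, size and orientation:

    import math
    import statistics

    def stroke_features(points):
        # points: list of (x, y, t). Returns (mean speed, size, orientation).
        path = sum(math.dist(points[i][:2], points[i + 1][:2]) for i in range(len(points) - 1))
        duration = points[-1][2] - points[0][2]
        xs, ys = [p[0] for p in points], [p[1] for p in points]
        size = math.dist((min(xs), min(ys)), (max(xs), max(ys)))   # bounding-box diagonal
        orientation = math.atan2(points[-1][1] - points[0][1],
                                 points[-1][0] - points[0][0])     # start-to-end direction
        return (path / duration if duration > 0 else 0.0, size, orientation)

    def variability(samples):
        # samples: recorded strokes of the same gesture; returns the standard
        # deviation per feature (angle wrap-around is ignored for simplicity).
        speeds, sizes, orientations = zip(*(stroke_features(s) for s in samples))
        return {
            "speed": statistics.stdev(speeds),
            "size": statistics.stdev(sizes),
            "orientation": statistics.stdev(orientations),
        }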
6.1.5 Organizing gesture sets
Gestures may be organized into a gesture set according to guidance in 6.6.
6.1.6 Evaluating gestures
Developer-defined gestures shall be evaluated to ensure they meet the needs of the intended users
within the intended contexts of use.
NOTE ISO 9241-940 provides further information on evaluation of tactile/haptic interaction.
EXAMPLE 1 A gesture is effective if users are able to identify the purpose of the gesture.
EXAMPLE 2 When developing new stroke gestures, low-fidelity prototyping, such as painting gestures by
fingers, might be utilized to identify the gestures.
EXAMPLE 3 Video sketching allows recording a mock-up of gesturing together with spoken commands, such
as the explanation “next page”.
EXAMPLE 4 A "Wizard of Oz" study allows mocking up of gesturing together with the system response
mimicked by a human.
6.1.7 Iterating the gesture interface
The design process of the gesture interface should be iterative according to ISO 9241-210, taking both
the interactive system and its context of use into account.
6.1.8 Documenting gestures
Documentation of gestures should be suitable for the intended users and contexts of use (see 6.7).
6.1.9 Explaining gestures
Gestures and gesture commands shall be described in documentation available to the user. This
documentation should be offered at the initialization of the gestural interface.
Training users in gestures
a) Users should be able to explore gestures without adversely affecting system content.
b) Users should be able to explore the possible trajectories or poses involved in a gesture.
NOTE A training mode is a useful tool to allow the user to explore the gestures available in a system.
6.2 Features of gestures
6.2.1 Mapping of gesture commands to functions
Designers of gestures should ensure that the mapping of the gesture command to a function is consistent
with user expectations.
NOTE 1 Multiple gestures may be assigned to the same function.
NOTE 2 The same gesture may have different purposes in different contexts within the application.
NOTE 3 A gesture might modify the application context for one or more following gestures.
EXAMPLE 1 A pointing gesture may mean “bring that object” or “go there” depending on context or modality.
EXAMPLE 2 A single stroke gesture could be used to change pages or to move an object, depending on the
application context.
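For illustration only, a minimal sketch (contexts, gesture commands and function names are hypothetical) of a mapping in which the same gesture command resolves to different functions in different application contexts and several gestures share one function:

    # (application context, gesture command) -> function name
    GESTURE_COMMAND_MAP = {
        ("reader", "swipe_left"): "next_page",
        ("reader", "swipe_right"): "previous_page",
        ("editor", "swipe_left"): "move_object_left",
        ("photo", "two_finger_rotate"): "rotate_image",
        ("photo", "double_tap"): "rotate_image",   # two gestures, one function
    }

    def function_for(context, gesture_command):
        # Resolve a recognized gesture command to a function for the current context.
        return GESTURE_COMMAND_MAP.get((context, gesture_command))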
6.2.2 Nested gestures
In pursuing a secondary task, specific gestures might be performed while a primary task is underway
(or halted).
NOTE This might lead to micro gestures if the palm is already grasping some object and only fingers can
be moved.
EXAMPLE 1 In the primary task, a user might be touching a virtual globe using a finger while performing eye
gestures to control zooming in a secondary task.
EXAMPLE 2 When controlling, for example, a steering wheel, the arm/hand combination might already be
creating some force on the grip while a finger on the same hand creates gestures for choosing options in a menu.
EXAMPLE 3 In accessibility, reading braille may be combined with gestures to end reading movements and
initiate spoken feedback. In this example, the hands may even change their role in pursuing the primary and
secondary tasks.
EXAMPLE 4 In sign language, hand signs are combined with facial expression in order to disambiguate the
hand signs.
EXAMPLE 5 A conductor might simultaneously use gestures to continually communicate tempo and
coordinate introduction or emphasis of particular instruments.
6.2.3 Feedback for stroke gestures
If feedback is generated for a gesture, the user should be made aware of strokes not recognized as part
of a gesture.
NOTE 1 The user might not need feedback from the application, such as from visualizing its trajectory, if
kinaesthetic feedback is sufficient.
NOTE 2 Typically up to 95 % of stroke gestures are recognized correctly in existing systems.
EXAMPLE A gesture might accompany other input methods such as sp
...







