Information technology — Computer graphics, image processing and environment data representation — Object/environmental representation for image-based rendering in virtual/mixed and augmented reality (VR/MAR)



General Information

Status: Published
Publication date: 2022-05-04
Current stage: 60.60 - International Standard published
Start date: 2022-05-05
Due date: 2022-01-17
Completion date: 2022-05-05

INTERNATIONAL STANDARD ISO/IEC 23488
First edition 2022-05

Information technology — Computer graphics, image processing and environment data representation — Object/environmental representation for image-based rendering in virtual/mixed and augmented reality (VR/MAR)

Technologies de l'information — Infographie, traitement d'images et représentation des données environnementales — Représentation d'objets/environnements pour l'habillage à partir d'images réelles dans la réalité virtuelle/mixte et augmentée (VR/MAR)

Reference number: ISO/IEC 23488:2022(E)
© ISO/IEC 2022

COPYRIGHT PROTECTED DOCUMENT
© ISO/IEC 2022
All rights reserved. Unless otherwise specified, or required in the context of its implementation, no part of this publication may
be reproduced or utilized otherwise in any form or by any means, electronic or mechanical, including photocopying, or posting on
the internet or an intranet, without prior written permission. Permission can be requested from either ISO at the address below
or ISO’s member body in the country of the requester.
ISO copyright office
CP 401 • Ch. de Blandonnet 8
CH-1214 Vernier, Geneva
Phone: +41 22 749 01 11
Email: copyright@iso.org
Website: www.iso.org
Published in Switzerland
Contents

Foreword
Introduction
1 Scope
2 Normative references
3 Terms and definitions and abbreviated terms
3.1 Terms and definitions
3.2 Abbreviated terms
4 Domain and concepts
4.1 General
4.2 Domain
4.3 Concepts
4.4 Basic components
4.4.1 General
4.4.2 Image set
4.4.3 3D model
4.4.4 3D model — Image set integration
4.4.5 XML based object model
5 Image-based representation usage example
5.1 General
5.2 Image-based rendering
5.3 Multi-object representation
6 Conformance
6.1 Objective
6.2 Minimum requirements
Annex A (informative) Working example of the proposed information model
Bibliography
Foreword
ISO (the International Organization for Standardization) and IEC (the International Electrotechnical
Commission) form the specialized system for worldwide standardization. National bodies that are
members of ISO or IEC participate in the development of International Standards through technical
committees established by the respective organization to deal with particular fields of technical
activity. ISO and IEC technical committees collaborate in fields of mutual interest. Other international
organizations, governmental and non-governmental, in liaison with ISO and IEC, also take part in the
work.
The procedures used to develop this document and those intended for its further maintenance
are described in the ISO/IEC Directives, Part 1. In particular, the different approval criteria
needed for the different types of document should be noted. This document was drafted in
accordance with the editorial rules of the ISO/IEC Directives, Part 2 (see www.iso.org/directives or
www.iec.ch/members_experts/refdocs).
Attention is drawn to the possibility that some of the elements of this document may be the subject
of patent rights. ISO and IEC shall not be held responsible for identifying any or all such patent
rights. Details of any patent rights identified during the development of the document will be in the
Introduction and/or on the ISO list of patent declarations received (see www.iso.org/patents) or the IEC
list of patent declarations received (see https://patents.iec.ch).
Any trade name used in this document is information given for the convenience of users and does not
constitute an endorsement.
For an explanation of the voluntary nature of standards, the meaning of ISO specific terms and
expressions related to conformity assessment, as well as information about ISO's adherence to
the World Trade Organization (WTO) principles in the Technical Barriers to Trade (TBT) see
www.iso.org/iso/foreword.html. In the IEC, see www.iec.ch/understanding-standards.
This document was prepared by Joint Technical Committee ISO/IEC JTC 1, Information technology,
Subcommittee SC 24, Computer graphics, image processing and environmental data representation.
Any feedback or questions on this document should be directed to the user’s national standards
body. A complete listing of these bodies can be found at www.iso.org/members.html and
www.iec.ch/national-committees.
Introduction
As virtual reality (VR) and augmented reality (AR) expand to applications in the entertainment and education industries, many methods of bringing real-world content into virtual space have been developed.
Because of this expansion, the technology of capturing and representing objects in real environments
is in high demand.
One of the proposed methods of capturing the real world is image-based representation. Image-based
representation is a technique that can be used in various applications that require 3D model rendering
at an arbitrary viewpoint, including virtual reality, augmented reality and video stabilization. Since
image-based representation is a predominant alternative to using 3D models in the growing VR/MAR
market, due to its realism, scalability, accuracy and efficiency, creating a standard for image-based
representation is required.
INTERNATIONAL STANDARD ISO/IEC 23488:2022(E)
Information technology — Computer graphics, image processing and environment data representation — Object/environmental representation for image-based rendering in virtual/mixed and augmented reality (VR/MAR)
1 Scope
This document specifies an image-based representation model that represents target objects/
environments using a set of images and optionally the underlying 3D model for accurate and efficient
objects/environments representation at an arbitrary viewpoint. It is applicable to a wide range of graphic, virtual reality and mixed reality applications that require a method of representing a scene with various objects and environments.
This document:
— defines terms for image-based representation and 3D reconstruction techniques;
— specifies the required elements for image-based representation;
— specifies a method of representing the real world in the virtual space based on image-based
representation;
— specifies how visible image patches can be integrated with the underlying 3D model for more
accurate and rich objects/environments representation from arbitrary viewpoints;
— specifies how the proposed model allows multi-object representation;
— provides an XML based specification of the proposed representation model and an actual
implementation example (see Annex A).
2 Normative references
There are no normative references in this document.
3 Terms and definitions and abbreviated terms
3.1 Terms and definitions
For the purposes of this document, the following terms and definitions apply.
ISO and IEC maintain terminology databases for use in standardization at the following addresses:
— ISO Online browsing platform: available at https://www.iso.org/obp
— IEC Electropedia: available at https://www.electropedia.org/
3.1.1
camera parameter
extrinsic/external and intrinsic/internal attribute value of a (virtual or real) camera that describes
the mathematical relationship between the 3D coordinates of a point in the world space and the 2D
coordinates of its projection onto the image space
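As a non-normative illustration of the relationship described above (the notation below is conventional and is not defined by this document), a simple pinhole camera relates a world point to its image projection through an intrinsic matrix K and extrinsic rotation R and translation t:

    s \begin{pmatrix} u \\ v \\ 1 \end{pmatrix} = K \,[\, R \mid t \,]\, \begin{pmatrix} X \\ Y \\ Z \\ 1 \end{pmatrix},
    \qquad
    K = \begin{pmatrix} f_x & \gamma & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{pmatrix}

where (X, Y, Z) are the world coordinates of the point, (u, v) its image coordinates, s a scale factor, f_x and f_y the focal lengths in pixels, (c_x, c_y) the principal point and γ the skew parameter.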
3.1.2
image patch
set of connected pixels of an image that has the same visibility information
Note 1 to entry: Image patch is synonymous with image.
3.1.3
tessellation
process of dividing a face into sub-faces using sub-sampled vertices to attain enhanced visibility
detection
3.1.4
viewpoint
rotation and position with respect to the reference coordinate system that determines visible parts of a
3D model in a representation system
Note 1 to entry: “Viewpoint” is different from “Camera (parameters)” in that it only includes the camera position and pose, whereas the “Camera” has more information, such as the internal camera parameters (3.1.1) including the focal length, field of view, near/far planes and skew parameters.
3.1.5
virtual image
image at an arbitrary viewpoint (3.1.4) that is generated by collecting visible photo information from
real images
3.1.6
world reference frame
coordinate reference frame used to express positions/orientations of the camera, 3D models and other
objects for a given scene
3.2 Abbreviated terms
IBR image-based rendering
XML extensible markup language
XSD XML schema definition
4 Domain and concepts
4.1 General
This clause describes the domain and key concepts. This includes the domain of image-based
representation of objects/environments and overall description of its details.
4.2 Domain
Geometric and optionally photometric information is essential when representing target objects/
environments as is done in ISO/IEC 19775-1. In this document, an alternative representation
method, image-based, is pursued in which the objects/environments are represented only using the
photogrammetric information (e.g. set of images and other related information) and optionally the
underlying 3D geometric information.
Therefore, the image-based representation of objects/environments first requires a set of images of the target objects/environments taken at various locations, together with their photogrammetric information, and optionally the 3D model represented using vertices and faces and the association between these two. Figure 1 shows the overall scene represented as a set of images (bottom), which are associated with the 3D model (top). Note that, with photogrammetric information alone, it is possible to render the target objects/environments from an arbitrary viewpoint by blending the sampled images captured around that viewpoint.
The association to the underlying 3D model, if it exists, can make the rendering output more accurate
by helping to resolve visibility issues and registration errors.
Figure 1 — Example of a 3D model and image set for target environment representation
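For illustration only, the elements just described (an image set with camera parameters, an optional 3D model, and the association between the two) could be organized along the lines of the following sketch; the normative specification is the XML-based model of 4.4.5 and Annex A, and all class and field names below are hypothetical:

    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class CameraParameters:
        rotation: List[List[float]]     # external: 3x3 rotation (world to camera)
        translation: List[float]        # external: translation vector
        focal_length: List[float]       # internal: (fx, fy) in pixels
        principal_point: List[float]    # internal: (cx, cy) in pixels
        resolution: List[int]           # internal: image width and height

    @dataclass
    class ImageEntry:
        uri: str                        # image file, e.g. PNG or JPEG
        camera: CameraParameters        # used to project 3D points into this image

    @dataclass
    class Face:
        vertex_indices: List[int]                            # indices into the vertex list
        visible_in: List[int] = field(default_factory=list)  # images where this face is visible

    @dataclass
    class ImageBasedRepresentation:
        images: List[ImageEntry]                      # required: the image set
        vertices: Optional[List[List[float]]] = None  # optional: 3D model vertices
        faces: Optional[List[Face]] = None            # optional: faces linked to visible images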
To be more specific, because there are multiple blind spots in a wide and large environment that a
single image cannot display, a combination of images is necessary to fully represent the whole target
objects/environments without any such blind spots. Thus, to accurately associate the 3D model with
the image set, each face of the 3D model is verified to be visible or not in each image of the image set.
Then, the target objects/environments can be represented at an arbitrary viewpoint using only the
visible image patches from different images.[6],[7] Specifically, the colour value of each face of the 3D
model is determined from the corresponding visible image patches. After that, the determined colour
values can be blended into a single virtual image to represent the target objects/environments at any
arbitrary viewpoint.
In this regard, this document concerns not only the basic image-based objects/environments representation approach that uses only the images (or photogrammetric information), but also the usage and association of multiple images with the underlying 3D model, which resolves the visibility of each face from the locations where the images are taken and thereby makes the rendering system more accurate and efficient. If the 3D model is used along with the image-based representation, the resulting virtual or mixed reality world is not only more photorealistic; with the 3D depth information, richer interaction also becomes possible.
4.3 Concepts
A key part of image-based representation is representing the photogrammetric information of the target objects/environments, which is the basis of the proposed information model. The next, optional, part is how to associate each face of the 3D model with the visible image patches of the image set. By associating each face with visible image patches, the photometric information of those patches can be accurately overlaid onto each face of the 3D model.
For this purpose, visibility of the 3D model’s vertex/face at each image, which can be acquired using a
visibility/occlusion detection algorithm like z-buffering,[4] is first computed ahead of time (and added
to the representation). Regarding visibility detection, the vertices/faces can be tessellated to improve
visibility detection results in the case that the 3D model consists of sparse vertices.
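A minimal, non-normative sketch of such a visibility precomputation is shown below, assuming one z-buffer pass per image; the helpers rasterize_depth(), face_centroid() and project_to_image() are hypothetical placeholders and are not defined by this document:

    def compute_visibility(vertices, faces, images, eps=1e-3):
        """For each face, record the indices of the images in which it is visible.

        A face is treated as visible in an image when the depth of its projected
        centroid agrees with that image's z-buffer, i.e. it is not occluded by a
        nearer surface.
        """
        visibility = {f: [] for f in range(len(faces))}
        for img_idx, image in enumerate(images):
            # Hypothetical z-buffer rasterization of the whole model into this view.
            zbuffer = rasterize_depth(vertices, faces, image.camera)
            for f_idx, face in enumerate(faces):
                centroid = face_centroid(vertices, face)            # hypothetical helper
                result = project_to_image(centroid, image.camera)   # hypothetical projection
                if result is None:       # outside the image or behind the camera
                    continue
                (u, v), depth = result   # u, v assumed to be integer pixel coordinates
                if abs(zbuffer[v][u] - depth) < eps:   # depth test: not occluded
                    visibility[f_idx].append(img_idx)
        return visibility

The resulting per-face lists correspond to the visibility information that, as described above, is added to the representation ahead of time.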
Then, with the computed visibility information, objects/environments can be represented by
connecting each face of the 3D model with the corresponding visible image patches. If a face has
multiple visible image patches that can be mapped as texture, the photometric information of the face
can be computed by, for example, selecting the best image patch among the candidates or by weighting
the photometric information of multiple visible image patches. For the former case, to select one source image for the given face, the distance to the position where the image was captured, the area of the image and the texture quality (e.g. how well focused or how blurred it is) of the image can be considered. For the latter case,
the aforementioned criteria can be used to derive relative weights in blending them into a virtual image
texture as seen from an arbitrary viewpoint (see Figure 2).
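The following non-normative sketch illustrates both strategies described above (selecting a single best source patch, or blending several with relative weights); the scoring heuristic and the patch attributes capture_position, projected_area, sharpness and mean_colour are assumptions for illustration only:

    import math

    def patch_score(patch, view_position):
        """Heuristic quality score for one candidate visible image patch.

        Illustratively combines the distance from the capture position, the
        projected area of the face in the source image and a sharpness term.
        """
        distance = math.dist(patch.capture_position, view_position)
        return patch.projected_area * patch.sharpness / (1.0 + distance)

    def face_colour(patches, view_position, blend=True):
        """Colour for a face from its visible patches.

        blend=False picks the single best-scoring patch; blend=True averages
        the candidate colours using the scores as relative weights.
        """
        scores = [patch_score(p, view_position) for p in patches]
        if not blend:
            return patches[scores.index(max(scores))].mean_colour
        total = sum(scores)
        return tuple(
            sum(s / total * p.mean_colour[c] for s, p in zip(scores, patches))
            for c in range(3)
        )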
4.4 Basic components
4.4.1 General
This subclause describes details of image-based representation. This includes explanation of the two
key elements of image-based representation (image set and underlying 3D model), and their integration
as in Figure 2.
4.4.2 Image set
4.4.2.1 General
An image set consists of images capturing the photometric information of target objects/environments
at different locations. The images can be taken by different entities using different hardware, allowing
the usage of virtual and real images. Also, to project the images onto a 3D model, each image shall be augmented with the external parameters (rotation and translation) and the camera’s internal parameters (focal
length, resolution, aspect ratio, etc.). Using the external and internal parameters, a vertex or face of the
3D model can be projected onto the image and visibility information can be acquired, all of which are
used to visualize a 3D model with multiple textures at an arbitrary viewpoint.
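A minimal, non-normative sketch of that projection step is given below, assuming a simple pinhole model with external rotation R, external translation t and internal parameters fx, fy, cx and cy; other camera models listed in 4.4.2.2 (fisheye, spherical, etc.) would use different projection equations:

    import numpy as np

    def project_vertex(X, R, t, fx, fy, cx, cy):
        """Project a 3D world point X onto pixel coordinates (u, v).

        X: (3,) world coordinates; R: 3x3 rotation; t: (3,) translation.
        Returns ((u, v), depth) in camera space, or None if the point lies
        behind the camera and therefore cannot appear in this image.
        """
        Xc = np.asarray(R, dtype=float) @ np.asarray(X, dtype=float) + np.asarray(t, dtype=float)
        if Xc[2] <= 0.0:                   # behind the image plane
            return None
        u = fx * Xc[0] / Xc[2] + cx        # perspective division plus internal parameters
        v = fy * Xc[1] / Xc[2] + cy
        return (u, v), Xc[2]

A vertex or face projected in this way can then be tested against the image bounds and a z-buffer (see 4.3) to obtain the visibility information mentioned above.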

Visibility changes according to the changing viewpoint, and the visibility of a candidate patch considered for the final image blend would be determined by a visibility/occlusion algorithm like z-buffering[4] and back-face culling.
Figure 2 — Abstraction of image-based representation
4.4.2.2 Image
An image contains the photometric information of the target objects/environments. The photometric
information can be depicted using any image format, including PNG, JPG, JPEG and TIFF. The photometric
information also includes a colour model, which can be represented as RGB, HSL, or HSV. To acquire an
image, there shall be no limits on the camera model, including pinhole, fisheye, cubemap, spherical and
cylindrical models, as long as a projection model between the image and the 3D model can be formed.
Further description of the projection model is given in 4.4.4.3.
4.4.2.3 Camera external parameters
Camera external
...
