ISO/IEC TR 20748-1:2016
Information technology for learning, education and training - Learning analytics interoperability - Part 1: Reference model
ISO/IEC TR 20748-1:2016 specifies a reference model that identifies the diverse IT system requirements of learning analytics interoperability. The reference model identifies relevant terminology, user requirements, workflow and a reference architecture for learning analytics.
Technologies pour l'éducation, la formation et l'apprentissage — Interopérabilité de l'analytique de l'apprentissage — Partie 1: Modèle de référence
General Information
Overview
ISO/IEC TR 20748-1:2016 defines a reference model for learning analytics interoperability within the domain of information technology for learning, education and training (LET). The Technical Report lays out common terminology, stakeholder user requirements, representative use cases and a workflow-derived reference architecture that describes how learning analytics components (data sources, collection, storage, processing, analysis, visualization and feedback) should interoperate. The aim is to help organizations design systems that share learner- and course-related data reliably, ethically and at scale.
Key Topics
- Reference model & terminology – standardized definitions for learning analytics concepts (e.g., dashboard, data flow, data source, learning outcome).
- User requirements & use cases – requirements gathered from learners, teachers and institutions (tracking progression, early-warning, personalized recommendations, quality assurance).
- Workflow & architecture components – clear decomposition into processes such as learning activity, data collection, storing/processing, analysing, visualization and feedback.
- Data flow & exchange – considerations for data formats, APIs (such as xAPI), storage and archival to enable interoperability.
- Analytics techniques – support for descriptive and predictive analytics, social network and discourse analytics, and dashboard-driven visualization.
- Operational constraints – attention to volume, velocity and variety of analytics data (big data implications), quality of service and scalable IT architectures.
- Ethics, privacy & accessibility – guidance on privacy, trust, control of learner data and accessibility preferences to ensure inclusive, responsible analytics.
Applications
ISO/IEC TR 20748-1 is practical for anyone designing or operating learning analytics solutions:
- EdTech vendors & LMS/VLE developers – to design interoperable analytics modules and dashboards.
- System architects & integrators – for specifying APIs, data pipelines and storage that meet analytics needs.
- Data engineers & analysts – to map data sources, formats and workflows for analytics pipelines.
- Institutions & administrators – to plan retention strategies, quality assurance and institutional reporting driven by interoperable analytics.
- Educators & instructional designers – to implement adaptive learning, early-warning systems and personalized learning pathways.
- Policy-makers & privacy officers – to align analytics practice with ethical and accessibility requirements.
Practical benefits include improved data portability across platforms, clearer requirements for analytics pipelines, faster deployment of dashboards and predictive services, and better protection of learner rights.
Related Standards
- ISO/IEC 20748 series (other parts addressing interoperability)
- ISO/IEC 2382, ISO/IEC 24751-1, ISO/IEC 17027, ISO 9241-11, ISO/TS 29585 (referenced terminology and usability sources)
Using ISO/IEC TR 20748-1 helps organizations create interoperable, scalable and ethically responsible learning analytics ecosystems.
Standards Content (Sample)
TECHNICAL REPORT
ISO/IEC TR 20748-1
First edition
2016-12-15
Information technology for learning,
education and training — Learning
analytics interoperability —
Part 1:
Reference model
Technologies pour l’éducation, la formation et l’apprentissage —
Interopérabilité de l’analytique de l’apprentissage —
Partie 1: Modèle de référence
Reference number: ISO/IEC TR 20748-1:2016(E)
© ISO/IEC 2016
© ISO/IEC 2016, Published in Switzerland
All rights reserved. Unless otherwise specified, no part of this publication may be reproduced or utilized otherwise in any form
or by any means, electronic or mechanical, including photocopying, or posting on the internet or an intranet, without prior
written permission. Permission can be requested from either ISO at the address below or ISO’s member body in the country of
the requester.
ISO copyright office
Ch. de Blandonnet 8 • CP 401
CH-1214 Vernier, Geneva, Switzerland
Tel. +41 22 749 01 11
Fax +41 22 749 09 47
copyright@iso.org
www.iso.org
Contents
Foreword
Introduction
1 Scope
2 Normative references
3 Terms and definitions
4 Abbreviated terms
5 Use cases and practices
5.1 General
5.2 Learning analytics
5.3 Assessment
5.4 Data flow and data exchange
5.5 Accessibility preferences
6 Reference model for learning analytics interoperability
6.1 General
6.2 Workflow for general data analytics
6.3 Reference architecture derived from workflow and use cases
6.3.1 General
6.3.2 Learning and teaching activity process
6.3.3 Data collection process
6.3.4 Data storing and processing process
6.3.5 Analysing process
6.3.6 Visualization process
6.3.7 Feedback process
Annex A (informative) Use cases and practices
Bibliography
Foreword
ISO (the International Organization for Standardization) and IEC (the International Electrotechnical
Commission) form the specialized system for worldwide standardization. National bodies that are
members of ISO or IEC participate in the development of International Standards through technical
committees established by the respective organization to deal with particular fields of technical
activity. ISO and IEC technical committees collaborate in fields of mutual interest. Other international
organizations, governmental and non-governmental, in liaison with ISO and IEC, also take part in the
work. In the field of information technology, ISO and IEC have established a joint technical committee,
ISO/IEC JTC 1.
The procedures used to develop this document and those intended for its further maintenance are
described in the ISO/IEC Directives, Part 1. In particular the different approval criteria needed for
the different types of document should be noted. This document was drafted in accordance with the
editorial rules of the ISO/IEC Directives, Part 2 (see www.iso.org/directives).
Attention is drawn to the possibility that some of the elements of this document may be the subject
of patent rights. ISO and IEC shall not be held responsible for identifying any or all such patent
rights. Details of any patent rights identified during the development of the document will be in the
Introduction and/or on the ISO list of patent declarations received (see www.iso.org/patents).
Any trade name used in this document is information given for the convenience of users and does not
constitute an endorsement.
For an explanation on the meaning of ISO specific terms and expressions related to conformity assessment,
as well as information about ISO’s adherence to the World Trade Organization (WTO) principles in the
Technical Barriers to Trade (TBT) see the following URL: www.iso.org/iso/foreword.html.
The committee responsible for this document is ISO/IEC JTC 1, Information technology, SC 36, Information
technology for learning, education and training.
A list of all parts in the ISO/IEC 20748 series, published under the general title Information technology for
learning, education and training — Learning analytics interoperability, can be found on the ISO website.
Introduction
The increasing amount of data being generated from learning environments provides new opportunities
to support learning, education and training (LET) in a number of new ways through learning analytics.
Learning analytics is a composite concept built around the use of diverse sub-technologies, workflows
and practices and applied to a wide range of different purposes. For instance, learning analytics is
being used to collect, explore and analyse diverse types and interrelationships of data, such as: learner
interaction data related to usage of digital resources; teaching and learning activity logs; learning
outcomes and structured data about programmes; curriculum and associated competencies.
Learning analytics is an emerging technology addressing a diverse group of stakeholders and covering
a wide range of applications. Learning analytics raises new interoperability challenges related to data
sharing; privacy, trust and control of data; quality of service, etc. Through use case collection in the ad-
hoc group on learning analytics interoperability, established under JTC1/SC36 in 2014, the following
issues were identified and captured as general requirements for learning analytics applications:
For the learner:
— tracking learning activities and progression;
— tracking emotion, motivation and learning-readiness;
— early detection of learner’s personal needs and preferences;
— improved feedback from analysing activities and assessments;
— early detection of learner non-performance (mobilizing remediation);
— personalized learning path and/or resources (recommendation).
For the teacher:
— tracking learners/group activities and progression;
— adaptive teacher response to observed learner’s needs and behaviour;
— early detection of learner disengagement (mobilizing relevant support actions);
— increasing the range of activities that can be used for assessing performance;
— visualization of learning outcomes and activities for individuals and groups;
— providing evidence to help teacher improve the design of the learning experience and resources.
For the institution:
— tracking class/group activities and results;
— quality assurance monitoring;
— providing evidence to support the design of the learning environment;
— providing evidence to support improved retention strategies;
— support for course planning.
In addition, learning analytics practice can build upon prior work in LET standardization and innovation
but there are several factors that require special attention. These factors include:
— requirements arising from the analytical process;
— data items required to drive operational LET systems are not always the same as desired for learning
analytics;
— volume, velocity and variety of the data collected for analytics indicate different IT architectures,
which imply different interoperability requirements;
— use of learner data for analytics introduces a range of ethical and other socio-cultural issues beyond
those which arise from exchanging data between operational systems.
Therefore, this document gives a conceptual description of the behaviour of components related to
learning analytics interoperability. In particular, this document specifies terms as well as proposes a
reference model for the learning analytics process and interoperability.
TECHNICAL REPORT ISO/IEC TR 20748-1:2016(E)
Information technology for learning, education and
training — Learning analytics interoperability —
Part 1:
Reference model
1 Scope
This document specifies a reference model that identifies the diverse IT system requirements of learning
analytics interoperability. The reference model identifies relevant terminology, user requirements,
workflow and a reference architecture for learning analytics.
2 Normative references
The following documents are referred to in the text in such a way that some or all of their content
constitutes requirements of this document. For dated references, only the edition cited applies. For
undated references, the latest edition of the referenced document (including any amendments) applies.
There are no normative references in this document.
3 Terms and definitions
For the purposes of this document, the following terms and definitions apply.
ISO and IEC maintain terminological databases for use in standardization at the following addresses:
— IEC Electropedia: available at http://www.electropedia.org/
— ISO Online browsing platform: available at http://www.iso.org/obp
3.1
accessibility
usability of a product, service, environment or facility by individuals with the widest range of
capabilities
Note 1 to entry: Although “accessibility” typically addresses users who have a disability, the concept is not limited to disability issues.
[SOURCE: ISO/IEC 24751-1:2008, 2.2]
3.2
assessment
means of measuring or evaluating learner understanding or competency
3.3
dashboard
user interface based on predetermined reports, indicators and data fields, upon which the end user can
apply filters and graphical display methods to answer predetermined business questions and which is
suited to regular use with minimal training
[SOURCE: ISO/TS 29585:2010, 3.3]
3.4
data analysis
systematic investigation of the data and their flow in a real or planned system
[SOURCE: ISO/IEC 2382:2015, 2122686]
3.5
data collection
process of bringing data together from one or more points for use in a computer
EXAMPLE To collect transactions generated at branch offices by a data network for use at a computer centre.
[SOURCE: ISO/IEC 2382:2015, 2122166]
3.6
data exchange
storing, accessing, transferring, and archiving of data
[SOURCE: ISO 10303-1:1994, 3.2.15]
3.7
data flow
movement of data through the active parts of a data processing system in the course of the performance
of specific work
[SOURCE: ISO/IEC 2382:2015, 2121825]
3.8
data format
arrangement of data in a file or stream
[SOURCE: ISO/IEEE 11073-10201:2004, 3.14]
3.9
data source
functional unit that provides data for transmission
[SOURCE: ISO/IEC 2382:2015, 2124348]
3.10
individual
human being, i.e. a natural person, who acts as a distinct indivisible entity or is considered as such
[SOURCE: ISO/IEC 24751-1:2008, 3.21]
3.11
learning analytics
measurement, collection, analysis and reporting of data about learners and their contexts, for purposes
of understanding and optimizing learning and the environments in which it occurs
3.12
learning platform
integrated set of (online) services that provide learner, teacher and/or others involved in learning,
education and training with information, tools and resources to support and enhance educational
delivery and management
3.13
learning outcome
what a person is expected to know, understand or be able to do at the end of a training programme,
course or module
[SOURCE: ISO/IEC 17027:2014, 2.57]
3.14
usability
extent to which a product can be used by specified users to achieve specified goals, with effectiveness,
efficiency and satisfaction, in a specified context of use
[SOURCE: ISO 9241-11:1998, 3.1]
3.15
workflow
depiction of the actual sequence of the operations or actions taken in a process
Note 1 to entry: A workflow reflects the successive decisions and activities in the performance of a process.
[SOURCE: ISO 18308:2011, 3.52]
4 Abbreviated terms
ADL advanced distributed learning
AFA access-for-all
API application programming interface
ICT information and communication technologies
LET learning, education and training
LMS learning management system
LOD linked and open data
PLE personal learning environment
VLE virtual learning environment
xAPI experience API
5 Use cases and practices
5.1 General
Use cases were collected from national bodies and liaison organizations of ISO/IEC JTC1/SC36. The use
cases illustrate key functionalities related to learning analytics by focusing on particular requirements
that stakeholders may have and then outlining how such requirements can be reflected in workflows
for learning analytics. A total of fifteen use cases were received in 2014.
Use cases considered four main areas:
— learning analytics;
— assessments;
— data flow and data exchange;
— accessibility preferences.
The summary of the use cases is presented in Clause 5. The complete list of use cases is available in
Annex A.
5.2 Learning analytics
A stakeholder has previous experience with analytics dashboards available in online learning platforms
(known as learning management systems (LMS) or virtual learning environments (VLE)). In general,
data logs were not in a format that non-technical users could interpret, but these are now rendered
(displayed) via a range of graphs, tables and other visualization forms, and custom reports designed
for learners, educators, administrators and data analysts. Learners may get basic analytics from
dashboards such as progress relative to the cohort average marks or engagement ratio.
Learning analytics are delivered with more advanced features, namely predictive analytics. Predictive
analytics focuses on the pattern of learners’ static data (e.g. demographics; past attainment) and
dynamic data (e.g. pattern of online logins; quantity of discussion posts). Once a student’s trajectory is
drawn (e.g. “at risk”; “high achiever”; “social learner”), timely interventions can be planned (e.g. offering
extra social and academic support; presenting more challenging tasks).
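The trajectory-drawing step described above can be illustrated with a small supervised model. The following is a minimal sketch, assuming hypothetical feature columns (prior attainment, logins per week, forum posts per week) and scikit-learn; it trains a logistic regression on invented past-cohort data and scores a current learner. It is an illustration of the pattern, not a method specified by this report.

```python
# Minimal sketch: predicting an "at risk" label from static and dynamic
# learner features. All data and feature choices are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Columns: prior attainment (static), logins/week and forum posts/week
# (dynamic) -- illustrative values only.
X = np.array([
    [72, 5.0, 3],
    [58, 1.5, 0],
    [65, 4.0, 2],
    [49, 0.5, 0],
    [80, 6.0, 5],
    [55, 2.0, 1],
])
y = np.array([0, 1, 0, 1, 0, 1])  # 1 = became "at risk" in past cohorts

model = LogisticRegression().fit(X, y)

# Score a current learner; a high probability could trigger a timely
# intervention such as offering extra academic support.
current = np.array([[60, 1.0, 0]])
p_at_risk = model.predict_proba(current)[0, 1]
print(f"P(at risk) = {p_at_risk:.2f}")
```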
Learning analytics are used to enhance the personalized learning environment (PLE). Based on
learning analytics output, the PLE can recommend learning pathways combined with learning content
or resources. This service model enables fine-grained feedback (e.g. which concepts have been grasped
and at what level), and adaptive presentation of content (e.g. not showing material that depends on the
mastery of concepts that the learner is yet to acquire).
Other types of learning analytics are social network analytics and discourse analytics. Social network
analysis makes visible the structures and dynamics of interpersonal networks to understand how
people develop and maintain these relations (in the classroom or learning community). Discourse
analytics requires the use of sophisticated technology to assess the quality of text in order to scaffold the higher-order thinking and writing skills that we seek to instil in learners.
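A minimal sketch of the social network analytics mentioned above, using the networkx library on hypothetical forum reply data: in-degree centrality surfaces learners whose posts attract replies, while peripheral learners may need support to join the community. The names and edge semantics are invented for illustration.

```python
# Minimal sketch of social network analytics on forum interactions.
import networkx as nx

# Directed edge (a, b) means learner a replied to a post by learner b.
replies = [("ana", "ben"), ("carl", "ben"), ("ana", "carl"),
           ("dee", "ana"), ("ben", "ana"), ("carl", "ana")]

G = nx.DiGraph(replies)

# In-degree centrality highlights learners whose contributions attract
# replies; low scores may indicate peripheral members of the community.
centrality = nx.in_degree_centrality(G)
for learner, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(f"{learner}: {score:.2f}")
```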
5.3 Assessment
One of the advantages of using ICT in assessment is to improve precision in evaluating individual
learning in order to provide input to (adaptive) learning systems. Learning analytics are useful for
monitoring how students are going about learning and solving problems. This can be achieved by
embedding learning assessments within the learning experience and analysing process data in log files
that capture every click and keystroke. It is important to note that embedded assessments do not need
to be hidden assessments. Feedback and recommendations from the analytics platform can be highly
motivating, showing learners where they should focus their attention and learning efforts along with
highlighting their accomplishments.
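As a sketch of how embedded-assessment process data in log files can be analysed, the snippet below derives time-on-task from a hypothetical event stream; the event schema and the interpretation of durations are assumptions, not part of this report.

```python
# Minimal sketch: deriving time-on-task from embedded-assessment logs.
from datetime import datetime

events = [  # (learner, action, ISO timestamp) -- hypothetical schema
    ("ana", "item_opened",    "2016-05-01T09:00:00"),
    ("ana", "answer_changed", "2016-05-01T09:03:10"),
    ("ana", "item_submitted", "2016-05-01T09:04:30"),
]

opened = {}
for learner, action, ts in events:
    t = datetime.fromisoformat(ts)
    if action == "item_opened":
        opened[learner] = t
    elif action == "item_submitted" and learner in opened:
        seconds = (t - opened.pop(learner)).total_seconds()
        # Very short or very long durations may signal guessing or
        # disengagement, and can feed formative feedback.
        print(f"{learner} spent {seconds:.0f}s on the item")
```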
5.4 Data flow and data exchange
Increasingly, many institutions are requiring interoperable data formats and exchange mechanisms
that simplify the process of collecting and delivering learning data to and from digital learning
environments. This is being driven by the proliferation of heterogeneous data generated
from learning systems and applications. The Experience API (xAPI, see https://www.adlnet.gov/adl-research/performance-tracking-analysis/experience-api) and Caliper Analytics (Caliper, see http://www.imsglobal.org/activity/caliperram) are identified as potential standards applicable to
stakeholders. The implication from xAPI and Caliper in terms of interoperability standards is that it
is necessary to standardize profiles for presenting learning data as well as APIs implementing data
capture.
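As an illustration of such data capture, the sketch below sends a single xAPI-style statement (actor, verb, object) to a learning record store. The endpoint URL and the omission of credentials are placeholders; consult the xAPI specification for the authoritative statement format.

```python
# Minimal sketch of posting an xAPI statement to a learning record
# store (LRS). The LRS URL is a placeholder; authentication omitted.
import json
import urllib.request

statement = {
    "actor": {"mbox": "mailto:learner@example.org", "name": "A Learner"},
    "verb": {"id": "http://adlnet.gov/expapi/verbs/completed",
             "display": {"en-US": "completed"}},
    "object": {"id": "http://example.org/courses/algebra-1/quiz-3"},
}

req = urllib.request.Request(
    "https://lrs.example.org/xapi/statements",  # placeholder LRS URL
    data=json.dumps(statement).encode("utf-8"),
    headers={"Content-Type": "application/json",
             "X-Experience-API-Version": "1.0.3"},
    method="POST",
)
# urllib.request.urlopen(req)  # uncomment to send against a real LRS
```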
One of the most important issues in learning analytics is control by the individual of his or her personal information (e.g., as a learner), including options such as “do not track” or “data chrono-degradability”. One of the cases describes an approach to giving the learner (or his/her parents) control
over the data of that individual as a learner in a school setting. The use case follows the learner from
registering at a school, to moving to another school, with interactions with the school (and through the
school with suppliers of services, e.g., publishers). Other important issues with data control are privacy
and identification of people through identity federation. Most use cases raise similar privacy issues, which implies that privacy requirements and related technology should be a fundamental component of any learning analytics system. Privacy measures such as anonymization and pseudonymization should be reflected in learning analytics.
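One way to read “data chrono-degradability” is that collected records expire after a retention window. The sketch below illustrates that reading with a hypothetical record shape and window; neither is prescribed by this report.

```python
# Minimal sketch of "data chrono-degradability": records older than a
# retention window are purged. Window and record shape are hypothetical.
from datetime import datetime, timedelta

RETENTION = timedelta(days=365)

def purge_expired(records, now=None):
    """Keep only records younger than the retention window."""
    now = now or datetime.now()
    return [r for r in records if now - r["collected_at"] < RETENTION]

records = [
    {"learner": "p-1", "event": "login", "collected_at": datetime(2015, 1, 10)},
    {"learner": "p-2", "event": "login", "collected_at": datetime(2016, 11, 1)},
]
print(purge_expired(records, now=datetime(2016, 12, 15)))  # keeps only p-2
```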
Learner activity data may be generated from a wide variety of platforms, including but not limited to
web-based applications, desktop computers, mobile devices, wearable technologies and the internet
of things. These tracking data may be used in portfolio services. As described in 5.2, diverse types of learning data can be reflected in each learner’s learning activity and progress. The portfolio service is not limited to curating and showcasing the learner’s output; it also supports diagnosis of strengths or weaknesses in learning contexts. Many portfolio services focus on the display of learner content and self-reflection by learners. However, improved portfolio services, based on learning analytics, will show multidimensional perspectives of learner performance and activity data.
5.5 Accessibility preferences
Dashboards provide a general way to present analytics information. This category of use cases
describes how a dashboard should be presented flexibly and filtered by purposes. One of the use cases
introduces scenarios for users (e.g. learner, teacher and module manager) with accessibility needs and
preferences. Learning analytics enables teachers or administrators to deliver effective learning with
accessibility needs being met, supported by data generated from the learning analytics. An example is
supporting the needs of a learner with a vision-impairment through resources that have proven to be
highly effective with learners with similar needs.
Another scenario related to accessibility is focused on early detection via learning analytics, supporting
diagnostic testing for impairments, such as visual or hearing impairments, auditory processing disorder
(APD), dyscalculia or dyspraxia, and providing remediation or support. Accessibility preferences may be
stored in the cloud to deliver seamless service across diverse devices.
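A minimal sketch of cloud-stored accessibility preferences applied across devices follows. The preference keys are hypothetical, loosely inspired by access-for-all (AFA) style descriptions rather than taken from any specific profile.

```python
# Minimal sketch: overlay cloud-stored accessibility preferences on a
# device's defaults so the learner gets a consistent experience.
PREFERENCES = {  # stands in for a cloud preference store
    "learner-7": {"font_scale": 1.5, "high_contrast": True, "captions": True},
}

def render_settings(learner_id, device_defaults):
    """Stored preferences win over whatever the device defaults are."""
    return {**device_defaults, **PREFERENCES.get(learner_id, {})}

print(render_settings("learner-7", {"font_scale": 1.0, "high_contrast": False}))
```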
6 Reference model for learning analytics interoperability
6.1 General
In Clause 6, a preliminary reference model for learning analytics is introduced by detailing the set of
processes and relationships between them that is formulated from the collected use cases provided in
Annex A.
A workflow of general data analytics is presented in 6.2. The general data analytics workflow is
extended and transformed into a loop by adding teaching and learning activities to the workflow as
noted in 6.3. Additional details regarding the key elements of the reference architecture are provided
in the sub-clauses of 6.3.
6.2 Workflow for general data analytics
The goal of learning analytics is to understand and improve learning and its environment and
encompasses the tasks of measurement, collection, analysis and reporting of data about learners and
the learning, education and training (LET) contexts in which learning occurs. These tasks closely
match the workflow of data analytics as shown in Figure 1. Such correspondence is not coincidental but
suggests that learning analytics can take advantage of the technological advancement of data analytics
in building a learning analytics framework.
Figure 1 — Workflows for general data analytics
6.3 Reference architecture derived from workflow and use cases
6.3.1 General
There are a total of six processes in the learning analytics workflow that are supported by privacy
and data protection requirements, as noted in Figure 2. Although learning analytics is primarily based
on data collection and analysis, learning and teaching activities within LET contexts are fundamental
to the whole process and need to be considered in order for a feedback loop to be enabled. Learning
and teaching activities provide sources for data collection and subsequent processes of the learning
analytics workflow.
Figure 2 — Abstract workflow of learning analytics
The six processes that comprise the learning analytics workflow (Figure 2) are:
— Learning and teaching activity: modelling the data sources of learning activities in order to decide which learning activity data could be used for analytics, and releasing learning activity data for data collection.
— Data collection: gathering and measuring information on variables of interest in the learning and teaching activities.
— Data processing and storing: preparing and storing data from diverse and heterogeneous data
sources for interoperable data analysis by utilizing the standardized data model and representation.
— Analysing: systematic investigation of learning data by inspecting and modelling the learning data
with the goal of producing descriptive and possibly predictive knowledge.
— Visualization: creating representations of abstract data, including text and schematic representations such as social diagrams and maps, to allow stakeholders to see, explore, interact with and understand large amounts of information when analysing and reasoning about data and evidence.
— Feedback and recommendation: serving the results of a cycle of learning analysis back to the
learners and their contexts so that corrective actions can be taken.
As illustrated in Figure 3, input data items can be obtained from a variety of learning and teaching
activities. As well, the outputs from the learning analytics workflow can provide feedback and
recommendations to inform improvements to learning and teaching activities.
Figure 3 — Reference workflow of learning analytics
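To make the six-process loop concrete, here is a minimal sketch of the workflow as composable Python functions. Every function, field name and threshold below is a hypothetical stand-in chosen for illustration, not an interface defined by this document.

```python
# Minimal sketch of the six-process loop as composable stages.
def collect(activity_events):        # data collection
    return [e for e in activity_events if e.get("consented")]

def store_and_process(events):       # data storing and processing
    return [{"learner": e["learner"], "verb": e["verb"]} for e in events]

def analyse(records):                # analysing
    counts = {}
    for r in records:
        counts[r["learner"]] = counts.get(r["learner"], 0) + 1
    return counts

def visualize(results):              # visualization
    for learner, n in results.items():
        print(f"{learner}: {'#' * n} ({n} events)")

def feedback(results):               # feedback and recommendation
    return [l for l, n in results.items() if n < 2]  # nudge low activity

events = [  # output of learning and teaching activity (hypothetical)
    {"learner": "ana", "verb": "posted", "consented": True},
    {"learner": "ben", "verb": "viewed", "consented": True},
    {"learner": "ana", "verb": "viewed", "consented": True},
    {"learner": "eve", "verb": "viewed", "consented": False},  # excluded
]
results = analyse(store_and_process(collect(events)))
visualize(results)
print("nudge:", feedback(results))  # feeds back into the next activity cycle
```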
For each of the six learning analytics workflow processes, the subclauses in 6.3 provide more detailed
information as to how requirements identified from the use cases can be met and implemented. This
includes the use of mandatory actions, general considerations and optional actions. These specific processes are not to be considered fully indicative or prescriptive, as new stakeholders’ needs may result in the adjustment and addition of actions.
6.3.2 Learning and teaching activity process
Learning and teaching activity within LET contexts is the starting point for learning analytics, and
learning activities are the source of data collection process. In general, learning activity is performed
within heterogeneous environments, using a mixture of tools. The learning and teaching activity
process regulates both data release and data modelling or profiling so as to generate learning activity data that can be used for analytics. Possible flows of data between the learning and teaching activity and data collection processes involve the following aspects:
— Data modelling (see Figure 4) is guided by pedagogical questions outlining what aspect of learning
should be supported by learning analytics, such as learning outcome, learning progress and attitude,
student retention and development of specific cognitive skills.
— When the required data sources are identified, issues related to the release of the data are addressed. These issues may include consent from the data subject, e.g., the learner, conditions for release given by data protection and privacy laws, etc., that is, issues described in privacy and data protection requirements (Figure 2).
Figure 4 — Zoom-in diagram for learning and teaching activity
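As an illustration of this process, the sketch below models a releasable learning-activity record whose release for collection is gated on consent. The field names are hypothetical and chosen only to mirror the two aspects above (data modelling and conditioned release).

```python
# Minimal sketch: a modelled activity record whose release is gated
# on consent from the data subject. All field names are hypothetical.
from dataclasses import dataclass

@dataclass
class ActivityRecord:
    learner_id: str
    activity: str          # e.g. "quiz-3 attempt"
    outcome: float         # e.g. score in [0, 1]
    consent_given: bool    # release condition from the data subject

def release_for_collection(records):
    """Only records whose subject consented leave this process."""
    return [r for r in records if r.consent_given]

batch = [ActivityRecord("p-1", "quiz-3 attempt", 0.8, True),
         ActivityRecord("p-2", "quiz-3 attempt", 0.4, False)]
print(release_for_collection(batch))  # p-2 is withheld
```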
6.3.3 Data collection process
Data collection is the process of gathering and measuring information on variables of interest in
learning and teaching activities as shown in Figure 5. In this process some features, such as authority
and control of data source, interoperability of data, and efficiency of flow and exchange, are required
for a system to work. Possible flows of data between the data collection and data storing and processing processes involve the following aspects:
— Learning and teaching activities and related data sources such as learning devices, software
applications, and social networks produce various data. The sources include lectures, learning
materials, learning tools, quizzes and assessments, discussion forums, messages, social networks,
homework, prior credit, achievements, system logs, sensors, etc. These learning data need to be
collected or converted to data API specifications such as Experience API (xAPI), and IMS Caliper
Analytics (Caliper).
— The data collected from LET activities, which contain information about learners, are subject to privacy protection requirements. Here it is important that those collecting such information about learners do so with the informed consent of the learner (or his or her parent or legal guardian). Further, the personal data shall only be used for the goal agreed to and must be protected by necessary means such as encryption, de-identification, chrono-degradability, pseudonymization and anonymization (a pseudonymization sketch follows Figure 5).
— Data collection APIs yield data collection instances, possibly via secured data transmission.
— Data collection processes may be subject to conformance testing or conversion prior to storing data
in the temporary data store, such as the event store in IMS Caliper or learning record store in xAPI,
to be used in later processing.
Figure 5 — Zoom-in diagram for data collection
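One common way to realize the pseudonymization mentioned above is a keyed hash applied at collection time, so raw learner identifiers never reach the temporary data store. A minimal sketch, with a hypothetical secret key and record shape:

```python
# Minimal sketch of pseudonymization at collection time using a keyed
# hash (HMAC-SHA256). Key management is out of scope of this sketch.
import hashlib
import hmac

SECRET_KEY = b"replace-with-a-managed-secret"  # placeholder

def pseudonymize(learner_id: str) -> str:
    """Stable pseudonym: the same input always maps to the same token."""
    return hmac.new(SECRET_KEY, learner_id.encode(), hashlib.sha256).hexdigest()[:16]

event = {"learner": "jane.doe@example.org", "verb": "completed"}
event["learner"] = pseudonymize(event["learner"])
print(event)  # identifier replaced before reaching the data store
```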
6.3.4 Data storing and processing process
Data storing and processing is the process of preparing and storing data from diverse and
heterogeneous data sources for interoperable data analysis by utilizing the standardized data model
and representation, as shown in Figure 6. Possible flows of data between the data storing and processing and analysing processes involve the following aspects:
— The learning data stored in temporary data store are processed by the data translator and filter.
The processed results are stored into analytics data store.
— The data translator and filter process may have a unified data translator that translates various
data in heterogeneous representations into a uniform representation, such as linked and open data
(LOD), by applying explicit translation rules, for an efficient and interoperable analysis process.
— A general-purpose data filter may be applied to the translation process driven by the filtering
conditions to clean and transform the data.
— One of the main sources of data includes discourse, writing, conversation, and communicative
events. Such data may need to be processed by natural language processing before the results are
in turn translated into a uniform representation.
— The data stored in temporary data store may be accessed via a standardized data query interface,
and the processed data may be stored to analytics data store via a standardized data migration.
Figure 6 — Zoom-in diagram for data storing and processing
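To illustrate the data translator and filter of 6.3.4, the sketch below maps two hypothetical source formats into one uniform representation by explicit translation rules, then applies a filter condition. Every format and field name here is invented for illustration.

```python
# Minimal sketch of a unified data translator plus a general-purpose
# filter: heterogeneous records in, one uniform representation out.
def from_xapi(stmt):
    return {"learner": stmt["actor"]["mbox"],
            "verb": stmt["verb"]["id"].rsplit("/", 1)[-1],
            "object": stmt["object"]["id"]}

def from_legacy_log(row):                     # "user|action|resource"
    user, action, resource = row.split("|")
    return {"learner": user, "verb": action, "object": resource}

TRANSLATORS = {"xapi": from_xapi, "legacy": from_legacy_log}

def translate_and_filter(source_type, raw_records, keep=lambda r: True):
    uniform = [TRANSLATORS[source_type](r) for r in raw_records]
    return [r for r in uniform if keep(r)]  # filter condition cleans data

rows = ["p-1|viewed|lesson-2", "p-2|posted|forum-9"]
print(translate_and_filter("legacy", rows, keep=lambda r: r["verb"] != "viewed"))
```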
6.3.5 Analysing process
Analysing is the process of systematic investigation of learning data by inspecting and modelling the
learning data with the goal of producing descriptive and possibly predictive knowledge as shown in
Figure 7. Possible flows of data between the analysing and visualization processes involve the following aspects:
— As well as the micro data stored in the analytics data store, general domain data such as curricula,
learning resources, and preferences may be stored in constant information to be utilized by the data
analysis.
— Privacy concerns exist wherever information about learners (or other sensitive information) is collected and stored. Learning data analysis is not an exception.
— Various external analysis algorithms, such as predictive analytics, adaptive analytics, learning disability detection, discourse analytics, and other assessments using ICT, are applied via the analysis interface.
— Analysis processing may consist of statistical analysis, topic analysis, network analysis, and social
analysis as the low-level front-end analysis. The results of low-level analysis then may feed into
pattern learning, dynamic modelling, and association analysis before they are used by dashboard
integration, content recommendation, and learning path recommendation.
— The analysis results may be refined by the data manipulation interface and then stored into the
analytics data store for further analysis cycles or later processing steps such as the visualization
process.
Figure 7 — Zoom-in diagram for analysing
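The sketch below illustrates how a low-level statistical analysis can feed a higher-level step, as described above: per-learner activity counts are standardized and strong negative outliers are flagged for the recommendation stage. Data and threshold are hypothetical.

```python
# Minimal sketch: low-level statistical analysis feeding a higher-level
# step. A z-score below a chosen threshold flags a learner for the
# feedback/recommendation stage.
import statistics

activity = {"ana": 42, "ben": 38, "carl": 5, "dee": 40, "eve": 36}

mean = statistics.mean(activity.values())
sd = statistics.stdev(activity.values())

z_scores = {learner: (n - mean) / sd for learner, n in activity.items()}
at_risk = [l for l, z in z_scores.items() if z < -1.5]
print("flag for intervention:", at_risk)  # feeds feedback/recommendation
```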
6.3.6 Visualization process
As mentioned in 6.3.1, visualization is the process of creating representations of abstract data, including text and geographic information, to allow users to see, explore, interact with and understand large amounts of information when analysing and reasoning about data and evidence, as shown in Figure 8. A primary goal of visualization is to communicate information clearly and efficiently to users via the statistical graphics, plots, information graphics, tables and charts selected, thus making complex data more accessible, understandable and usable. Possible flows of data between the visualization and feedback processes involve the following aspects:
— The data in the analytics data store may be accessed by the visualization process via the data query interface.
— Visual representations for learning analytics may include dashboard information, ePortfolios, social diagrams, learning paths and resources.
— The dashboard information may show comparisons or progress, recommendations, real-time assessments, topic-based assessments, social-network graphs, etc.
— The data interface may provide an open data interface to external dashboard and reporting systems to deliver feedback information such as personalization, intervention or prediction for individual users (a sketch of such a payload follows Figure 8).
Figure 8 — Zoom-in diagram for visualization
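As an illustration of the open data interface between visualization and an external dashboard, the sketch below serializes hypothetical analysis results into a chart-ready JSON payload. The payload schema is an assumption, not something defined by this document.

```python
# Minimal sketch: serializing analysis results into a chart-ready JSON
# payload for an external dashboard and reporting system.
import json

analysis = {"ana": 0.82, "ben": 0.74, "carl": 0.31}   # e.g. progress scores
cohort_mean = sum(analysis.values()) / len(analysis)

payload = {
    "chart": "bar",
    "title": "Progress vs. cohort average",
    "series": [{"label": l, "value": v} for l, v in analysis.items()],
    "reference_line": round(cohort_mean, 2),  # cohort comparison line
}
print(json.dumps(payload, indent=2))  # consumed by an external dashboard
```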
6.3.7 Feedback process
Feedback actions (see Figure 9) serve the results of a cycle of learning analysis back to the learners
and their contexts so that corrective actions can be taken. The feedback process of learning analytics is concerned with mediation of the knowledge gleaned from the data. Learners are not the only ones on the receiving end; course designers, developers of learning materials, teachers, administrators, etc. are also party to
learning analytics. Possible flows of data from learning and teaching activity to the feedback process
involve the following aspects:
— For feedback actions to learners or other stakeholders, the analysed data set needs to flow from the reporting system via the open data interface.
— Statistical information can be used as a feedback action to control the pace of learning, change attitudes or patterns, compare with a peer group, or let learners recognize their own position within social diagrams.
— Feedback actions based on prediction may be applied for learners, teachers and other stakeholders. For instance, learner trajectories built from captured learning data can be used to identify at-risk learners, high achievers, or social learners (a sketch follows Figure 9).
— Feedback actions enable adaptive learning environments, which may provide non-linear pathways
pertaining to personalized digital resources, content or self-assessment.
12 © ISO/IEC 2016 – All rights reserved
Figure 9 — Zoom-in diagram for feedback
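A minimal sketch of prediction-based feedback actions: a learner's trajectory label is mapped to an action for the relevant stakeholder. The labels echo the examples above; the actions and recipients are illustrative, not prescribed by this report.

```python
# Minimal sketch: mapping a predicted trajectory label to a feedback
# action for the relevant stakeholder. Mappings are illustrative.
ACTIONS = {
    "at risk":        ("learner", "offer extra social and academic support"),
    "high achiever":  ("learner", "present more challenging tasks"),
    "social learner": ("teacher", "assign a peer-mentoring role"),
}

def feedback_action(trajectory: str):
    recipient, action = ACTIONS.get(trajectory, ("teacher", "review manually"))
    return {"recipient": recipient, "action": action}

print(feedback_action("at risk"))
```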
Annex A
(informative)
Use cases and practices
A.1 Learning analytics
Use cases in A.1 illustrate key functionalities related to learning analytics by focusing on particular
requirements that stakeholders may have and then outlining how such requirements can be reflected
in workflows for learning analytics.
— Use Case I-01. Analytics dashboards on LMS/VLE
Contributor (name): Yong-Sang Cho (zzosang@keris.or.kr) and Jing DU (dujing@tsinghua.edu.cn)
Source (name or url):
1. UNESCO Policy Brief: http://iite.unesco.org/publications/3214711/
2. KERIS report (written in Korean): http://goo.gl/CgPLGu
3. Learning Analysis Dashboard in MOOC and SPOC: http://www.xuetangx.com
Main stakeholders: learner and teacher
Description: Analytics dashboards are found in most online learning platforms, known as learning management systems (LMS) or virtual learning environments (VLE). In general, data logs were not in a format that non-technical users could interpret, but these are now rendered (displayed) via a range of graphs, tables and other visualization forms, and custom reports designed for learners, educators, administrators and data analysts. Currently, learning platforms record data logs related to the engagement of learners, such as login time and count of logins, number of participations in discussion forums, and whether assignments are completed or not. More advanced analytics platforms will integrate data from other learning platforms, software and content (see use case III-01).
Learners may get basic analytics from dashboards, such as progress relative to the cohort average marks and their level of engagement in learning activities designed by the teacher. Some institutions are going further and add information using visualization products to assist interpretation of complex data extracted from student information systems and school information systems.
LMS/VLE vendors provide examples and webinars about their analytics dashboards, and enterprise analytics vendors are contextualizing their products to the education market. A very useful compendium of higher-education case studies is being compiled by EDUCAUSE; e.g. Arizona State University reports that it is seeing returns on its investment in academic and learning analytics, including a “Student 360” program that integrates all that the institution knows about a student.
Notes and/or issues: Even if the LMS/VLE uses only its own log data (without third-party logs) related to student activity to display dashboards, that use should be approved or agreed by the student, and the data should not be used for purposes outside the service license agreement (SLA) between the vendor/institution and the student.
There are two different ways (opt-in and opt-out) to gather private information generated through learning activity, such as messages, content on forums, log-in time, area, device and count, etc.
— Use Case I-02. Predictive analytics using trajectory data
Contributor (name): Yong-Sang Cho (zzosang@keris.or.kr)
Source (name or url):
1. UNESCO Policy Brief: http://iite.unesco.org/publications/3214711/
2. KERIS report (written in Korean): http://goo.gl/CgPLGu
Main stakeholders: learner, teacher and parent
Description: Predictive analytics focuses on the pattern of learners’ static data (e.g. demographics; past attainment) and dynamic data (e.g. pattern of online logins; quantity of discussion posts). Once a student’s trajectory is drawn (e.g. “at risk”; “high achiever”; “social learner”), timely interventions can be planned (e.g. offering extra social and academic support; presenting more challenging tasks). Currently, one of the most reliable predictors of final exam results is still exam performance at the start of studies. The design of more complex data-driven predictive models must clearly improve on this, but requires statistical analysis to identify those variables in the data that can be historically validated as the strongest predictors of ‘success’. While at present these are most commonly defined as assignment/exam outcomes, the debate about assessment regimes (see below) draws attention to the role that analytics could play in providing formative feedback and building horizontal/transferable skills.
Work at Purdue University on the Course Signals software is well known, and the technology is available as a product. Signals provides a red/amber/green light to students on their progress. Their most recent evaluation reports: “Results thus far show that students who have engaged with Course Signals have higher average grades and seek out help resources at a higher rate than other students.” The University of Michigan reports promising results with physics students from their E2Coach infrastructure, which adapts personalized (open source) intervention technology from validated health informatics research to give customized feedback and motivate students to change their strategies. Paul Smith’s College used Starfish EarlyAlert to integrate staff feedback on students, and Rapid Insight tools to build an accurate predictive model for identifying at-risk students.
Models may be context-specific to the particular institution, culture, level of study, discipline, etc., or (most excitingly) may prove robust enough for general use. The Predictive Analytics Reporting (PAR) Framework, developed and piloted with six US educational institutions, seeks to identify patterns in their collective student data. Initial results report a significant correlation between disenrollment and the number of concurrent courses in which students were enrolled. These approaches are designed for generic learning environments, agnostic to subject matter, but if one constrains the scope to a specific topic, new kinds of analytics are possible.
Notes and/or issues: The same issues described in use case I-01 apply.
In particular, prior to collecting trajectory data, the institution and vendor should give notice, via an SLA, of what kind of data they will gather and how they will push alerts to the student. In this case, opt-in seems appropriate for the service.
— Use Case I-03. Personalized learning environments with digital resources
Contributor (name): Yong-Sang Cho (zzosang@keris.or.kr)
Source (name or url): 1. UNESCO Policy Brief: http://iit
...
Frequently Asked Questions
ISO/IEC TR 20748-1:2016 is a Technical Report developed by ISO/IEC JTC 1/SC 36 and published jointly by ISO and IEC. Its full title is "Information technology for learning, education and training - Learning analytics interoperability - Part 1: Reference model". The document specifies a reference model that identifies the diverse IT system requirements of learning analytics interoperability, covering relevant terminology, user requirements, workflow and a reference architecture for learning analytics.
ISO/IEC TR 20748-1:2016 is classified under the following ICS (International Classification for Standards) categories: 03.100.30 - Management of human resources; 03.180 - Education; 35.240.90 - IT applications in education; 35.240.99 - IT applications in other fields. The ICS classification helps identify the subject area and facilitates finding related standards.
You can purchase ISO/IEC TR 20748-1:2016 directly from iTeh Standards. The document is available in PDF format and is delivered instantly after payment. Add the standard to your cart and complete the secure checkout process. iTeh Standards is an authorized distributor of ISO standards.