Six papers were presented in the workshop. While this is by no means enough material for a systematic qualitative review of current research in this area, it does represent a significant collection of perspectives that helped facilitate discussion. Moreover, the papers provided sufficient material for participants to challenge, while also stimulating them to explore the questions arising. Contrasted with the results of a recent review by Papamitsiou and Economides (2014) of 40 papers (selected from 208 identified by search terms), the ICCE workshop papers identify and raise questions that contribute to the building of an “issues space” upon which further research can proceed.

When classifying research objectives, Papamitsiou and Economides found that the majority of studies in their review investigated issues related to student behaviour modelling and prediction of performance, followed by studies aiming to increase students’ and teachers’ reflection and awareness of the need for improvement through the provision of feedback and assessment services. As the following summary of the workshop papers shows, the initial ICCE workshop issues space appears more restricted, although diverse enough to indicate that the ICCE community is in an early phase of making sense of this new field of research. Key issues and questions arising are listed against each paper; however, as the questions were documented during the presentations, questions that had already been documented were typically not repeated. In the context of this investigation, the following summaries are not intended as a critique of the papers.
Preliminary requirements analysis towards an integrated learning analytics system
In the first paper presented, Choi et al. (2014) establish the importance of conducting a requirements analysis in order to construct a unified framework for developing an open and extensible learning analytics system. Such an approach is consistent with activity within the domain of international standardisation and in accordance with the theme of the workshop, which highlights the role of systems interoperability. The authors report on a reference software architecture they have developed that allows them to identify the structure and workflow of learning analytics systems. As participants in a Korean Ministry of Education initiative, the authors make explicit their aim of contributing to the development of international standards supporting worldwide interoperability of learning technology. The paper presents some results of a preliminary requirements analysis towards such an open and interoperable learning analytics system.
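The paper describes this architecture in prose rather than code. Purely as an illustration, the following Python sketch shows one way the high-level structure and workflow of such a system (collection, storage, analysis) might be expressed behind explicit interfaces; all class and method names here are hypothetical and are not drawn from Choi et al. (2014).

```python
# Hypothetical sketch of a layered LA reference architecture:
# data collection -> storage -> analysis. Names are illustrative only.
from abc import ABC, abstractmethod
from typing import Any, Dict, List


class Collector(ABC):
    """Gathers raw activity events from a learning environment."""

    @abstractmethod
    def collect(self) -> List[Dict[str, Any]]: ...


class Store(ABC):
    """Persists events so that multiple analysis tools can share them."""

    @abstractmethod
    def save(self, events: List[Dict[str, Any]]) -> None: ...

    @abstractmethod
    def query(self, learner_id: str) -> List[Dict[str, Any]]: ...


class Analyzer(ABC):
    """Turns stored events into indicators (e.g. engagement scores)."""

    @abstractmethod
    def analyze(self, events: List[Dict[str, Any]]) -> Dict[str, float]: ...


def run_pipeline(collector: Collector, store: Store,
                 analyzer: Analyzer, learner_id: str) -> Dict[str, float]:
    """One pass through the workflow: collect, persist, analyse."""
    store.save(collector.collect())
    return analyzer.analyze(store.query(learner_id))
```

Keeping each stage behind an explicit interface is what makes the interoperability questions below concrete: any component can be replaced as long as the shared data contract holds.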
Key issues:
The key issues in this paper relate to interoperability and software architectures.
Questions arising:
- What software tools can be used to support learning analytics (LA)?
- What are the dimensions of learning analytics interoperability (LAI)?
- Why is LAI important?
- Who are the stakeholders of LA systems?
- What kinds of issues have been identified as critically important to solve?
- How can requirements of LAI be accurately determined?
- What are the benefits of LA?
- What are the benefits of LAI?
- What needs to be standardised in order to achieve interoperability of LA systems?
LAI—looking for low-hanging fruits
Hoel and Chen (2014) provide a summary of the current status of LAI and propose a framework to help structure the interoperability work of requirements analysis and systems scoping. The model is based on a three-dimensional Enterprise Interoperability Framework mapping concerns, interoperability barriers, and potential solutions. The paper also introduces the concept of low-hanging fruit as a way of prioritising analyses and solutions. Data gathered from a small group of Norwegian stakeholders are analysed, and a list of potential interoperability issues is presented.
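For readers unfamiliar with this framework, the sketch below illustrates, with hypothetical names and example values, how an interoperability issue might be recorded along the three dimensions and how low-hanging fruit could then be prioritised. It is an assumption-laden illustration, not Hoel and Chen's model or data.

```python
# Illustrative sketch: recording interoperability issues along the three
# dimensions of an Enterprise Interoperability Framework (concerns,
# barriers, solutions). Enum values and fields are examples only.
from dataclasses import dataclass
from enum import Enum
from typing import Iterable, List


class Concern(Enum):
    DATA = "data"          # formats and semantics of exchanged data
    SERVICE = "service"    # how systems expose functionality
    PROCESS = "process"    # how workflows cross system boundaries
    BUSINESS = "business"  # organisational and policy alignment


class Barrier(Enum):
    CONCEPTUAL = "conceptual"          # mismatched models or vocabularies
    TECHNOLOGICAL = "technological"    # incompatible platforms or APIs
    ORGANISATIONAL = "organisational"  # unclear roles and responsibilities


@dataclass
class Issue:
    description: str
    concern: Concern
    barrier: Barrier
    candidate_solution: str
    effort: int  # rough cost estimate, 1 (low) to 5 (high)


def low_hanging_fruit(issues: Iterable[Issue], max_effort: int = 2) -> List[Issue]:
    """Return the cheap-to-solve issues first, sorted by estimated effort."""
    return sorted((i for i in issues if i.effort <= max_effort),
                  key=lambda i: i.effort)
```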
Key issues:
This paper sets out to find LA solutions that are complex enough to require systems to interoperate, yet simple enough to be implemented without resistance (i.e. low-hanging fruit).
Questions arising:
- What are the kinds of systems from which data can be used to support LA?
- Who is doing technical analysis of LA requirements?
- What kinds of commonalities exist in current LA system models?
- What is required in order to create a service from user learning data?
- In what ways do system requirements of LA need to be expressed?
- How can a reference architecture for LAI be expressed?
- In what ways do current frameworks align and differ?
Making sense of online learning behaviour: a research on learning styles and collaborative learning data
The focus of the study by Sun et al. (2014) is the relationship between learning styles, online behaviours, and group collaboration. Sixty junior students at a university in China who used the Sakai course platform were surveyed to determine their learning styles. The results revealed a relationship between learning styles and online collaborative behaviour. Nevertheless, the authors conclude that grouping by learning styles might not be the factor affecting group collaboration. Significantly, in framing this paper the authors make explicit four research questions, which served as the ongoing focus of the presentation (an illustrative sketch of the kind of analysis involved follows the list):
- Which dimensions of learning style have an effect on learners’ online behaviours?
- Which kinds of online behaviours could be affected by learning style?
- Is there a significant difference among groups’ online performances?
- Is there a significant relationship between group members’ learning styles and groups’ online collaborative performances?
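The statistical detail belongs to the paper itself; purely as an illustration of the kind of analysis involved, the sketch below tests on synthetic data whether a learning-style dimension score is associated with an online behaviour count. The score range and the choice of Spearman correlation are assumptions made for the example, not Sun et al.'s method or data.

```python
# Illustrative sketch (not Sun et al.'s analysis or data): test whether
# a learning-style dimension score correlates with an online behaviour
# count for 60 students, using Spearman rank correlation on toy data.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)

# Hypothetical scores on one style dimension (e.g. active/reflective,
# here assumed to range from -11 to 11) and forum post counts.
style_score = rng.integers(-11, 12, size=60)
forum_posts = rng.poisson(5, size=60) + (style_score > 0) * 2

rho, p_value = spearmanr(style_score, forum_posts)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```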
Key issues:
- “Students with different learning style preferences showed significantly different online behaviors in some patterns.” (Sun et al. 2014, p. 268)
- Correlation of learning style with individual and group performance warrants further research.
Questions arising:
How can learning analytics fit into a general evaluation framework?
Stracke (2014) proposes a generic evaluation framework for impact assessment as a means of determining how learning analytics can be addressed and embedded in learning design. The Evaluation Framework for Impact Measurement (EFI) combines internal and external impact assessment and provides a generic evaluation framework for learning analytics. Drawing on the international quality standard ISO/IEC 19796-1, the paper discusses which processes are relevant and how a learning design specification can be helpful for the introduction and support of learning analytics.
Key issues:
The paper situates LA in the broader context of quality assurance and learning design, exploring whether LA could benefit from the functional principles derived in those fields.
Questions arising:
- How can LA fit into a general evaluation framework as part of learning design?
- Why do we need LA?
- How do changes in the broader context of the evolution of e-learning impact our thinking about the scope of LA?
- In what ways can quality be defined that are useful to LA?
- How can LA fit within the growing forms of open education?
- In what ways can LA be incorporated into learning design?
Learning analytics data items on digital textbooks
Tamura (2014) takes a historical perspective in establishing the significance of both learning analytics and digital textbooks within the contemporary domain of technology-enhanced learning and standards development. He proposes a set of data items to be collected in digital textbooks. The proposal builds on conventional LMS-based learning activity analytics as well as modern tablet-based learning; the latter has the advantage of collecting more detailed learner data through built-in sensors and logging of how materials are manipulated.
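Tamura's proposed data items are specified in the paper; the sketch below is only a hedged illustration of what a minimal event record for a digital-textbook reader might look like, combining conventional log fields with sensor-derived context. All field names are invented for the example.

```python
# Illustrative sketch only: a minimal event record of the kind that
# could be logged by a digital-textbook reader. Field names are
# hypothetical, not Tamura's (2014) proposed data items.
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class TextbookEvent:
    learner_id: str
    book_id: str
    page: int
    action: str  # e.g. "open", "highlight", "annotate", "page_turn"
    # Tablet sensors allow richer context than a conventional LMS log:
    device_orientation: str = "portrait"
    timestamp: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))


event = TextbookEvent(learner_id="s-042", book_id="math-7",
                      page=31, action="highlight")
print(event)
```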
Key issues:
This paper is concerned with understanding the variety of data items that enable LA. If the full range of available data is not considered at the design stage of LA systems, then the promise of optimum interoperability will not be fulfilled.
Questions arising:
- What are LA data items?
- What standardisation work related to LA already exists?
- Is the scope of current standardisation work on LA adequate?
- How can metrics for LA be expressed?
Learning analytics: an enabler for dropout prediction
Tseng et al. (2014) address a key application of learning analytics: predicting student learning performance and the risk of dropping out. They collected heterogeneous data from a middle school to develop a model for predicting dropout. This exploratory study concluded that dropout prediction using learning analytics may provide more precise information for identifying at-risk students and the factors that put them at risk.
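The summary above does not reproduce the authors' model, so the following is only a generic baseline for the task they describe: a logistic regression over a few behavioural features, trained and evaluated on synthetic records. The features and label-generating coefficients are invented for illustration and are not Tseng et al.'s data.

```python
# Illustrative baseline only (not Tseng et al.'s model or data):
# logistic regression predicting dropout from behavioural features.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 500

# Hypothetical features: absence days, average grade, logins per week.
X = np.column_stack([
    rng.poisson(3, n),       # absences
    rng.normal(70, 12, n),   # average grade
    rng.poisson(4, n),       # weekly logins
])
# Synthetic label: risk rises with absences, falls with grades/logins.
logit = 0.5 * X[:, 0] - 0.08 * X[:, 1] - 0.3 * X[:, 2] + 4.0
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("AUC:", roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]))
```

In a real setting the synthetic features would be replaced by the heterogeneous school records the authors describe.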
Key issues:
This paper explores predictive analytics, one of the sub-fields of application for LA, in order to understand the affordances of the data that support this kind of analytics.
Questions arising:
- How can we make sense of online learning behaviour?
- Which dimensions of learning styles affect learners’ online behaviours?
- Which online behaviours are affected by learning style?
- What analytical methods can be utilised with learner data?
- In what ways do group behaviours impact online learning?
- In what ways do teaching styles impact learning behaviours?
- Can LA successfully predict or identify students at risk of dropping out?
- What kinds of data should be included in student records?