From signals to knowledge: A conceptual model for multimodal learning analytics

Authors: Daniele Di Mitri, Jan Schneider, Marcus Specht, Hendrik Drachsler

Affiliation: Welten Institute - Research Centre for Learning, Teaching and Technology, Open University of the Netherlands, The Netherlands

Abstract: Multimodality in learning analytics and learning science is under the spotlight. The landscape of sensors and wearable trackers that can be used for learning support is evolving rapidly, as are data collection and analysis methods. Multimodal data can now be collected and processed in real time at an unprecedented scale. With sensors, it is possible to capture observable events of the learning process, such as the learner's behaviour and the learning context. The learning process, however, also consists of latent attributes, such as the learner's cognitions or emotions. These attributes are unobservable to sensors and need to be elicited through human-driven interpretation. We conducted a literature survey of experiments using multimodal data to frame the young research field of multimodal learning analytics. The survey explored the multimodal data used in related studies (the input space) and the learning theories selected (the hypothesis space). The survey led to the formulation of the Multimodal Learning Analytics Model, whose main objectives are (O1) to map the use of multimodal data for enhancing feedback in a learning context; (O2) to show how machine learning can be combined with multimodal data; and (O3) to align the terminology used in the fields of machine learning and learning science.

Keywords: learning analytics; machine learning; multimodal data; multimodality; sensors; social signal processing