Researchers from the GHIA research group at the Universidad Autónoma de Madrid, together with a researcher from Teachers College, Columbia University (New York), have co-authored an article on how integrating biosensors and Multimodal Learning Analytics (MmLA) can transform online education by providing more precise information about student behaviour and engagement.
The study reviews key research that uses physiological signals such as heart rate, brain activity, and eye tracking, combined with traditional interaction data and self-reports, to better understand students' cognitive states and attention levels.
Reference:
Becerra, A., Cobos, R., & Lang, C. (2025). Enhancing online learning by integrating biosensors and multimodal learning analytics for detecting and predicting student behaviour: a review. Behaviour & Information Technology, 1–26. https://doi.org/10.1080/0144929X.2025.2562322
ABSTRACT
In modern online learning, understanding and predicting student behaviour is crucial for enhancing engagement and optimising educational outcomes. This systematic review explores the integration of biosensors and Multimodal Learning Analytics (MmLA) to analyse and predict student behaviour during computer-based learning sessions. We examine key challenges, including emotion and attention detection, behavioural analysis, experimental design, and demographic considerations in data collection. Our study highlights the growing role of physiological signals, such as heart rate, brain activity, and eye-tracking, combined with traditional interaction data and self-reports to gain deeper insights into cognitive states and engagement levels. We synthesise findings from 54 key studies, analysing commonly used methodologies such as advanced machine learning algorithms and multimodal data pre-processing techniques. The review identifies current research trends, limitations, and emerging directions in the field, emphasising the transformative potential of biosensor-driven adaptive learning systems. Our findings suggest that integrating multimodal data can facilitate personalised learning experiences, real-time feedback, and intelligent educational interventions, ultimately advancing toward a more customised and adaptive online learning experience.
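To illustrate the kind of pipeline the abstract describes, here is a minimal, hedged Python sketch: it fuses hypothetical per-window biosensor features (heart rate, eye tracking) with interaction-log features and trains a standard classifier to flag engagement. The feature names, window size, synthetic labels, and the choice of a random-forest model are illustrative assumptions, not methods taken from the reviewed studies.

```python
# Illustrative sketch only: fuses hypothetical biosensor features with
# interaction-log counts to predict engagement. All feature names, the
# synthetic labels, and the classifier choice are assumptions for illustration.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)
n_windows = 600  # synthetic 30-second analysis windows

# Multimodal feature matrix, one row per window:
# [mean heart rate, HR variability, fixation duration, saccade rate,
#  clicks per minute, idle seconds]
X = np.column_stack([
    rng.normal(75, 8, n_windows),      # mean heart rate (bpm)
    rng.normal(50, 15, n_windows),     # heart-rate variability (ms)
    rng.normal(250, 60, n_windows),    # mean fixation duration (ms)
    rng.normal(3.0, 0.8, n_windows),   # saccades per second
    rng.poisson(4, n_windows),         # interaction clicks per minute
    rng.exponential(10, n_windows),    # idle time within the window (s)
])
# Synthetic binary label: engaged (1) vs disengaged (0), a stand-in for
# self-reported engagement collected alongside the sensor streams.
y = (X[:, 4] + rng.normal(0, 1, n_windows) > 3.5).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

# Standardise the fused features, then fit a simple classifier.
model = make_pipeline(StandardScaler(), RandomForestClassifier(random_state=0))
model.fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))
```

In a real deployment the synthetic arrays would be replaced by time-aligned sensor streams and clickstream logs, and the model would feed real-time feedback or adaptive interventions of the kind the review discusses.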
