Multimodal Engagement and Sentiment Analytics in Health Science Education: A Learning Analytics Framework Integrating AI and Pedagogical Theory
DOI:
https://doi.org/10.19173/irrodl.v27i1.8868
Keywords:
health science education, learning analytics, sentiment analysis, emotion detection, BERT, engagement typology, cognitive presence, multimodal AI
Abstract
Online learning environments tend not to provide the social and pedagogical cues of physical classrooms, making it challenging to evaluate student engagement and emotional states in real time. Current methods depend mainly on facial expression recognition or textual sentiment analysis, which constrains the depth and accuracy of behavioral interpretation. This research proposes a multimodal learning analytics framework that combines visual and textual data to infer learner emotions and engagement, with the aim of improving the interpretability, responsiveness, and pedagogical value of learning analytics systems in digital education. Two datasets were created: (a) a facial expression dataset of 10,000 grayscale images annotated across five emotion categories and (b) an engagement dataset of 4,000 images annotated according to behavioral indicators. Concurrently, 1,667 learner feedback responses from massive open online courses were prepared for sentiment analysis. Convolutional neural networks (CNNs) were used for emotion and engagement classification, and a fine-tuned BERT (bidirectional encoder representations from transformers) model was used for sentiment analysis. A rule-based integration engine combined these outputs to create multidimensional behavioral typologies. The CNN models reached >92% validation accuracy for both the emotion detection and engagement detection tasks, while the BERT sentiment classifier achieved F1 = 0.87 and 88.1% accuracy. The multimodal integration procedure identified four distinct learner behavior typologies (e.g., students who were cognitively engaged but visually disengaged). The framework offers an accurate, interpretable, and scalable real-time learning analytics solution. Compared with previous methods, it overcomes significant limitations and offers a useful resource for facilitating adaptive, data-based instructional interventions, especially in online and health science education.
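The rule-based integration step described above can be sketched as a simple decision function that maps the three unimodal outputs (facial emotion, visual engagement, text sentiment) onto a behavioral typology. The label names and rules below are illustrative assumptions for exposition, not the authors' actual rule set or typology labels.

```python
# Hypothetical sketch of a rule-based integration engine combining CNN and
# BERT outputs into learner behavior typologies. All labels and thresholds
# here are assumed for illustration.

def classify_typology(emotion: str, engaged: bool, sentiment: str) -> str:
    """Combine per-modality labels into one of four behavior typologies."""
    positive_affect = emotion in {"happy", "neutral"} or sentiment == "positive"
    if engaged and positive_affect:
        return "fully engaged"
    if engaged and not positive_affect:
        return "engaged but frustrated"
    if not engaged and sentiment == "positive":
        # Matches the paper's example of cognitive engagement without
        # visible (on-camera) engagement.
        return "cognitively engaged, visually disengaged"
    return "disengaged"

# Example: a learner who looks away from the camera but writes positive feedback
print(classify_typology(emotion="neutral", engaged=False, sentiment="positive"))
# → cognitively engaged, visually disengaged
```

A rule table like this keeps the fusion step interpretable: each typology can be traced back to the exact combination of modality outputs that produced it, which is harder with a learned late-fusion classifier.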
License

This work is licensed under a Creative Commons Attribution 4.0 International License. The copyright for all content published in IRRODL remains with the authors.
This copyright agreement and usage license ensure that the article is distributed as widely as possible and can be included in any scientific or scholarly archive.
You are free to
- Share — copy and redistribute the material in any medium or format
- Adapt — remix, transform, and build upon the material for any purpose, even commercially.
The licensor cannot revoke these freedoms as long as you follow the license terms below:
- Attribution — You must give appropriate credit, provide a link to the license, and indicate if changes were made. You may do so in any reasonable manner, but not in any way that suggests the licensor endorses you or your use.
- No additional restrictions — You may not apply legal terms or technological measures that legally restrict others from doing anything the license permits.