| dc.contributor.author | Gimshani, Kavindi | |
| dc.date.accessioned | 2026-03-11T06:32:01Z | |
| dc.date.available | 2026-03-11T06:32:01Z | |
| dc.date.issued | 2025 | |
| dc.identifier.citation | Gimshani, Kavindi (2025) Real-Time Emotion Recognition for Enhancing Student Engagement in LMS. MSc. Dissertation, Informatics Institute of Technology | en_US |
| dc.identifier.issn | 20230771 | |
| dc.identifier.uri | http://dlib.iit.ac.lk/xmlui/handle/123456789/2928 | |
| dc.description.abstract | Maintaining student engagement in online learning platforms is a pressing challenge, especially as traditional emotional evaluation methods such as questionnaires or manual observation fail to offer objective, real-time feedback. Emotional states significantly influence learning outcomes, making it essential to capture affective cues effectively in virtual settings. This study responds to that need by proposing a novel system that leverages Facial Expression Recognition (FER) within Learning Management Systems (LMS) to interpret students' emotions and monitor their engagement dynamically. The proposed system utilizes a deep learning model based on Convolutional Neural Networks (CNNs). Starting from exploratory baseline models, a refined architecture was selected using a VGG-based transfer learning strategy and optimized through hyperparameter tuning. Emotion-specific features were extracted from facial images and classified into core affective states relevant to education, namely happiness, surprise, neutrality, and frustration. To enhance robustness, the model was trained on data augmented with varied lighting and angle conditions. Key architectural decisions, including dropout layers to reduce overfitting and softmax activation for classification, were made through iterative testing on benchmark FER datasets such as FER2013 and AffectNet. Performance was evaluated using standard data science metrics: the system achieved 96% classification accuracy with an F1-score of 0.94 and demonstrated high reliability across multiple emotion classes. Its low false-positive rate and real-time processing capability of 1,000 video frames per second underline its practical viability for LMS integration. These results support the system's utility in enabling instructors to respond adaptively to student engagement, making virtual classrooms more responsive and effective. Further evaluations are planned to extend its application across diverse learning environments and subject domains. | en_US |
| dc.language.iso | en | en_US |
| dc.subject | Facial Expression Recognition | en_US |
| dc.subject | Convolutional Neural Networks | en_US |
| dc.subject | Learning Management Systems | en_US |
| dc.title | Real-Time Emotion Recognition for Enhancing Student Engagement in LMS | en_US |
| dc.type | Thesis | en_US |