dc.description.abstract |
"In the evolving landscape of online education, maintaining and enhancing student engagement presents a significant challenge. The level of engagement, influenced by various factors such as facial expressions, head movements, and overall attentiveness, necessitates an approach for accurate detection and analysis. Traditional methods often fall short of accurately capturing the nature of engagement, leading to a gap in effective monitoring and support for students in virtual learning environments. Therefore, a critical need for an advanced solution to predict students is raised.
To address this challenge, a multi-model engagement recognition system was proposed to detect the most crucial engagement indicators: facial expressions, eye gaze, and head pose. Since facial expression plays a major role in engagement detection, novel CNN architectures were developed to detect two types of emotions from facial regions of interest (ROIs). To address class imbalance in the complex emotion dataset, a complementary model focusing on basic emotion detection was also developed, ensuring a comprehensive approach to capturing engagement level. Furthermore, explainable AI (XAI) techniques were applied to validate the predictions and to identify the facial ROIs that most strongly indicate engagement.
The effectiveness of the developed CNN models was evaluated using standard metrics, including accuracy, precision, recall, and F1-score, on a test dataset comprising diverse instances of student engagement. The results were promising, demonstrating the models' capability to distinguish between different levels of engagement with a high degree of accuracy. Specifically, the complex and basic emotion prediction models each achieved 81% accuracy. Furthermore, the eye gaze and face angle estimation modules demonstrated positive outcomes during prediction." |
en_US |