Leveraging Eye Movement Features for Enhanced Emotion Recognition Accuracy.
Emotion recognition using eye movement data is a rapidly developing field with significant applications in affective computing and human-computer interaction. This study explores the potential of eye movement features alone for classifying emotions, leveraging the Deep Generalized Canonical Correlation Analysis with Attention Mechanism (DGCCA-AM) framework. Our experiments were conducted on the SEED-IV dataset, comprising eye movement recordings with 31 features from 15 subjects across three sessions, totaling approximately 37,600 samples. We evaluated the model in both subject-dependent (intra-subject) and subject-independent (inter-subject) settings across all three sessions. We obtained a classification accuracy of 99.92% for distinguishing four emotions (happy, sad, fear, and neutral) in the third session, demonstrating the strong discriminative power of eye movement features for emotion classification. In the inter-subject evaluation, we adopted a leave-one-subject-out strategy and evaluated the model's performance on all three sessions of unseen subjects. Under this setting, the proposed approach achieved an average accuracy of 63.14%, indicating challenges in generalizing across individuals. We also analyzed the contribution of each of the three feature-group subnetworks (pupil, event statistics, and fixation/saccade/dispersion) to the overall performance and found that pupil features contribute the most: removing them reduces the model's accuracy by about 6%. Overall, these findings suggest that eye movement data alone are highly effective for within-subject emotion recognition but pose challenges for cross-subject generalization. Our study reinforces the importance of subject-specific modeling while opening new avenues for improving cross-subject adaptability in emotion recognition systems.

Clinical relevance - Accurate emotion recognition is crucial in mental health assessment and human-computer interaction. This study leverages eye movement data and the DGCCA-AM framework to improve emotion classification without relying on EEG signals. The findings could aid clinicians in detecting emotional states for mental health diagnosis, enhancing interventions for conditions such as anxiety, depression, and affective disorders.
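To make the evaluation protocol and fusion idea concrete, the following is a minimal sketch, not the authors' released code, of (a) an attention-weighted fusion of three feature-group subnetworks (pupil, event statistics, fixation/saccade/dispersion) and (b) a leave-one-subject-out split. The class name, the per-group feature dimensions, and the hidden size are illustrative assumptions; the DGCCA correlation objective itself is omitted, and only the attention-based fusion and the LOSO protocol described above are shown.

```python
# Hedged sketch: attention-weighted fusion over three eye-movement feature
# groups plus a leave-one-subject-out split. Group dimensions (12, 10, 9) are
# assumed for illustration; SEED-IV provides 31 eye movement features in total.
import torch
import torch.nn as nn


class FeatureGroupFusion(nn.Module):
    def __init__(self, group_dims=(12, 10, 9), hidden=64, n_classes=4):
        super().__init__()
        # One subnetwork per feature group, each projecting into a shared space.
        self.subnets = nn.ModuleList(
            nn.Sequential(nn.Linear(d, hidden), nn.ReLU()) for d in group_dims
        )
        # One scalar attention score per group, normalized with softmax.
        self.attn = nn.ModuleList(nn.Linear(hidden, 1) for _ in group_dims)
        self.classifier = nn.Linear(hidden, n_classes)

    def forward(self, groups):  # groups: list of (batch, d_i) tensors
        hs = [net(g) for net, g in zip(self.subnets, groups)]
        scores = torch.cat([a(h) for a, h in zip(self.attn, hs)], dim=1)
        weights = torch.softmax(scores, dim=1)  # (batch, 3) group weights
        fused = sum(w.unsqueeze(1) * h for w, h in zip(weights.unbind(1), hs))
        return self.classifier(fused), weights


def loso_splits(subject_ids):
    """Yield (train_subjects, held_out_subject) pairs for leave-one-subject-out."""
    for held_out in subject_ids:
        yield [s for s in subject_ids if s != held_out], held_out
```

In this sketch, the learned attention weights provide a per-sample view of how much each feature group contributes to the fused representation, which is the same kind of signal used above to argue that the pupil subnetwork is the most influential group.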