TY - GEN
T1 - Affective State and Pain Estimation Through Facial Emotion Analysis
AU - Asaju, Christine Bukola
AU - Vadapalli, Hima
N1 - Publisher Copyright:
© The Author(s), under exclusive license to Springer Nature Switzerland AG 2026.
PY - 2026
Y1 - 2026
N2 - Emotional state analysis is essential for understanding human emotions and their influence on health and well-being. Conventional methods rely on static models, often failing to account for the dynamic nature of human emotions and their context-dependent characteristics. This study proposes a method for estimating a patient’s emotional state from facial expressions, utilising a CNN-BiLSTM cascade to classify facial expressions in real time. Experiments use the Extended Denver Intensity of Spontaneous Facial Action (DISFA+) and Extended Cohn-Kanade (CK+) datasets. The emotion estimation model achieved 85% accuracy on 2,425 DISFA+ samples annotated with seven basic emotions and 80% accuracy on the CK+ dataset annotated with eight emotions. Estimated emotions are further mapped into emotional state estimates (comfortable vs uncomfortable categories), utilising a mapping from the literature, as a way to monitor and interpret patients’ emotional states during online consultations. The study additionally evaluated the models on the UNBC-McMaster pain dataset: the DISFA+ model classified 18% of unlabelled samples as anger, 70% as disgust, and 10% as sadness, whereas the CK+ model classified 50% as anger, 40% as disgust, and 5% as sadness. This research highlights the strong correlation between facial expressions of anger and disgust in individuals experiencing pain. The study contributes to the field of affective computing in healthcare by improving the assessment of emotional states and pain.
AB - Emotional state analysis is essential for understanding human emotions and their influence on health and well-being. Conventional methods rely on static models, often failing to account for the dynamic nature of human emotions and their context-dependent characteristics. This study proposes a method for estimating a patient’s emotional state from facial expressions, utilising a CNN-BiLSTM cascade to classify facial expressions in real time. Experiments use the Extended Denver Intensity of Spontaneous Facial Action (DISFA+) and Extended Cohn-Kanade (CK+) datasets. The emotion estimation model achieved 85% accuracy on 2,425 DISFA+ samples annotated with seven basic emotions and 80% accuracy on the CK+ dataset annotated with eight emotions. Estimated emotions are further mapped into emotional state estimates (comfortable vs uncomfortable categories), utilising a mapping from the literature, as a way to monitor and interpret patients’ emotional states during online consultations. The study additionally evaluated the models on the UNBC-McMaster pain dataset: the DISFA+ model classified 18% of unlabelled samples as anger, 70% as disgust, and 10% as sadness, whereas the CK+ model classified 50% as anger, 40% as disgust, and 5% as sadness. This research highlights the strong correlation between facial expressions of anger and disgust in individuals experiencing pain. The study contributes to the field of affective computing in healthcare by improving the assessment of emotional states and pain.
KW - Affective Computing
KW - Affective State Estimation
KW - Emotion Analysis
KW - Healthcare
KW - Pain Estimation
UR - https://www.scopus.com/pages/publications/105017223133
U2 - 10.1007/978-3-032-00652-3_16
DO - 10.1007/978-3-032-00652-3_16
M3 - Conference contribution
AN - SCOPUS:105017223133
SN - 9783032006516
T3 - Lecture Notes in Computer Science
SP - 213
EP - 226
BT - Artificial Intelligence in Healthcare - 2nd International Conference, AIiH 2025, Proceedings
A2 - Cafolla, Daniele
A2 - Rittman, Timothy
A2 - Ni, Hao
PB - Springer Science and Business Media Deutschland GmbH
T2 - 2nd International Conference on Artificial Intelligence in Healthcare, AIiH 2025
Y2 - 8 September 2025 through 10 September 2025
ER -