Human-Computer Interaction for Affective State Detection Using Facial Expression

Document Type: Research Paper

Authors

1 School of Information Engineering, Southwest University of Science and Technology, China

2 School of Information Engineering, Southwest University of Science and Technology, China

3 School of Computer Science, Zhejiang Normal University, P. R. China

4 Electric Power College, Inner Mongolia University of Technology, Inner Mongolia 010000, China

5 Kwame Nkrumah University of Science and Technology, Kumasi, Ghana

Abstract

If computers could understand and respond to users' nonverbal cues, human-computer interaction (HCI) would become friendlier and less intrusive. Research has consistently suggested a link between facial expressions and emotions. Most work on emotional state recognition relies on passive stimuli, such as watching videos. This paper instead examines emotional state recognition under active stimuli: users' facial expressions while they attempt computer-based tasks, particularly programming tasks on common computing platforms. We propose a framework that combines a facial expression recognition (FER) method with the successive stages of these tasks. A data collection study is presented in which data are gathered from everyday users as they complete a set of programming-based tasks. To realize this work, we propose and implement a state-of-the-art machine learning approach for expression-based affective state recognition, using a Euclidean distance-based feature representation and a custom encoding of users' self-reported emotional states. The results confirm the proposed method's effectiveness, efficiency, and robustness.
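
The Euclidean distance-based feature representation mentioned above is not detailed in this section, but a common scheme of this kind computes pairwise Euclidean distances between detected facial landmarks and feeds them to a classifier. The following Python sketch illustrates that general idea under stated assumptions (a 68-point landmark detector, synthetic placeholder data, a hypothetical seven-class label set, and a linear SVM standing in for the unspecified classifier); it is an illustration, not the authors' implementation.

# Minimal sketch: Euclidean distance features from facial landmarks.
# Assumptions (not from the paper): 68-point landmarks, synthetic data,
# seven illustrative affective-state classes, linear SVM classifier.
import numpy as np
from itertools import combinations
from sklearn.svm import SVC

def landmark_distance_features(landmarks: np.ndarray) -> np.ndarray:
    """Return pairwise Euclidean distances between (x, y) landmarks.

    landmarks has shape (n_points, 2); the output vector has length
    n_points * (n_points - 1) / 2.
    """
    pairs = combinations(range(len(landmarks)), 2)
    return np.array([np.linalg.norm(landmarks[i] - landmarks[j])
                     for i, j in pairs])

# Synthetic example: 100 faces, 68 landmarks each (placeholder data only).
rng = np.random.default_rng(0)
X = np.stack([landmark_distance_features(rng.random((68, 2)))
              for _ in range(100)])
y = rng.integers(0, 7, size=100)  # hypothetical 7-class affective labels

clf = SVC(kernel="linear").fit(X[:80], y[:80])
print("held-out accuracy:", clf.score(X[80:], y[80:]))

In practice, distance features like these are often normalized by a reference length such as the inter-ocular distance, so the representation is invariant to face scale and position.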

Keywords

