Abdat, F., Maaoui, C., & Pruski, A. (2011, November 16-18). Human-Computer Interaction Using Emotion Recognition from Facial Expression. Paper presented at the 2011 UKSim 5th European Symposium on Computer Modeling and Simulation.
Altameem, T., & Altameem, A. (2020). Facial expression recognition using human machine interaction and multi-modal visualization analysis for healthcare applications. Image and Vision Computing, 103, 1-19. https://doi.org/10.1016/j.imavis.2020.104044
Chen, J., Dey, S., Wang, L., Bi, N., & Liu, P. (2024). Attention-Based Multi-Modal Multi-View Fusion Approach for Driver Facial Expression Recognition. IEEE Access, PP, 1-20. https://doi.org/10.1109/ACCESS.2024.3462352
Chu, Z. (2023). Facial expression recognition for a seven-class small and medium-sized dataset based on transfer learning CNNs. Applied and Computational Engineering, 4, 1-6. https://doi.org/10.54254/2755-2721/4/2023394
Giroux, F., Léger, P.-M., Brieugne, D., Courtemanche, F., Bouvier, F., Chen, S. L., . . . Senecal, S. (2021). Guidelines for Collecting Automatic Facial Expression Detection Data Synchronized with a Dynamic Stimulus in Remote Moderated User Tests. In (pp. 1-13).
Govindaraju, D., & Thangam, D. (2024). Emotion Recognition in Human-Machine Interaction and a Review in Interpersonal Communication Perspective. In (pp. 1-16).
Gumelar, W. S., Wulandari, S. F., Lestari, T. S., & Ruswandi, R. (2024). The Correlation Between Teachers' Emotional Intelligence and Students' Learning Engagement in EFL Class. JEELS (Journal of English Education and Linguistics Studies), 11, 1-25. https://doi.org/10.30762/jeels.v11i2.3377
Gursesli, M., Lombardi, S., Duradoni, M., Bocchi, L., Guazzini, A., & Lanatà, A. (2024). Facial Emotion Recognition (FER) Through Custom Lightweight CNN Model: Performance Evaluation in Public Datasets. IEEE Access, PP, 1-17. https://doi.org/10.1109/ACCESS.2024.3380847
Han, S., Guo, Y., Zhou, X., Huang, J., Shen, L., & Luo, Y. (2023). A Chinese Face Dataset with Dynamic Expressions and Diverse Ages Synthesized by Deep Learning. Scientific Data, 10, 1-9. https://doi.org/10.1038/s41597-023-02701-2
Hossain, M., E-Shan, S., & Kabir, H. (2021). An Efficient Way to Recognize Faces Using Mean Embeddings.
Hussain, T., Hussain, D., Hussain, I., Alsalman, H., Hussain, S., Sajid, S., & Al-Hadhrami, S. (2022). Internet of Things with Deep Learning-Based Face Recognition Approach for Authentication in Control Medical Systems. Computational and Mathematical Methods in Medicine, 2022, 1-17. https://doi.org/10.1155/2022/5137513
Juliandy, C., Ng, P. W., & Darwin. (2024). Modeling Face Detection Application Using Convolutional Neural Network and Face-API for Effective and Efficient Online Attendance Tracking. Jurnal Online Informatika, 9, 1-8. https://doi.org/10.15575/join.v9i1.1203
Kessous, L., Castellano, G., & Caridakis, G. (2009). Multimodal Emotion Recognition in Speech-based Interaction Using Facial Expression, Body Gesture and Acoustic Analysis. Journal on Multimodal User Interfaces, 3, 1-16. https://doi.org/10.1007/s12193-009-0025-5
Khan, A. (2022). Facial Emotion Recognition Using Conventional Machine Learning and Deep Learning Methods: Current Achievements, Analysis and Remaining Challenges. Information, 13, 1-17. https://doi.org/10.3390/info13060268
Mancuso, V., Borghesi, F., Bruni, F., Pedroli, E., & Cipresso, P. (2024). Mapping the landscape of research on 360-degree videos and images: a network and cluster analysis. Virtual Reality, 28, 1-19. https://doi.org/10.1007/s10055-024-01002-2
Mozaffari, L., Brekke, M., Gajaruban, B., Purba, D., & Zhang, J. (2023). Facial Expression Recognition Using Deep Neural Network.
Nayak, S., Nagesh, B., Routray, A., & Sarma, M. (2021). A Human-Computer Interaction framework for emotion recognition through time-series thermal video sequences. Computers & Electrical Engineering, 93, 107280. https://doi.org/10.1016/j.compeleceng.2021.107280
Praneesh, M. (2024). Visual Emotion Recognition Through Affective Computing. In (pp. 147-162). SpringerLink.
Saganowski, S., Komoszyńska, J., Behnke, M., Perz, B., Kunc, D., Klich, B., . . . Kazienko, P. (2022). Emognition dataset: emotion recognition with self-reports, facial expressions, and physiology using wearables. Scientific Data, 9, 1-12. https://doi.org/10.1038/s41597-022-01262-0
Sneha, & Raza, S. (2024). Affective Computing for Health Management via Recommender Systems: Exploring Challenges and Opportunities. In (pp. 163-182).
Spaniol, M., Wehrle, S., Janz, A., Vogeley, K., & Grice, M. (2024). The influence of conversational context on lexical and prosodic aspects of backchannels and gaze behaviour. Paper presented at Speech Prosody.
Sumi, K., & Sato, S. (2022). Experiences of Game-Based Learning and Reviewing History of the Experience Using Player's Emotions. Frontiers in Artificial Intelligence, 5, 1-10. https://doi.org/10.3389/frai.2022.874106
Yamashita, J., Takimoto, Y., Oishi, H., & Kumada, T. (2024). How do personality traits modulate real-world gaze behavior? Generated gaze data shows situation-dependent modulations. Frontiers in Psychology, 14, 1-19. https://doi.org/10.3389/fpsyg.2023.1144048
Yan, J., Li, P., Du, C., Zhu, K., Zhou, X., Liu, Y., & Wei, J. (2024). Multimodal Emotion Recognition Based on Facial Expressions, Speech, and Body Gestures. Electronics, 13, 1-22. https://doi.org/10.3390/electronics13183756