Title: Applying Facial Action Units and Feature Selection Methods to Develop the Learning Emotion Image Database and Recognition Model
Author: Pei-Ju Chung
Keywords: Learning emotions; Action Units; Feature Selection




Emotion is the psychological state that arises when humans encounter various events. This psychological state is closely tied to the physical state and manifests as a variety of physiological signals, so researchers infer human mental states by capturing and analyzing those signals. The classification of emotions is quite diverse, and scholars have studied the full range: from basic emotions, which are innate and universal, to complex emotions acquired through interaction in various social situations over the course of development.

Among these, learning-related emotions have drawn particular attention. Emotions affect student learning, and learning effectiveness can in turn reflect students' emotions; this study focuses on the latter, exploring the emotional states of students during the learning process. Researchers typically start from basic emotions when investigating learning emotions, because basic emotions are easy to define intuitively. Ekman analyzed the facial muscle movements with which humans worldwide express basic emotions, encoded them as action units, and explained their connection with basic emotions.

Once basic emotions could be described through the action unit coding system, researchers sought to analyze complex learning emotions through action units as well. However, different studies associate different action units with learning emotions, so we set out to determine the correlation between learning emotions and action units. First, we established a database, the Learning Emotion Image Database, together with operational definitions, and labeled learning emotions and action units according to those definitions to produce the training data. Then three feature selection methods were applied: a decision tree, GA+SVM, and the ReliefF algorithm, to uncover the relationship between action units and learning emotions. Under binary classification, we examined the correlation between individual emotions and action units. The results show the AU combinations common to all three feature selection methods. Compared with earlier studies, although single action units correlate significantly with learning emotions, AU combinations identify learning emotions more accurately. Under multiclass classification, we examined the accuracy of learning emotion recognition. The results show that, at the same accuracy, GA+SVM yields the smallest AU combination. The decision tree selects more AUs, which suggests possible overfitting. The AU combination produced by the ReliefF algorithm differs from both, because ReliefF does not consider correlations among features; it only measures the statistical relevance of each feature to the target class, and therefore cannot effectively remove redundant features.
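The thesis does not include code, but the ReliefF weighting scheme described above (scoring each feature by how well it separates nearest neighbors of different classes, while ignoring feature-to-feature redundancy, which is exactly the limitation noted) can be illustrated with a minimal NumPy sketch on toy data. All names and data here are illustrative, not taken from the thesis:

```python
import numpy as np

def relieff_scores(X, y, n_neighbors=3):
    """Minimal ReliefF for binary labels: reward features that differ
    across nearest misses (other class) and penalize features that
    differ across nearest hits (same class)."""
    n, d = X.shape
    scale = X.max(axis=0) - X.min(axis=0)
    scale[scale == 0] = 1.0                      # avoid division by zero
    w = np.zeros(d)
    for i in range(n):
        diff = np.abs(X - X[i]) / scale          # per-feature normalized distance
        dist = diff.sum(axis=1)
        dist[i] = np.inf                         # never pick the sample itself
        same = y == y[i]
        hits = np.argsort(np.where(same, dist, np.inf))[:n_neighbors]
        misses = np.argsort(np.where(~same, dist, np.inf))[:n_neighbors]
        w += diff[misses].mean(axis=0) - diff[hits].mean(axis=0)
    return w / n

# Toy data standing in for AU-derived features: feature 0 tracks the
# emotion label, feature 1 is pure noise.
rng = np.random.default_rng(0)
f0 = np.r_[rng.normal(0.2, 0.05, 20), rng.normal(0.8, 0.05, 20)]
f1 = rng.uniform(0.0, 1.0, 40)
X = np.c_[f0, f1]
y = np.r_[np.zeros(20), np.ones(20)]
scores = relieff_scores(X, y)                    # scores[0] should dominate
```

Note that the score of each feature is computed independently; two perfectly redundant copies of feature 0 would both receive high weights, which is why the thesis observes that ReliefF cannot prune redundant AUs.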

To apply these results in actual teaching settings, we built an action unit recognition model that enables real-time recognition of learning emotions. The model applies a random forest classifier that takes feature values as input vectors and predicts action units. After training and testing on the Learning Emotion Image Database and the CK+ database, the results show that most of the 15 action units used in the learning emotion model generalize well and that the model discriminates well; a few action units generalize poorly. Finally, the action unit recognition model extracts the action units, and the learning emotion recognition model uses them to identify the emotions a learner exhibits during the learning process, achieving real-time recognition of learning emotions.
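A per-AU detector of the kind described, a binary random forest fed geometric feature values, could be sketched as follows with scikit-learn. The features and data are synthetic stand-ins (feature 0 mimics a mouth-width measurement that an AU such as AU12 would increase; feature 1 is noise), not the thesis's actual feature set:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Synthetic samples: 200 with the AU absent, 200 with it present.
rng = np.random.default_rng(0)
n = 200
width = np.r_[rng.normal(1.0, 0.1, n), rng.normal(1.4, 0.1, n)]  # informative
noise = rng.normal(0.0, 1.0, 2 * n)                              # uninformative
X = np.c_[width, noise]
y = np.r_[np.zeros(n), np.ones(n)]       # 0 = AU absent, 1 = AU present

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X[::2], y[::2])                  # even rows as the training set
accuracy = clf.score(X[1::2], y[1::2])   # odd rows as the held-out test set
```

In the thesis's setup one such binary model would exist per action unit (15 in total), and the predicted AU vector is then passed to the learning emotion recognition model.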
Rights: Browsing/printing of the electronic full text is not authorized
Appears in Collections: Department of Information Management

Files in This Item:
nchu-107-7105029004-1.pdf (1.69 MB, Adobe PDF; available only within the university internal network)



Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.