Please use this identifier to cite or link to this item:
DC Field | Value | Language
dc.contributor.authorPei-Ju Chungen_US
dc.identifier.citation[1] Kuan-Cheng Lin, Min-Chin Lin & Jason C. Hung, 'The Development of Facial Micro-Expression Recognition System,' in The 6th International Conference on Frontier Computing - Theory, Technologies and Applications, 2017.
[2] S. D. Craig, A. C. Graesser, J. Sullins & B. Gholson, 'Affect and learning: an exploratory look into the role of affect in learning with AutoTutor,' Journal of Educational Media, 29(3), pp. 241-250, 2004.
[3] P. Ekman & W. V. Friesen, 'The facial action coding system: a technique for the measurement of facial movement,' Consulting Psychologists Press, San Francisco, 1978.
[4] 唐大崙 & 張文瑜, 'Exploring communication research with eye tracking,' 中華傳播學刊, pp. 165-211, 2007. (in Chinese)
[5] Tong Zhang, Mark Hasegawa-Johnson & Stephen Levinson, 'Children's emotion recognition in an intelligent tutoring scenario,' in Proceedings of the 8th European Conference on Spoken Language Processing, Jeju Island, Korea, 2004.
[6] R. W. Picard, 'Affective Computing,' MIT Media Laboratory, 1995.
[7] Jason Chi-Shun Hung, Kun-Hsiang Chiang, Yi-Hung Huang & Kuan-Cheng Lin, 'Augmenting teacher-student interaction in digital learning through affective computing,' Multimedia Tools and Applications, pp. 1-26, 2016.
[8] K. C. Lin, T.-C. Huang, J. C. Hung, N. Y. Yen & S. J. Chen, 'Facial Emotion Recognition towards Affective Computing-based Learning,' Library Hi Tech, 31(2), pp. 294-307, 2013.
[9] A. Graesser, B. McDaniel, P. Chipman, A. Witherspoon, S. D'Mello & B. Gholson, 'Detection of emotions during learning with AutoTutor,' in Proceedings of the Cognitive Science Society, 2006.
[10] B. McDaniel, S. D'Mello, B. King, P. Chipman, K. Tapp & A. Graesser, 'Facial Features for Affective State Detection in Learning Environments,' in Proc. 29th Annual Meeting of the Cognitive Science Society, 2007.
[11] Joseph F. Grafsgaard, Joseph B. Wiggins, Kristy Elizabeth Boyer, Eric N. Wiebe & James C. Lester, 'Automatically Recognizing Facial Indicators of Frustration: A Learning-centric Analysis,' in Proceedings of the 2013 Humaine Association Conference on Affective Computing and Intelligent Interaction, pp. 159-165, September 2013.
[12] N. Bosch, S. D'Mello, R. Baker, J. Ocumpaugh & V. Shute, 'Using video to automatically detect learner affect in computer-enabled classrooms,' ACM Transactions on Interactive Intelligent Systems, 6(2), pp. 17:1-17:31, 2016.
[13] Ma. Mercedes T. Rodrigo, Ryan S. Baker, Matthew C. Jadud, Anna Christine M. Amarra, Thomas Dy, Maria Beatriz V. Espejo-Lahoz, Sheryl Ann L. Lim, Sheila A. M. S. Pascua, Jessica O. Sugay & Emily S. Tabanao, 'Affective and behavioral predictors of novice programmer achievement,' in Proceedings of the 14th Annual ACM SIGCSE Conference on Innovation and Technology in Computer Science Education, Paris, France, 2009.
[14] T. F. G. Guia, J. O. Sugay, M. M. T. Rodrigo, F. J. P. Macam, M. M. C. Dagami & A. Mitrovic, 'Transitions of affective states in an intelligent tutoring system,' in Proceedings of the Philippine Computing Society, pp. 31-35, 2011.
[15] Ma. Mercedes T. Rodrigo, 'Dynamics of student cognitive-affective transitions during a mathematics game,' Simulation & Gaming, 42(1), pp. 85-99, 2010.
[16] S. K. D'Mello & A. C. Graesser, 'AutoTutor and affective AutoTutor: Learning by talking with cognitively and emotionally intelligent computers that talk back,' ACM Transactions on Interactive Intelligent Systems, 2(4), Article 23, 2012.
[17] J. M. L. Andres & M. M. T. Rodrigo, 'The incidence and persistence of affective states while playing Newton's Playground,' in 7th IEEE International Conference on Humanoid, Nanotechnology, Information Technology, Communication and Control, Environment, and Management, 2014.
[18] 毛峽 & 薛麗麗, Human-Machine Affective Interaction, Beijing: Science Press, 2011. (in Chinese)
[19] 潘奕安, 'An Automatic Facial Expression Recognition System for Low-Resolution Image Sequences,' Master's thesis, Department of Computer Science and Information Engineering, National Cheng Kung University, 2004. (in Chinese)
[20] Ira Cohen, Nicu Sebe, Ashutosh Garg, Lawrence S. Chen & Thomas S. Huang, 'Facial expression recognition from video sequences: temporal and static modeling,' Computer Vision and Image Understanding, 91(1-2), pp. 160-187, 2003.
[21] H. Kobayashi & F. Hara, 'The recognition of basic facial expressions by neural network,' in IEEE International Joint Conference on Neural Networks, pp. 460-466, 1991.
[22] Hiroshi Kobayashi & Fumio Hara, 'Analysis of the Neural Network Recognition Characteristics of 6 Basic Facial Expressions,' JSME International Journal, Ser. C, Dynamics, Control, Robotics, Design and Manufacturing, 39(2), pp. 323-331, 1996.
[23] J. Chang & J. Chen, 'A facial expression recognition system using neural networks,' in Proc. IEEE Int. Joint Conf. Neural Networks, vol. 5, pp. 113-120, 1999.
[24] 楊家豪, 'An Assessment System for Affective Objectives at the Attending and Responding Stages in Distance Learning,' Graduate Institute of Information Technology, Chaoyang University of Technology, Taichung, Taiwan, 2009. (in Chinese)
[25] P. Ekman & W. V. Friesen, 'Nonverbal Behavior and Psychopathology,' in The Psychology of Depression: Contemporary Theory and Research, pp. 203-232, 1974.
[26] 姜振宇, Micro-Expressions: How to Tell Truth from Lies in Other People's Faces, Beijing: Phoenix Publishing, 2011. (in Chinese)
[27] Z. Zabokrtsky, 'Feature Engineering in Machine Learning,' Institute of Formal and Applied Linguistics, Charles University, Prague, 2015.
[28] Isabelle Guyon & André Elisseeff, 'An introduction to variable and feature selection,' The Journal of Machine Learning Research, 2003.
[29] L. Breiman, J. H. Friedman, R. A. Olshen & C. J. Stone, Classification and Regression Trees, New York: Kluwer Academic Publishers, 1984.
[30] J. R. Quinlan, 'Induction of Decision Trees,' Machine Learning, pp. 81-106, 1986.
[31] T. K. Ho, 'Random Decision Forests,' in Proceedings of the 3rd International Conference on Document Analysis and Recognition, Montreal, 1995.
[32] David E. Goldberg & John H. Holland, 'Genetic Algorithms and Machine Learning,' Machine Learning, pp. 95-99, 1988.
[33] Corinna Cortes & Vladimir Vapnik, 'Support-Vector Networks,' Machine Learning, pp. 273-297, 1995.
[34] R. P. L. Durgabai, 'Feature Selection using ReliefF Algorithm,' Int. J. Adv. Res. Comput. Commun. Eng., 3(10), pp. 8215-8218, 2014.
[35] K. Kira & L. A. Rendell, 'A practical approach to feature selection,' in Proceedings of the 9th International Conference on Machine Learning, Aberdeen, Scotland, Morgan Kaufmann, Los Altos, CA, 1992.
[36] I. Kononenko, E. Šimec & M. Robnik-Šikonja, 'Overcoming the myopia of inductive learning algorithms with RELIEFF,' Applied Intelligence, pp. 39-55, 1997.
[37] P. Lucey, J. F. Cohn, T. Kanade, J. Saragih, Z. Ambadar & I. Matthews, 'The Extended Cohn-Kanade Dataset (CK+): A Complete Dataset for Action Unit and Emotion-Specified Expression,' in Proc. IEEE Conf. Computer Vision and Pattern Recognition Workshops, pp. 94-101, June 2010.
[38] P. Ekman, Emotions Revealed, New York: Henry Holt and Company, 2003.
[39] S. G. Hart & L. E. Staveland, 'Development of NASA-TLX (Task Load Index): results of empirical and theoretical research,' in P. A. Hancock & N. Meshkati (eds.), Human Mental Workload, Amsterdam: North-Holland, 1988, pp. 139-178.
[40] 'Dlib C++ Library,' [Online]. Available:
[41] 'Facial landmarks with dlib, OpenCV, and Python,' [Online]. Available:
[42] T. Fawcett, 'An Introduction to ROC Analysis,' Pattern Recognition Letters, 27(8), pp. 861-874, 2006.
[43] 'ROC Curve (Receiver Operating Characteristic Curve),' [Online]. Available:
[44] Y. Miyamoto, J. Yoo, C. S. Levine, J. Park, J. M. Boylan, T. Sims, H. R. Markus & S. Kitayama, 'Culture and Social Hierarchy: Self- and Other-Oriented Correlates of Socioeconomic Status Across Cultures,' Journal of Personality and Social Psychology, 2018.zh_TW
dc.description.abstractEmotion is the psychological state with which humans respond to events. Psychological states are closely tied to physical states, which manifest outwardly as physiological signals; researchers infer mental states by capturing and analyzing these signals. Emotions are classified in many ways, and scholars have studied the full range, from innate, universal basic emotions to complex, highly interactive emotions acquired through social situations. Among these contexts, emotions related to learning receive particular attention: emotions affect student learning, and learning outcomes in turn are reflected in students' emotions. This thesis focuses on the latter, examining the emotions students display during the learning process, termed learning emotions. Research on learning emotions usually starts from the basic emotions, because they are intuitive and easy to define. Ekman analyzed the facial muscle movements with which people around the world express basic emotions, defined action units (AUs), and explained their connection to the basic emotions. Having understood basic emotions through the action unit coding system, researchers then sought to analyze complex learning emotions via action units. However, different studies associate different action units with learning emotions, so this thesis set out to identify the correlations between learning emotions and action units. First, a Learning Emotion Image Database and operational definitions were established, and learning emotions and action units were labeled according to those definitions to serve as the training data. Then three feature selection methods, embedded (decision tree), wrapper (GA+SVM), and filter (the ReliefF algorithm), were used to find the correlations between learning emotions and action units. Under binary classification, the correlation between each emotion and the action units was examined, yielding the AU combinations common to all three feature selection methods. Compared with other studies, although single action units correlate significantly with learning emotions, AU combinations recognize learning emotions more accurately. Under multiclass classification, the accuracy of learning emotion recognition was examined; the results show that, at the same accuracy, GA+SVM yields the smallest feature combination. The decision tree selects more AUs, which suggests possible overfitting, while the AU combination produced by ReliefF+SVM differs markedly from the other two, because the ReliefF algorithm ignores correlations among features and computes only the statistical correlation between each feature and the target class, so it cannot effectively remove redundant features. To make the approach usable in real classrooms, this thesis builds an action unit recognition model for real-time recognition of learning emotions, using facial feature values as the input vector to a random forest classifier that predicts action units. After cross-training and testing with the Learning Emotion Image Database and the CK+ database, the results show that, of the 15 action units used in the learning emotion model, most generalize well and the model discriminates well, while a few action units generalize poorly. Finally, the action unit recognition model extracts action units, and the learning emotion recognition model recognizes the learning emotions learners display during learning, achieving real-time recognition of learning emotions.zh_TW
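The "facial feature value" step the abstract describes (landmark coordinates turned into scalar inputs for the action unit classifier) can be sketched as follows. This is a minimal illustration, not the thesis's code: the two example features, the normalization choice, and the use of dlib's standard 68-point landmark indexing are my assumptions.

```python
import numpy as np

# dlib's iBUG 68-point scheme: points 36-41 = right eye, 42-47 = left eye,
# 17-21 = right eyebrow, 60-67 = inner lip contour.
def feature_values(landmarks: np.ndarray) -> np.ndarray:
    """Turn a (68, 2) array of (x, y) landmark coordinates into scalar
    feature values. Both features are illustrative examples only."""
    right_eye = landmarks[36:42].mean(axis=0)
    left_eye = landmarks[42:48].mean(axis=0)
    # Normalize by inter-ocular distance so features are scale-invariant
    # (faces at different distances from the camera stay comparable).
    iod = np.linalg.norm(left_eye - right_eye)
    # Example feature 1: eyebrow-to-eye distance (grows when brows raise,
    # as in AU1/AU2).
    brow_eye = np.linalg.norm(landmarks[19] - right_eye) / iod
    # Example feature 2: inner-lip gap (grows when the mouth opens,
    # as in AU25/AU26).
    mouth_open = np.linalg.norm(landmarks[62] - landmarks[66]) / iod
    return np.array([brow_eye, mouth_open])
```

A vector of such values per frame is what would be fed to the random forest as described in the abstract; because of the inter-ocular normalization, scaling all landmarks by a constant leaves the features unchanged.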
dc.description.abstractEmotion is the psychological state with which humans respond to events. Psychological states are closely related to physical states and are expressed as physiological signals, so researchers judge mental states by capturing and analyzing those signals. The classification of emotions is diverse, ranging from basic emotions, which are innate and universal, to complex, highly interactive emotions acquired from social situations, and scholars have studied both. Among the various contexts, learning-related emotions receive particular attention. Emotions affect student learning, and learning effectiveness in turn is reflected in students' emotions. This thesis focuses on the latter: the emotional states students display during the learning process, which we call learning emotions. Research on learning emotions usually starts from the basic emotions, because they are intuitive and easy to define. Ekman analyzed the facial muscle movements with which people around the world express basic emotions, defined them as action units (AUs), and explained their connection to the basic emotions. Having understood basic emotions through the action unit coding system, researchers then sought to analyze complex learning emotions through action units. However, different studies report different action units for the same learning emotions, so we set out to determine the correlation between learning emotions and action units. First, we built a database named the Learning Emotion Image Database together with operational definitions, and labeled learning emotions and action units according to those definitions to form the training data. Then three feature selection methods, an embedded method (decision tree), a wrapper method (GA+SVM), and a filter method (the ReliefF algorithm), were used to find the relationship between action units and learning emotions. Under binary classification we examined the correlation between individual emotions and action units. The results show the AU combinations common to the three feature selection methods. Compared with other studies, although single action units correlate significantly with learning emotions, AU combinations identify learning emotions more accurately. Under multiclass classification we examined the accuracy of learning emotion recognition. The results show that, at the same accuracy, GA+SVM yields the smallest AU combination. The decision tree selects more AUs, which suggests possible overfitting. The AU combination produced by the ReliefF algorithm differs from those of the other two methods because ReliefF does not consider correlations among features; it computes only the statistical correlation between each feature and the target class, and therefore cannot effectively remove redundant features. To apply the approach in a real teaching setting, we built an action unit recognition model to achieve real-time recognition of learning emotions: a random forest classifier takes facial feature values as input vectors and predicts action units. After cross-training and testing with the Learning Emotion Image Database and the CK+ database, the results show that, of the 15 action units used in the learning emotion model, most generalize well and the model discriminates well, while a few action units generalize poorly. Finally, the action unit recognition model extracts action units, and the learning emotion recognition model identifies the learning emotions a learner produces during the learning process, achieving real-time recognition of learning emotions.en_US
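The filter method discussed above, and its stated weakness, can be illustrated with a simplified sketch of the Relief idea (the binary-class predecessor of the ReliefF algorithm used in the thesis). This is not the thesis's implementation; the synthetic data and weighting details are assumptions for illustration.

```python
import numpy as np

def relief_weights(X, y, n_iter=200, seed=0):
    """Basic Relief feature weights for a binary-class problem.

    For each sampled instance, find its nearest hit (same class) and
    nearest miss (other class); a feature's weight grows when it
    separates the classes and shrinks when it varies within a class.
    Note each feature is scored independently, which is exactly the
    limitation noted above: redundant features are not removed.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    span = X.max(axis=0) - X.min(axis=0)  # per-feature normalization
    span[span == 0] = 1.0
    for i in rng.integers(0, n, size=n_iter):
        dist = np.abs(X - X[i]).sum(axis=1)  # L1 distance to instance i
        dist[i] = np.inf                      # exclude the instance itself
        same = y == y[i]
        hit = np.argmin(np.where(same, dist, np.inf))
        miss = np.argmin(np.where(~same, dist, np.inf))
        w += (np.abs(X[i] - X[miss]) - np.abs(X[i] - X[hit])) / span
    return w / n_iter

# Synthetic stand-in for AU-style features: feature 0 separates the two
# "emotion" classes, features 1-4 are pure noise, so feature 0 should
# receive the largest Relief weight.
rng = np.random.default_rng(1)
y = rng.integers(0, 2, size=300)
X = rng.normal(size=(300, 5))
X[:, 0] += 3.0 * y  # the one informative feature
w = relief_weights(X, y)
```

Because the update touches each feature's weight separately, two perfectly correlated informative features would both score highly, which is why the thesis finds ReliefF unable to discard redundant AUs that a wrapper like GA+SVM prunes away.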
dc.description.tableofcontentsTable of Contents
Abstract (Chinese) / Abstract (English) / Table of Contents / List of Tables / List of Figures
Chapter 1 Introduction / 1.1 Research Background / 1.2 Research Motivation and Purpose
Chapter 2 Literature Review / 2.1 Emotion / 2.1.1 Facial Action Coding System / 2.1.2 Learning Emotions / 2.2 Affective Computing / 2.2.1 Affective Computing Methods / 2.2.2 Applying Affective Computing to E-Learning / 2.3 Feature Engineering / 2.3.1 Embedded Methods / 2.3.2 Wrapper Methods / 2.3.3 Filter Methods / 2.4 The Extended Cohn-Kanade Dataset (CK+)
Chapter 3 Research Methods / 3.1 Correlation between Learning Emotions and Action Units / 3.1.1 Building the Learning Emotion Image Database / 3.1.2 Correlation between Learning Emotions and Action Units / 3.2 Building the Action Unit Recognition Model / 3.2.1 Facial Feature Extraction / 3.2.2 Facial Feature Value Computation / 3.2.3 Feature Regionalization / 3.2.4 Action Unit Recognition Model / 3.3 Building the Learning Emotion Recognition Model
Chapter 4 Results / 4.1 Correlation between Learning Emotions and Action Units / 4.1.1 Embedded (Decision Tree) / 4.1.2 Wrapper (Genetic Algorithm with SVM) / 4.1.3 Filter (ReliefF) / 4.1.4 Comparison of the Three Feature Selection Methods / 4.2 Building the Action Unit Recognition Model / 4.2.1 Random Forest / 4.2.2 Genetic Algorithm with Random Forest / 4.3 Building the Learning Emotion Recognition Model / 4.3.1 Embedded (Decision Tree) / 4.3.2 Wrapper (Genetic Algorithm with SVM) / 4.3.3 Filter (ReliefF) / 4.3.4 Comparison of the Three Feature Selection Methods / 4.4 Discussion
Chapter 5 Conclusions and Suggestions / 5.1 Results and Discussion / 5.2 Future Research Directions
Referenceszh_TW
dc.subjectLearning emotionsen_US
dc.subjectAction Unitsen_US
dc.subjectFeature Selectionen_US
dc.titleApplying Facial Action Units and Feature Selection Methods to Develop the Learning Emotion Image Database and Recognition Modelen_US
dc.typethesis and dissertationen_US
item.openairetypethesis and dissertation-
item.fulltextwith fulltext-
Appears in Collections: Department of Information Management
Files in This Item:
File | Size | Format
nchu-107-7105029004-1.pdf | 1.69 MB | Adobe PDF (full text available only on the university internal network)