2004 Fiscal Year Final Research Report Summary
Research of Emotion and Intention Understanding Based on Human Action Analysis and Its Application to Man-machine Communication
Project/Area Number | 14208037 |
Research Category | Grant-in-Aid for Scientific Research (A) |
Allocation Type | Single-year Grants |
Section | General |
Research Field | Information systems (including information and library science) |
Research Institution | Kobe University (2003-2004), Ryukoku University (2002) |
Principal Investigator | ARIKI Yasuo, Kobe University, Research Center for Urban Safety and Security, Professor (10135519) |
Co-Investigator (Kenkyū-buntansha) |
UEHARA Kuniaki, Kobe University, Graduate School of Science & Technology, Professor (60160206)
KUMANO Masahito, Ryukoku University, Faculty of Science and Technology, Assistant (50319498)
|
Project Period (FY) | 2002 – 2004 |
Keywords | Data mining / Image recognition / Speech recognition / Emotion / Communication / Action analysis / Intention understanding / Motion capture |
Research Abstract |
In order to realize machines that coexist symbiotically with human beings, we studied a communication method that focuses on habitual body actions and inner emotions, based on data mining, image recognition, speech recognition, and emotion information processing.

1. Rule discovery from motion data and qualitative analysis of emotion (Uehara). When a large amount of motion data is stored, as in a digital archive, the data are difficult to retrieve unless the human actions are organized or structured by content. To solve this problem, we represented motion data as loci in three-dimensional space and analyzed motion speed and posture. Through this analysis we compared motion skills and identified the features characteristic of highly skilled motion. Using action data captured with the motion capture system Eva, we extracted both the intrinsic features of each motion and the features that distinguish it from other motions.

2. Intention understanding by integrating human action, emotion, speech, and image recognition (Ariki, Kumano). By integrating action, speech, image, and emotion recognition, intentions can be analyzed in terms of how they are expressed through these modalities. From this standpoint, we analyzed intention based chiefly on action and emotion. For example, when a user asks who a person is by speaking while pointing at that person's face displayed on a large screen in a room such as a classroom, the system integrates speech recognition, pointing-action recognition, and face recognition to answer. We also analyzed emotion to estimate the user's intention.
|
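As an illustrative sketch of the trajectory analysis described in item 1 (not the project's actual implementation), motion-capture data for a single marker can be treated as a locus in three-dimensional space, with per-frame speed computed from successive positions; the function name, frame rate, and toy trajectory below are assumptions:

```python
import math

def frame_speeds(trajectory, fps=60):
    """Per-frame speed (units/s) of one marker's 3-D trajectory.

    trajectory: list of (x, y, z) positions, one per captured frame.
    Speed between consecutive frames is Euclidean distance times fps.
    """
    speeds = []
    for (x0, y0, z0), (x1, y1, z1) in zip(trajectory, trajectory[1:]):
        dist = math.sqrt((x1 - x0) ** 2 + (y1 - y0) ** 2 + (z1 - z0) ** 2)
        speeds.append(dist * fps)
    return speeds

# Toy locus: a marker moving 0.01 units per frame along the x-axis,
# so each per-frame speed is about 0.6 units/s at 60 fps.
traj = [(0.01 * i, 0.0, 0.0) for i in range(5)]
print(frame_speeds(traj))
```

A speed profile like this, computed per marker, is one simple way to compare skilled and unskilled performances of the same motion, as the report describes at a higher level.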
Research Products (39 results)