2006 Fiscal Year Final Research Report Summary
Emotional communication between human and machine using motion of joints of body
Project/Area Number |
17500140
|
Research Category |
Grant-in-Aid for Scientific Research (C)
|
Allocation Type | Single-year Grants |
Section | General |
Research Field |
Sensitivity informatics/Soft computing
|
Research Institution | University of Toyama |
Principal Investigator |
ISHII Masahiro University of Toyama, Graduate School of Science and Engineering, Associate Professor (10272717)
|
Co-Investigator (Kenkyū-buntansha) |
SATO Makoto Tokyo Institute of Technology, Precision and Intelligence Laboratory, Professor (50114872)
KOIKE Yasuharu Tokyo Institute of Technology, Precision and Intelligence Laboratory, Associate Professor (10302978)
HASEGAWA Syoichi The University of Electro-Communications, Department of Mechanical Engineering and Intelligent Systems, Associate Professor (10323833)
MARUTA Hidenori Nagasaki University, Information Media Center, Research Associate (00363474)
|
Project Period (FY) |
2005 – 2006
|
Keywords | emotion / gait / biological motion |
Research Abstract |
The purpose of this research was to achieve emotional communication between humans and robots. Numerous studies have shown that facial and vocal expressions are effective for conveying and recognizing emotion between humans and robots. Instead of faces and voices, we used body movement as a cue to emotion. We obtained the following results.
First, we built a movie database for emotion research. A motion-capture system was used to record human gait information: twelve sensors were attached to each walker's principal joints, and actors were asked to express emotion while walking. The database includes gait information from 30 actors.
Second, we investigated human perception of emotion from gait displays. The gait data were presented as biological motion (point-light displays), and human subjects were asked to report the emotion they perceived. The rate of correct answers was 40%.
Third, we constructed an emotion-recognition system. We studied a feature-extraction method for discriminating human emotions from sensed gait patterns, under the assumption that the high-dimensional biological motion data are generated by low-dimensional features whose components are statistically independent. The extracted features were evaluated by how well the given biological motion data could be classified into five emotion categories: "Anger", "Grief", "Disgust", "Joy", and "Fear". We achieved 40% accuracy on this five-class emotion discrimination task using biological motion data from three actors.
|
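As a rough sketch of the recognition step described in the abstract, the example below extracts low-dimensional, statistically independent features from high-dimensional gait vectors with independent component analysis and then evaluates a five-class emotion classifier. The synthetic placeholder data, scikit-learn's FastICA, and the nearest-centroid classifier are all illustrative assumptions; the summary does not specify the project's actual algorithms or data format.

```python
# Illustrative sketch only: synthetic data, FastICA, and a
# nearest-centroid classifier are assumptions; the report does not
# specify the project's actual feature extractor or classifier.
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import NearestCentroid

EMOTIONS = ["Anger", "Grief", "Disgust", "Joy", "Fear"]

rng = np.random.default_rng(0)
n_trials, n_frames, n_joints = 150, 100, 12  # 12 sensors on principal joints

# Placeholder gait data: each trial is a sequence of (x, y, z) joint
# positions over n_frames frames, flattened into one high-dimensional vector.
X = rng.normal(size=(n_trials, n_frames * n_joints * 3))
y = rng.integers(0, len(EMOTIONS), size=n_trials)  # emotion label per trial

# Extract a small number of statistically independent components, matching
# the report's assumption of low-dimensional independent features.
ica = FastICA(n_components=10, max_iter=500, random_state=0)
features = ica.fit_transform(X)

# Evaluate five-class emotion discrimination on the extracted features.
clf = NearestCentroid()
scores = cross_val_score(clf, features, y, cv=5)
print(f"mean accuracy: {scores.mean():.2f}")  # chance level is 0.20
```

On real data, each gait sequence would first be aligned and resampled to a common length before flattening, so that corresponding dimensions carry comparable joint-trajectory information across trials.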
Research Products
(7 results)
[Book] だまされる脳 (The Deceived Brain), 2006
Author(s)
Edited by the VR Psychology Research Group, the Virtual Reality Society of Japan
Total Pages
200
Publisher
Kodansha
Description
From the "Summary of the Research Achievement Report (in Japanese)"