2013 Fiscal Year Final Research Report
Emotion Recognition from Multimodal Information Based on Synesthesia
Project/Area Number | 24650083 |
Research Category | Grant-in-Aid for Challenging Exploratory Research |
Allocation Type | Single-year Grants |
Research Field | Perception information processing / Intelligent robotics |
Research Institution | Osaka University |
Principal Investigator | NAGAI Yukie, Osaka University, Graduate School of Engineering, Specially Appointed Associate Professor (full-time) (30571632) |
Project Period (FY) | 2012-04-01 – 2014-03-31 |
Keywords | Cognitive developmental robotics / Intelligent robotics / Cognitive development / Emotion / Synesthesia / Human-robot interaction / Multimodal |
Research Abstract | Inspired by findings in neuroscience and developmental studies, we proposed a computational model of emotional development based on multimodal perceptual information. A synesthetic mechanism enables a robot first to detect invariant features across multiple modalities and then, by abstracting these features with a probabilistic neural network, to extract emotions ranging from pleasure/displeasure to the six basic emotions (e.g., happiness, surprise, and anger). Comparative experiments verified our hypothesis that, among the multiple modalities, tactile information plays a leading role in emotional development. These results offer new insights into the neural mechanisms of emotional development and promising ideas for designing computational models for emotion recognition. |
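
The following is a minimal, illustrative sketch of the pipeline described in the abstract: per-modality features are fused into one multimodal vector and mapped to emotion categories by a probabilistic classifier. The report's model is a probabilistic neural network; here a Gaussian naive-Bayes classifier stands in for it, and all function names, feature dimensions, and data are hypothetical assumptions, not the project's actual implementation.

```python
import numpy as np

# Illustrative stand-in for the reported model: fused multimodal features are
# classified into six basic emotions with a simple Gaussian probabilistic
# classifier. Names, dimensions, and data are hypothetical.

EMOTIONS = ["happiness", "surprise", "anger", "sadness", "fear", "disgust"]
rng = np.random.default_rng(0)

def fuse_modalities(vision, audio, tactile):
    """Concatenate per-modality feature vectors into one multimodal vector."""
    return np.concatenate([vision, audio, tactile])

def fit_gaussian_classifier(X, y, n_classes):
    """Estimate per-class feature means, variances, and class priors."""
    means = np.array([X[y == c].mean(axis=0) for c in range(n_classes)])
    variances = np.array([X[y == c].var(axis=0) + 1e-6 for c in range(n_classes)])
    priors = np.array([(y == c).mean() for c in range(n_classes)])
    return means, variances, priors

def predict_proba(x, means, variances, priors):
    """Posterior probability of each emotion class for a fused feature vector."""
    log_lik = -0.5 * (((x - means) ** 2) / variances
                      + np.log(2 * np.pi * variances)).sum(axis=1)
    log_post = log_lik + np.log(priors)
    log_post -= log_post.max()          # numerical stability
    post = np.exp(log_post)
    return post / post.sum()

# Toy training data: 120 samples with 20 fused features and random labels.
X = rng.normal(size=(120, 20))
y = rng.integers(0, len(EMOTIONS), size=120)
means, variances, priors = fit_gaussian_classifier(X, y, len(EMOTIONS))

# Classify one fused observation (8 vision + 6 audio + 6 tactile features).
sample = fuse_modalities(rng.normal(size=8), rng.normal(size=6), rng.normal(size=6))
probs = predict_proba(sample, means, variances, priors)
print(dict(zip(EMOTIONS, probs.round(3))))
```

In this sketch the tactile features are simply concatenated with the others; testing the report's hypothesis about the leading role of touch would correspond to comparing classifiers trained with and without the tactile block of features.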