2016 Fiscal Year Final Research Report
Mental state estimation method from conflict between text and tone of voice
Project/Area Number | 26330313
Research Category | Grant-in-Aid for Scientific Research (C)
Allocation Type | Multi-year Fund
Section | General
Research Field | Kansei informatics
Research Institution | Hiroshima City University
Principal Investigator | MERA Kazuya, Hiroshima City University, Graduate School of Information Sciences, Assistant Professor (50285425)
Research Collaborator | Tang Ba Nhat / TAKAHASHI Takumi / UEMURA Joji / SHINMASU Yuta / TABUCHI Yuma / SUGIHARA Minoru / TANI Yuki / MURATA Yui / OKADA Atsushi
Project Period (FY) | 2014-04-01 – 2017-03-31
Keywords | Emotion estimation / Acoustic analysis / Multimodal
Outline of Final Research Achievements | We developed emotion estimation methods for users' utterances based on acoustic features and text information, and constructed a real-time emotion estimation system that estimates emotion from three modalities (voice, text, and facial expression). We also analyzed the tendencies of expressed emotions in complex mental states such as irony. Furthermore, we constructed an emotional voice database for machine learning to develop our acoustic-feature-based emotion estimation method.
Free Research Field | Affective information processing / Natural language processing / Acoustic analysis / Natural language dialogue systems
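To illustrate the kind of multimodal combination the achievements outline describes, the sketch below shows score-level ("late") fusion of per-modality emotion estimates. This is a minimal illustrative example, not the project's actual system: the emotion labels, the assumption that each modality yields a probability distribution over the same labels, and the fusion weights are all hypothetical.

```python
# Hypothetical emotion label set (not from the report).
EMOTIONS = ["joy", "anger", "sadness", "neutral"]

def fuse(voice, text, face, weights=(0.4, 0.3, 0.3)):
    """Weighted average of three per-modality probability vectors,
    renormalized so the fused scores sum to 1."""
    fused = [
        weights[0] * v + weights[1] * t + weights[2] * f
        for v, t, f in zip(voice, text, face)
    ]
    total = sum(fused)
    return [x / total for x in fused]

# Toy per-modality distributions over EMOTIONS (illustrative values).
voice = [0.6, 0.1, 0.1, 0.2]   # e.g. from acoustic features
text  = [0.2, 0.1, 0.2, 0.5]   # e.g. from utterance text
face  = [0.5, 0.1, 0.1, 0.3]   # e.g. from facial expression

probs = fuse(voice, text, face)
best = EMOTIONS[probs.index(max(probs))]
```

A fusion step of this shape also makes the report's "conflict between text and tone of voice" scenario concrete: when the text-based and voice-based distributions disagree, the fused distribution reflects both, and the disagreement itself can be detected by comparing the per-modality scores.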