2023 Fiscal Year Final Research Report
Building Computational Framework Toward Functional Spectrum Analysis of Multimodal Nonverbal Behaviors
Project/Area Number | 21K12011 |
Research Category | Grant-in-Aid for Scientific Research (C) |
Allocation Type | Multi-year Fund |
Section | General |
Review Section | Basic Section 61030: Intelligent informatics-related |
Research Institution | Yokohama National University |
Principal Investigator | Otsuka Kazuhiro (Yokohama National University, Faculty of Engineering, Associate Professor) (30396197) |
Project Period (FY) | 2021-04-01 – 2024-03-31 |
Keywords | Dialogue / Nonverbal communication / Multimodal / Social signal processing / Machine learning |
Outline of Final Research Achievements |
Although nonverbal behaviors play important roles in human conversations, they have been difficult to analyze quantitatively because of their ambiguity and diversity. To enable such quantitative analysis, this research proposed a new concept, the "nonverbal functional spectrum," which represents the distribution of the perceived intensities of the functions carried by nonverbal behaviors, and constructed a unified computational framework around it. Focusing on head movements, facial expressions, eye gaze, and aizuchi (backchannel responses) as nonverbal modalities, we built machine-learning models that automatically estimate the occurrence of each function of each modality from observed information such as the interlocutors' facial and body movements and speech signals. We also developed a method for predicting and explaining interlocutors' subjective impressions based on the estimated functional spectrum.
|
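To make the idea concrete, the following is a minimal illustrative sketch of how a per-modality "functional spectrum" could be represented and fed into an impression predictor. The function inventories, feature dimensions, and the linear/softmax models below are assumptions chosen for clarity, not the models actually used in the project.

```python
# Hypothetical sketch of the "nonverbal functional spectrum" idea:
# each modality's observed behavior is mapped to a vector of perceived
# function intensities, and the concatenated spectra feed an impression
# predictor. All names, dimensions, and models here are illustrative
# assumptions, not the authors' actual method.

import numpy as np

# Illustrative function inventories per modality (assumed, not exhaustive).
FUNCTIONS = {
    "head":    ["nod_agreement", "emphasis", "turn_yielding"],
    "face":    ["smile_affiliation", "surprise", "concern"],
    "gaze":    ["addressing", "monitoring", "aversion"],
    "aizuchi": ["acknowledgement", "empathy", "prompting"],
}


def estimate_spectrum(features: np.ndarray, weights: np.ndarray) -> np.ndarray:
    """Map observed features (e.g., facial/body motion and speech descriptors)
    to a functional spectrum: nonnegative intensities that sum to 1."""
    scores = features @ weights            # linear scoring (assumed)
    exp = np.exp(scores - scores.max())
    return exp / exp.sum()                 # softmax -> distribution over functions


def predict_impression(spectra: dict, coef: np.ndarray, bias: float) -> float:
    """Predict one subjective-impression score from the concatenated
    per-modality spectra (a simple linear readout, for illustration)."""
    x = np.concatenate([spectra[m] for m in FUNCTIONS])
    return float(x @ coef + bias)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    feat_dim = 8                            # assumed observed-feature size

    # Toy per-modality estimators and one observation per modality.
    spectra = {}
    for modality, funcs in FUNCTIONS.items():
        W = rng.normal(size=(feat_dim, len(funcs)))  # stand-in for a trained model
        obs = rng.normal(size=feat_dim)              # stand-in for observed features
        spectra[modality] = estimate_spectrum(obs, W)
        print(modality, dict(zip(funcs, spectra[modality].round(2))))

    total_dim = sum(len(f) for f in FUNCTIONS.values())
    coef = rng.normal(size=total_dim)
    print("impression score:", round(predict_impression(spectra, coef, 0.0), 3))
```

The key design point the sketch tries to convey is that every modality is projected into the same kind of representation (a distribution of function intensities), so heterogeneous behaviors can be compared, combined, and used downstream in a uniform way.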
Free Research Field | Intelligent informatics |
Academic Significance and Societal Importance of the Research Achievements |
This research constructed a computational framework that can uniformly represent, analyze, and recognize the many functions that arise through multiple nonverbal modalities and play important roles in human communication. This framework is expected to contribute to quantitatively elucidating how people express and exchange their subtle, complex, and highly diverse emotions and intentions. In the future, it is also expected to serve as a foundational technology for building Artificial Social Intelligence that can empathize with, support, and collaborate with people in the real world.
|