Project/Area Number | 23K16925 |
Research Category | Grant-in-Aid for Early-Career Scientists |
Allocation Type | Multi-year Fund |
Review Section | Basic Section 61020: Human interface and interaction-related |
Research Institution | Japan Advanced Institute of Science and Technology |
Principal Investigator |
Project Period (FY) | 2023-04-01 – 2026-03-31 |
Project Status | Granted (Fiscal Year 2023) |
Budget Amount | ¥4,550,000 (Direct Cost: ¥3,500,000, Indirect Cost: ¥1,050,000)
Fiscal Year 2025: ¥1,170,000 (Direct Cost: ¥900,000, Indirect Cost: ¥270,000)
Fiscal Year 2024: ¥1,170,000 (Direct Cost: ¥900,000, Indirect Cost: ¥270,000)
Fiscal Year 2023: ¥2,210,000 (Direct Cost: ¥1,700,000, Indirect Cost: ¥510,000) |
Keywords | Emotional expression / Cross culture / Facial expression / Affective computing / Perception / Image processing / Computer vision / Image analysis / Faces and gestures / Interaction |
Outline of Research at the Start |
This research will analyse visual images of facial expressions and gestures to understand emotional expression in different cultures. A framework will be developed to represent cultural interactions and interdependencies, exploring how emotion and emotional expression are defined across cultures.
Outline of Annual Research Achievements |
To understand the psychology of emotional expression in different cultures, we collected facial expression images from media in Japan and Thailand, created a cross-cultural facial expression dataset, and computationally demonstrated differences in the facial expressions of people from different cultures. We then evaluated state-of-the-art facial expression recognition systems on our cross-cultural data and found that these systems are biased toward certain emotions for both Thai and Japanese samples. To examine whether humans show a similar bias, we also conducted a subjective evaluation of human subjects' perception of these cross-cultural facial expression images. Because the number of images collected in our dataset is insufficient for deep learning-based methods, we also investigated generative models for synthesizing facial expressions of emotion, for later use in deep learning-based emotional expression analysis. In addition to facial expression features, we explored the potential of using human gait features. As part of our international collaboration efforts, we established collaborations with researchers in Thailand through several meetings, including visits to Mahidol University (Faculty of Engineering, and the Department of Psychiatry, Faculty of Medicine, Siriraj Hospital) and Thammasat University. We also organized the Special Session on Next Generation of Affective Computing (NGAC) at IEEE TENCON 2023 to expand our network to more countries in Southeast Asia.
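As an illustration of the kind of per-emotion bias measurement described above, the following is a minimal sketch, not the project's actual code: the seven-class label set, the JP/TH arrays, and the random dummy predictions are hypothetical placeholders standing in for the output of a pretrained facial expression recognition model on the two cultural subsets.

import numpy as np
from sklearn.metrics import recall_score

# Hypothetical 7-class emotion label set (placeholder, not the dataset's actual labels).
EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise", "neutral"]

def per_emotion_recall(y_true, y_pred):
    # Per-class recall; a class absent from y_true yields 0 instead of an error.
    return recall_score(y_true, y_pred, labels=list(range(len(EMOTIONS))),
                        average=None, zero_division=0)

# In practice, y_pred_* would come from running a pretrained FER model on the
# Japanese and Thai subsets; random dummy labels are used here so the sketch runs.
rng = np.random.default_rng(0)
y_true_jp, y_pred_jp = rng.integers(0, 7, 300), rng.integers(0, 7, 300)
y_true_th, y_pred_th = rng.integers(0, 7, 300), rng.integers(0, 7, 300)

gap = per_emotion_recall(y_true_jp, y_pred_jp) - per_emotion_recall(y_true_th, y_pred_th)
for emotion, g in zip(EMOTIONS, gap):
    # A large |gap| for an emotion suggests the recognizer favors one culture's
    # expressions of that emotion over the other's.
    print(f"{emotion:>9}: recall gap (JP - TH) = {g:+.3f}")

Comparing per-class recall rather than overall accuracy is what exposes emotion-specific bias: a system can score well on average while systematically misreading, say, surprise in one culture's samples.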
Current Status of Research Progress |
2: Research has progressed on the whole more than it was originally planned.
Reason
Research on the cross-cultural analysis of emotional expressions, with a focus on facial expressions, has progressed well. We have published work related to this research in two international journal papers and three international conference papers, and given two presentations at domestic workshops. In terms of international collaboration, we have established relationships with overseas researchers in Thailand, Vietnam, and Malaysia, with the potential to expand these collaborations in the future.
Strategy for Future Research Activity |
We plan to collect more data for deep learning-based analysis with our international partners in Thailand, and to broaden our sample to other countries in Southeast Asia, such as Vietnam and Malaysia. We are also exploring ways to extend our work to other types of features that represent emotional expression, such as gait and posture.