| Project/Area Number | 23K11271 |
| Research Category | Grant-in-Aid for Scientific Research (C) |
| Allocation Type | Multi-year Fund |
| Section | General |
| Review Section | Basic Section 61050: Intelligent robotics-related |
| Research Institution | Kyoto University |
| Principal Investigator | Brscic Drazen, Kyoto University, Graduate School of Informatics, Associate Professor (50605011) |
| Project Period (FY) | 2023-04-01 – 2026-03-31 |
| Project Status | Granted (Fiscal Year 2024) |
| Budget Amount | ¥4,680,000 (Direct Cost: ¥3,600,000, Indirect Cost: ¥1,080,000) |
| Fiscal Year 2025 | ¥1,430,000 (Direct Cost: ¥1,100,000, Indirect Cost: ¥330,000) |
| Fiscal Year 2024 | ¥1,300,000 (Direct Cost: ¥1,000,000, Indirect Cost: ¥300,000) |
| Fiscal Year 2023 | ¥1,950,000 (Direct Cost: ¥1,500,000, Indirect Cost: ¥450,000) |
| Keywords | Smell / Human-robot interaction / Nonverbal communication |
| Outline of Research at the Start |
The project aims to study a robot's use of smell during interaction with humans. The plan is to elucidate the appropriate ways to integrate smell within the interaction and the
|
| Outline of Annual Research Achievements |
In FY2024, we made substantial progress on our research into olfactory modalities in human-robot interaction. The main focus was on implementing and evaluating smell detection and reproduction functions in a teleoperated avatar robot. The system enabled operators to detect ambient smells in a remote environment and later reproduce selected smells during interaction, creating a shared olfactory context with co-present partners. Through a controlled user study using a dessert-ranking task, we demonstrated that these shared olfactory experiences enhance the perceived quality, fairness, and engagement of remote communication. These findings were published as a full paper at HRI 2025, the top conference in the field.
In parallel, the PI conducted a one-year research stay at Yale University (7/2023-6/2024). During this period, the PI collaborated with multiple faculty members on projects related to human-robot interaction, extending FY2023 results in new directions. These included studies on robot nudging through physical motion, robots offering timely help in collaborative tasks, and the use of small talk to improve user comfort and engagement. This international collaboration led to several manuscripts, currently under review. The PI also initiated a collaborative effort to author a textbook on human-robot interaction, further broadening the impact of the project.
|
| Current Status of Research Progress |
1: Research has progressed more than originally planned.
Reason
The planned goal for FY2024 was to investigate the integration of smell into robot dialogue, focusing on factors such as timing, repetition, and gestures. This objective was largely achieved through the development and evaluation of a teleoperated avatar with smell detection and reproduction, presented in a full paper at HRI 2025. The study examined how olfactory cues affect interaction quality, offering insights relevant to the planned modelling of delivery factors.
In addition, the PI's collaborative work during a research stay at Yale produced further contributions extending beyond the initial scope, with several related papers under review. While some aspects of the original modelling and comparative evaluation are ongoing, overall progress has exceeded the original expectations.
|
| Strategy for Future Research Activity |
In the next phase, we will study how robots can use smells in a perfume-like manner to support interaction by creating a pleasant atmosphere or influencing the user's mood. This aligns with our original plan to explore the hedonic meaning of smell: how certain scents can convey affect or emotional tone beyond their source. We will begin with exploratory user studies to understand the impact of such usage and work toward enabling mood expression via smell. The project is progressing largely as planned, with no major changes.
|