Project/Area Number |
12650249
|
Research Category |
Grant-in-Aid for Scientific Research (C)
|
Allocation Type | Single-year Grants |
Section | General |
Research Field |
Intelligent mechanics/Mechanical systems
|
Research Institution | Saitama University |
Principal Investigator |
KUNO Yoshinori, Saitama University, Faculty of Engineering, Professor (10252595)
|
Co-Investigator(Kenkyū-buntansha) |
SHIMADA Nobutaka, Osaka University, Graduate School of Engineering, Research Associate (10294034)
NAKAMURA Akio, Saitama University, Faculty of Engineering, Research Associate (00334152)
|
Project Period (FY) |
2000 – 2001
|
Project Status |
Completed (Fiscal Year 2001)
|
Budget Amount |
¥3,500,000 (Direct Cost: ¥3,500,000)
Fiscal Year 2001: ¥2,000,000 (Direct Cost: ¥2,000,000)
Fiscal Year 2000: ¥1,500,000 (Direct Cost: ¥1,500,000)
|
Keywords | Computer vision / Human interface / Nonverbal communication / Gesture recognition / Gaze / Intention understanding / Motion analysis / Unintentional behavior / Unconscious behavior / Virtual reality / Tracking |
Research Abstract |
Unintentional nonverbal behaviors play an important role in human-human communication. In human interfaces, however, which can be considered a form of human-machine communication, only intentional nonverbal behaviors have been used. We therefore propose using unintentional nonverbal behaviors, in addition to intentional ones, to realize user-friendly human interfaces. In this project, we investigated the framework required to use unintentional behaviors effectively and developed two application systems that exploit such behaviors to demonstrate the usefulness of our approach.

As to the framework, we devised a human-machine interaction method that uses gestures and verbal communication to recover from recognition failures, which cannot be avoided when recognizing unintentional behaviors by computer vision.

The first application system is an intelligent wheelchair that changes how it avoids an approaching pedestrian according to observed head movements. If the pedestrian's head often turns toward the wheelchair, the pedestrian can be expected to have noticed it; otherwise, he or she may not have. In the latter case, the wheelchair steps aside to avoid a collision. In the former, the wheelchair keeps observing the pedestrian, because a pedestrian who has noticed it usually starts avoiding it once it comes close. The pedestrian does not turn the head intentionally to show that he or she has noticed the wheelchair; nevertheless, this behavior gives the wheelchair valuable information for choosing an avoidance method.

The second application is a browser system that enables users to find necessary information quickly. Multiple windows containing information appear from the bottom of the display and move upward swiftly. The user moves his or her eyes to read the information in the windows, and the speed of these eye movements controls the windows' upward speed, so that the user can obtain information from the system as fast as possible. Although the user does not intend to move the eyes to control the speed, this unintentional action helps realize a user-friendly human interface.
|
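The wheelchair's avoidance decision described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names, the per-frame head-direction representation, and the frequency threshold are all assumptions.

```python
def pedestrian_noticed(head_toward_wheelchair, threshold=0.3):
    """Estimate whether the pedestrian has noticed the wheelchair.

    head_toward_wheelchair: per-frame booleans from head-movement
    observation, True when the head faces the wheelchair.
    Returns True when the head turns toward the wheelchair often enough
    (the 0.3 threshold is a placeholder, not a value from the project).
    """
    if not head_toward_wheelchair:
        return False
    return sum(head_toward_wheelchair) / len(head_toward_wheelchair) >= threshold


def avoidance_action(head_toward_wheelchair):
    """Choose the wheelchair's behavior toward an approaching pedestrian."""
    if pedestrian_noticed(head_toward_wheelchair):
        # Pedestrian likely noticed the wheelchair and will yield,
        # so the wheelchair keeps observing instead of moving aside.
        return "keep_observing"
    # Pedestrian may not have noticed; step aside to avoid collision.
    return "step_aside"
```

With no head turns observed, the wheelchair conservatively steps aside; with frequent turns, it defers to the pedestrian and continues observing.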
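The browser system's control loop, where measured eye-movement speed drives the windows' upward speed, could look like the hypothetical mapping below. The linear gain, base speed, and cap are illustrative assumptions; the project does not specify the control law.

```python
def window_scroll_speed(eye_speed, base_speed=100.0, gain=0.5, max_speed=400.0):
    """Map measured eye-movement speed (e.g. pixels/s) to the windows'
    upward scrolling speed.

    Faster eye movement suggests the user is keeping up with the content,
    so the windows may move faster; slower eye movement slows them down.
    The speed is capped so windows never become unreadable.
    """
    return min(max_speed, base_speed + gain * eye_speed)
```

The user never consciously "sets" the speed; the unintentional eye motion itself is the control signal, which is the point the abstract makes.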