Project/Area Number | 11650273 |
Research Category | Grant-in-Aid for Scientific Research (C) |
Allocation Type | Single-year Grants |
Section | General |
Research Field | Intelligent mechanics / Mechanical systems |
Research Institution | Hosei University |
Principal Investigator | KOBAYASHI Hisato, Hosei University, Faculty of Engineering, Professor (30114820) |
Co-Investigator (Kenkyū-buntansha) | NAKAMURA Hideo, Hosei University, Faculty of Engineering, Research Assistant (10061201) |
Project Period (FY) | 1999 – 2000 |
Project Status | Completed (Fiscal Year 2000) |
Budget Amount | ¥3,100,000 (Direct Cost: ¥3,100,000) |
Fiscal Year 2000: ¥800,000 (Direct Cost: ¥800,000)
Fiscal Year 1999: ¥2,300,000 (Direct Cost: ¥2,300,000)
Keywords | Tele-Operation / Intention Detection / Verbal Communication / Master-Slave / Communication Delay / Motion |
Research Abstract |
This study aimed to build a human-friendly interface for tele-operated robots. The situation considered is as follows: non-Japanese operators control robots located in Japan from overseas in order to provide care for Japanese elderly people. Because such operators have a completely different cultural background from the Japanese users, a highly user-friendly interface is essential to realize this kind of tele-caring. The study took two approaches: 1) verbal communication, since simple verbal sentences pose few interpretation problems, we tried to control the remote robot through simple everyday verbal communication; and 2) physical motion, since physical motion itself carries meaning, we tried to extract the operator's intention from his or her motion.

In the first approach, we treat the nouns in a sentence as the most important items. Instead of speaking a full sentence, the operator uses a computer mouse to indicate on the screen the objects (nouns) that he or she intended to mention. The developed interface then generates several candidate robot-operation commands that might match the operator's intention, and the desired task is executed once the operator chooses one of them. The study also developed a practical way to manipulate the object indicated by the remote operator. The crucial points are determining the object's actual location and how to handle it; a machine-vision technique and a database technique resolved these problems.

In the second approach, we succeeded in identifying the operator's basic intention from arm movements. Namely, we could distinguish three different intentions contained in an arm motion: touch, hit, or push. This approach is quite robust against delays in the communication channel.
|
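The candidate-generation step of the first approach can be sketched as follows. The object names, the object-to-action database, and the function names are invented for illustration; the record does not specify the actual vocabulary or data structures used in the project.

```python
# Hypothetical sketch of the first approach: the operator clicks an
# object (a noun) on the screen, the interface proposes candidate
# robot-operation commands for that object, and the operator picks one.
# The object/action database below is an invented illustration.

OBJECT_ACTIONS = {
    "cup":     ["bring the cup", "fill the cup", "put the cup away"],
    "door":    ["open the door", "close the door"],
    "blanket": ["spread the blanket", "fold the blanket"],
}

def candidate_commands(clicked_object):
    """Return candidate robot commands for the object the operator indicated."""
    return OBJECT_ACTIONS.get(clicked_object, [])

def execute(clicked_object, choice_index):
    """Run the command the operator selected from the candidate list."""
    candidates = candidate_commands(clicked_object)
    if not 0 <= choice_index < len(candidates):
        raise ValueError("no such candidate")
    return f"executing: {candidates[choice_index]}"
```

Clicking "door" would yield two candidates, and choosing the first would execute "open the door"; an unknown object simply produces an empty candidate list.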
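The second approach, distinguishing touch, hit, and push in an arm motion, can be sketched with simple features of a recorded hand-speed profile. The feature choices (peak speed, duration of sustained motion) and the thresholds are assumptions for illustration, not the method actually used in the study; the point is that classifying a complete recorded profile is unaffected by how late the profile arrives, which is why such an approach tolerates communication delay.

```python
# Hypothetical sketch of the second approach: classify an operator's
# arm motion as "touch", "hit", or "push" from sampled hand speeds.
# Features and thresholds are assumed for illustration only.

def classify_intention(speeds, dt=0.01, hit_speed=1.0, push_duration=0.5):
    """Classify a motion profile as 'hit', 'push', or 'touch'.

    speeds: hand-speed samples in m/s; dt: sample period in seconds.
    A fast peak suggests a hit; long sustained motion suggests a push;
    otherwise the motion is taken as a gentle touch.
    """
    peak = max(speeds)
    # Total time the hand moved faster than a small noise threshold.
    moving_time = sum(1 for v in speeds if v > 0.05) * dt
    if peak > hit_speed:
        return "hit"
    if moving_time > push_duration:
        return "push"
    return "touch"
```

Because the decision depends only on the shape of the profile, a delayed but intact transmission yields the same classification, consistent with the abstract's claim of robustness against channel delay.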