Project/Area Number | 11650272
Research Category | Grant-in-Aid for Scientific Research (C)
Allocation Type | Single-year Grants
Section | General
Research Field | Intelligent mechanics/Mechanical systems
Research Institution | Chuo University
Principal Investigator | SAKANE Shigeyuki, Chuo University, Faculty of Science and Engineering, Professor (10276694)
Project Period (FY) | 1999 – 2001
Project Status | Completed (Fiscal Year 2001)
Budget Amount | ¥3,600,000 (Direct Cost: ¥3,600,000)
  Fiscal Year 2001: ¥800,000 (Direct Cost: ¥800,000)
  Fiscal Year 2000: ¥1,100,000 (Direct Cost: ¥1,100,000)
  Fiscal Year 1999: ¥1,700,000 (Direct Cost: ¥1,700,000)
Keywords | human-robot interface / augmented reality / hand pointer / projector / hand tracking / annotation / visual tracking / occlusion / digital desk / gesture recognition / task support / tracking vision / robot task teaching
Research Abstract |
We have developed a human-robot interface system, PARTNER, that exploits the flexibility of the Augmented Reality approach. The prototype consists of a projector subsystem for information display and a real-time tracking vision subsystem for recognizing the operator's actions. In this project we developed two new functions: the Interactive Hand Pointer and the Adaptive Annotation Function.

The Interactive Hand Pointer is used for selecting objects or positions in the environment via the operator's hand gestures. The system visually tracks the operator's pointing hand and projects a mark at the indicated position using an LCD projector. Since the mark can be observed directly in the real workspace, without monitor displays or HMDs, the operator can easily correct the indicated position by moving the hand.

The Adaptive Annotation Function was developed for teaching and assisting the human's task. Based on a state transition diagram, the system generates annotations adapted to the situation by monitoring not only changes in the object's geometry but also the operator's actions required to achieve the task. Experimental results on an unfolding task with a portable OHP device demonstrate the usefulness of the proposed system.

Moreover, we have developed a view-based visual tracking system that copes both with changes in the appearance of the template in a 3D environment and with occlusion, which is a problem when humans and robots cooperatively handle objects. We proposed a "tessellated template method" that detects occlusion by evaluating the correlation errors of the individual tiles of the tessellated template. Experiments on a hand-over action between a human and a robot demonstrate the usefulness of the visual tracking system.
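The abstract does not give implementation details for the Interactive Hand Pointer, but the camera-to-projector mapping it implies is commonly realized with a planar homography in desk-type camera-projector systems. The following Python/OpenCV sketch illustrates that idea; the calibration points, resolutions, and the project_mark helper are hypothetical, not PARTNER's actual code.

    import cv2
    import numpy as np

    # Hypothetical calibration: four reference points on the worktable as seen
    # by the camera, and the projector pixels that illuminate those same points.
    # A planar homography between the two frames is a standard assumption for
    # desk-type camera-projector systems; PARTNER's calibration is not described.
    camera_pts = np.array([[80, 60], [560, 55], [570, 420], [75, 430]], np.float32)
    projector_pts = np.array([[0, 0], [1024, 0], [1024, 768], [0, 768]], np.float32)
    H, _ = cv2.findHomography(camera_pts, projector_pts)

    def project_mark(fingertip_cam, canvas):
        """Map a tracked fingertip (camera pixels) to projector pixels and
        draw the pointer mark there, so the operator sees it on the table."""
        p = cv2.perspectiveTransform(
            np.array([[fingertip_cam]], np.float32), H)[0, 0]
        cv2.circle(canvas, (int(p[0]), int(p[1])), 12, (0, 0, 255), 2)
        cv2.drawMarker(canvas, (int(p[0]), int(p[1])), (0, 0, 255),
                       cv2.MARKER_CROSS, 24, 2)
        return canvas

    # Usage: one frame of the display loop; the fingertip position would come
    # from the hand-tracking subsystem (stubbed here with a fixed point).
    canvas = np.zeros((768, 1024, 3), np.uint8)
    canvas = project_mark((312.0, 240.0), canvas)
    # cv2.imshow("projector", canvas)  # shown full-screen on the LCD projector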
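A minimal sketch of how a state transition diagram can drive situation-adaptive annotations, in the spirit of the Adaptive Annotation Function. The state names, events, and annotation texts for the portable-OHP unfolding task are invented for illustration; the abstract only states that transitions are triggered by observed changes in object geometry and by the operator's actions.

    # state -> (annotation to project, {observed event -> next state})
    DIAGRAM = {
        "folded":    ("Release the latch on the front edge.",
                      {"latch_released": "unlatched"}),
        "unlatched": ("Lift the arm until it stands upright.",
                      {"arm_raised": "arm_up"}),
        "arm_up":    ("Unfold the mirror head toward the screen.",
                      {"mirror_unfolded": "ready"}),
        "ready":     ("Setup complete.", {}),
    }

    def run_annotations(event_stream, state="folded"):
        """Project the annotation for the current state; advance the state
        whenever the vision subsystem reports a matching event."""
        print("annotate:", DIAGRAM[state][0])
        for event in event_stream:
            next_state = DIAGRAM[state][1].get(event)
            if next_state is None:
                continue  # event irrelevant in this state; keep the annotation
            state = next_state
            print("annotate:", DIAGRAM[state][0])
        return state

    # Usage: events in the order the tracking subsystem might report them.
    run_annotations(["latch_released", "arm_raised", "mirror_unfolded"])

Encoding the task this way keeps the annotation logic declarative: adding a step to the task means adding one entry to the diagram, not changing the monitoring loop.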
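The tessellated template method is described only at a high level: occlusion is detected from correlation errors evaluated per tile of a tessellated template. The sketch below implements one plausible reading using a normalized SSD error per tile; the grid size, threshold, and aggregation over visible tiles are assumptions, not the published formulation.

    import numpy as np

    def tessellated_match_error(template, patch, grid=4, occl_thresh=0.15):
        """Evaluate a candidate match with the template split into grid x grid
        tiles.  Tiles whose normalized SSD error exceeds occl_thresh are marked
        occluded and excluded from the aggregate error, so a partially hidden
        object can still be tracked.  grid and occl_thresh are illustrative."""
        h, w = template.shape
        th, tw = h // grid, w // grid
        errors = np.zeros((grid, grid))
        mask = np.zeros((grid, grid), bool)
        for i in range(grid):
            for j in range(grid):
                t = template[i*th:(i+1)*th, j*tw:(j+1)*tw].astype(np.float64)
                p = patch[i*th:(i+1)*th, j*tw:(j+1)*tw].astype(np.float64)
                errors[i, j] = np.mean((t - p) ** 2) / 255.0 ** 2  # normalized SSD
                mask[i, j] = errors[i, j] > occl_thresh  # True = likely occluded
        visible = ~mask
        if not visible.any():
            return np.inf, mask  # fully occluded: reject this candidate
        return errors[visible].mean(), mask

    # Usage: compare the stored template with the image patch at the predicted
    # position; the returned mask tells the tracker which tiles are occluded
    # (e.g., by the operator's hand during a hand-over).
    template = np.random.randint(0, 256, (64, 64), np.uint8)
    patch = template.copy()
    patch[:32, :32] = 0  # simulate the upper-left quarter being occluded
    err, occluded = tessellated_match_error(template, patch)
    print(f"matching error on visible tiles: {err:.4f}")
    print("occluded tiles:\n", occluded)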