2002 Fiscal Year Final Research Report Summary
Environment Understanding using Robot Body
Project/Area Number | 12308016 |
Research Category | Grant-in-Aid for Scientific Research (A) |
Allocation Type | Single-year Grants |
Section | General |
Research Field | Intelligent informatics |
Research Institution | Wakayama University (2002) Kyoto University (2000-2001) |
Principal Investigator | WADA Toshikazu, Wakayama University, Faculty of Systems Engineering, Professor (00231035) |
Co-Investigator(Kenkyū-buntansha) |
KAMEDA Yoshinari, Kyoto University, Academic Center for Computing and Media Studies, Research Associate (70283637)
NAKAMURA Takayuki, Wakayama University, Faculty of Systems Engineering, Associate Professor (50291969)
WU Haiyuan, Wakayama University, Faculty of Systems Engineering, Associate Professor (70283695)
|
Project Period (FY) | 2000 – 2002 |
Keywords | Color Target Detection / Nearest Neighbor Classifier / Camera Calibration / Homography from Circle / Environmental Map Generation / Discriminating Obstacles and Occluding Objects / Correlation between intended action and observed action / Behavior Learning |
Research Abstract |
In this research, we investigated the map generation problem for mobile robots. For map generation, we constructed a closed loop consisting of robot actions and visual feedback that verifies the resulting movement. This loop is affected by the interaction between the robot body and its surrounding environment. Based on this idea, we can generate an environmental map, i.e., a spatial description of the objects that interact with the robot body. This interaction is detected by finding inconsistencies between the intended action and the observed movement. For this research, we first built specially designed RC cars and their controllers, connected to a host PC that also processes the images captured by an external vision sensor. Next, in order to track the LEDs attached to the RC cars with the external vision sensor, we proposed a robust color target detection method and developed a prototype system, which is already in use at other institutes. To estimate the real motion of the robot on the floor from the observed images, we proposed a robot-body-guided camera calibration method. This method automatically estimates the camera parameters from the elliptical trajectory on the image plane corresponding to a circular robot motion, drawn by keeping a constant steering angle and a low speed. The method is especially effective when the observed area is wide. Based on the calibration result, the real robot movement can be estimated. Using these methods, we developed a map generation system that detects inconsistencies between intended actions and observed movements.
|
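To illustrate the color target detection step, the following minimal Python sketch classifies each pixel by its nearest labeled color sample, i.e., a nearest-neighbor classifier in RGB space. The function names, sample colors, and toy image are illustrative assumptions, not taken from the report or its prototype system.

```python
# Minimal nearest-neighbor color target detection sketch (illustrative only).
import numpy as np

def detect_target(image, samples, labels):
    """Label each pixel with the class of its nearest training color.

    image:   (H, W, 3) uint8 frame
    samples: (M, 3) float32 training colors
    labels:  (M,) uint8 class per sample (1 = target, 0 = background)
    """
    h, w, _ = image.shape
    pixels = image.reshape(-1, 3).astype(np.float32)
    # Squared Euclidean distance from every pixel to every training sample.
    d2 = ((pixels[:, None, :] - samples[None, :, :]) ** 2).sum(axis=2)
    nearest = d2.argmin(axis=1)
    return labels[nearest].reshape(h, w)

if __name__ == "__main__":
    # Toy data: two "LED red" samples and two background samples (assumed values).
    samples = np.array([[250, 40, 40], [220, 60, 50],
                        [90, 90, 90], [40, 160, 60]], dtype=np.float32)
    labels = np.array([1, 1, 0, 0], dtype=np.uint8)
    frame = np.zeros((4, 4, 3), dtype=np.uint8)
    frame[1, 2] = [240, 50, 45]          # one LED-colored pixel
    print(detect_target(frame, samples, labels))
```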
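The robot-body-guided calibration exploits the fact that a circular trajectory on the floor projects to an ellipse in the image plane. The sketch below covers only the conic-fitting step under that assumption; recovering the camera parameters from the fitted ellipse, as the report's method does, is not reproduced here, and the synthetic trajectory values are illustrative.

```python
# Least-squares conic fit to an observed image trajectory (illustrative sketch only).
import numpy as np

def fit_conic(points):
    """Fit a*x^2 + b*x*y + c*y^2 + d*x + e*y + f = 0 to 2-D points."""
    x, y = points[:, 0], points[:, 1]
    design = np.column_stack([x * x, x * y, y * y, x, y, np.ones_like(x)])
    # The conic coefficients (up to scale) span the null space of the design matrix.
    _, _, vt = np.linalg.svd(design)
    return vt[-1]

if __name__ == "__main__":
    # Synthetic image trajectory of a circular robot motion (assumed numbers).
    t = np.linspace(0.0, 2.0 * np.pi, 60)
    traj = np.column_stack([320 + 120 * np.cos(t), 240 + 70 * np.sin(t)])
    a, b, c, d, e, f = fit_conic(traj)
    residual = (a * traj[:, 0] ** 2 + b * traj[:, 0] * traj[:, 1]
                + c * traj[:, 1] ** 2 + d * traj[:, 0] + e * traj[:, 1] + f)
    print(abs(residual).max())           # close to zero for an exact ellipse
```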
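Finally, a minimal sketch of the map-update idea: when the commanded displacement and the displacement observed through the calibrated camera disagree, the cell the robot tried to reach is marked as an object interacting with the robot body. The grid resolution, tolerance, and function names are assumptions for illustration, not the report's actual system.

```python
# Occupancy update from intended-vs-observed motion inconsistency (sketch only).
import math
import numpy as np

CELL = 0.10   # assumed grid resolution in metres
TOL = 0.3     # assumed relative tolerance on the displacement mismatch

def inconsistent(intended, observed, tol=TOL):
    """True if the observed displacement disagrees with the commanded one."""
    err = math.hypot(intended[0] - observed[0], intended[1] - observed[1])
    return err > tol * max(math.hypot(*intended), 1e-6)

def update_map(grid, pose, intended, observed):
    """Mark the cell the robot intended (but failed) to reach as occupied."""
    if inconsistent(intended, observed):
        gx = int((pose[0] + intended[0]) / CELL)
        gy = int((pose[1] + intended[1]) / CELL)
        if 0 <= gx < grid.shape[0] and 0 <= gy < grid.shape[1]:
            grid[gx, gy] = 1
    return grid

if __name__ == "__main__":
    grid = np.zeros((50, 50), dtype=np.uint8)
    pose = (2.0, 2.0)            # robot position on the floor (metres)
    intended = (0.20, 0.0)       # commanded forward step
    observed = (0.02, 0.0)       # camera saw almost no motion -> obstacle
    print(update_map(grid, pose, intended, observed).sum())   # 1 cell marked
```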