1993 Fiscal Year Final Research Report Summary
Real Time 3D Robot Vision System based on Local Correlation Methods
Project/Area Number | 04555055 |
Research Category | Grant-in-Aid for Developmental Scientific Research (B) |
Allocation Type | Single-year Grants |
Research Field | Dynamics/Control Engineering |
Research Institution | The University of Tokyo |
Principal Investigator | INOUE Hirochika, The University of Tokyo, Faculty of Engineering, Professor (50111464) |
Co-Investigator (Kenkyū-buntansha) | TACHIKAWA Tetsuya, Sharp Corporation, Technology Division, Researcher / INABA Masayuki, The University of Tokyo, Faculty of Engineering, Associate Professor (50184726) |
Project Period (FY) | 1992 – 1993 |
Keywords | local correlation method / robot vision / vision system / binocular stereo vision / motion tracking / optical flow / visual feedback / visual servo |
Research Abstract |
We have succeeded in developing a real-time three-dimensional robot vision system based on a local correlation image processing technique. The vision system provides the essential functions of robot vision: real-time object tracking, motion detection, and depth map generation. Although the local correlation method is computationally heavy, the vision system we originally developed uses a special LSI (Motion Estimation Processor) chip designed for image compression. The system consists of multiple visual unit boards, each of which contains a general-purpose processor and the special LSI. Because the image is broadcast to the multiple visual units, the system can process the image in parallel. In each unit, motion detection and motion tracking can be computed by block matching of local windows. Applications of the vision system include three-dimensional object tracking (even when the object rotates), optical flow generation, stereo matching, and depth map generation. The system was extended to accept color images by developing a color decoding board; the color information makes the vision system more robust in verifying the trackability of the local windows. We have applied the vision system to several robotic applications, including autonomous car navigation, real-time visual servoing, and vision-based motion guidance of mobile robots, manipulators, and legged robots.
|
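
The core operation described in the abstract, block matching of local windows, can be illustrated with a short sketch. The following is a minimal, purely illustrative Python/NumPy example, not the project's MEP-chip implementation: for a local window in a reference image, it searches a second image for the displacement that minimises the sum of absolute differences. Applied to a stereo pair the displacement gives disparity (depth); applied to consecutive frames it gives a motion/optical-flow vector. The function name match_window, the 8x8 block size, and the +/-16-pixel search range are assumptions made for illustration.

```python
# Minimal sketch of block matching (local correlation) on a single window.
# All names and parameters are illustrative, not the reported system's design.
import numpy as np

def match_window(ref, tgt, y, x, block=8, search=16):
    """Return the (dy, dx) displacement, within +/-`search` pixels, that
    minimises the sum of absolute differences (SAD) between the `block`x`block`
    window of `ref` anchored at (y, x) and candidate windows of `tgt`."""
    patch = ref[y:y + block, x:x + block].astype(np.int32)
    best_sad, best_dyx = np.inf, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            yy, xx = y + dy, x + dx
            # Skip candidates that fall outside the target image.
            if yy < 0 or xx < 0 or yy + block > tgt.shape[0] or xx + block > tgt.shape[1]:
                continue
            cand = tgt[yy:yy + block, xx:xx + block].astype(np.int32)
            sad = int(np.abs(patch - cand).sum())
            if sad < best_sad:
                best_sad, best_dyx = sad, (dy, dx)
    return best_dyx, best_sad

# Usage example: recover a known shift between two frames (random data
# stands in for real images).
rng = np.random.default_rng(0)
frame0 = rng.integers(0, 256, (64, 64), dtype=np.uint8)
frame1 = np.roll(frame0, shift=(2, 3), axis=(0, 1))  # simulate a (2, 3) motion
print(match_window(frame0, frame1, 24, 24))          # -> ((2, 3), 0)
```

In the reported system, this kind of per-window search appears to be what the Motion Estimation Processor LSI accelerates in hardware, with the local windows distributed across the visual unit boards that each receive the broadcast image.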