Real time shape recognition by fusing range images and intensity images
Project/Area Number | 08650509 |
Research Category | Grant-in-Aid for Scientific Research (C) |
Allocation Type | Single-year Grants |
Section | General |
Research Field | Measurement and Control Engineering |
Research Institution | CHUO UNIVERSITY |
Principal Investigator | UMEDA Kazunori, Chuo Univ., Faculty of Science and Engineering, Lecturer (10266273) |
Project Period (FY) | 1996 – 1997 |
Project Status | Completed (Fiscal Year 1997) |
Budget Amount | ¥1,900,000 (Direct Cost: ¥1,900,000) |
Fiscal Year 1997: ¥500,000 (Direct Cost: ¥500,000)
Fiscal Year 1996: ¥1,400,000 (Direct Cost: ¥1,400,000)
Keywords | Range Image / Intensity Image / Sensor Fusion / Shape Recognition / Least Square Method / Robot Vision / Error Analysis |
Research Abstract |
This study proposes methods for fusing range images and intensity images. Such fusion is an important technology for 3D shape recognition by intelligent robot systems, and it enables robust and practical 3D shape recognition methods. First, the characteristics of each type of image are discussed. Planar and cylindrical regions are selected as the features extracted from a range image, because a range image is usually sparse, complex features are difficult to extract robustly from it, and these features occur frequently in real objects. For an intensity image, edge information is chosen. Using these features, the following fusion methods are proposed and formulated: (1) extraction of planar and cylindrical regions from a range image as a preprocessing step for the fusion, (2) measurement of 3D edges by fusing a planar region with an edge, and (3) update of the parameters of a cylindrical region by fusing it with an edge. All of them are formulated with a least-squares approach. The fusion methods are applied to the problem of distributed sensing by multiple robots: each robot is assumed to carry a range image sensor and/or an intensity image sensor, 3D features are measured by applying the proposed fusion methods, and, in addition, the dead reckoning of each robot is corrected using the fused results.
|
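As an illustration of fusion method (2), the sketch below fits a plane to sparse range points by least squares and then backprojects intensity-image edge pixels onto that plane to obtain 3D edge points. It is a minimal sketch only: the pinhole intrinsics matrix K, the assumption that the range and intensity sensors share one camera frame, and the SVD-based plane fit are illustrative choices, not the formulation used in the report.

    import numpy as np

    def fit_plane(points):
        # Least-squares plane fit to Nx3 range points: minimize orthogonal
        # distances to the plane n.x = d.
        centroid = points.mean(axis=0)
        _, _, vt = np.linalg.svd(points - centroid)
        n = vt[-1]                        # normal = direction of smallest variance
        return n, float(n @ centroid)     # plane parameters (n, d)

    def edge_pixels_to_3d(edge_pixels, K, n, d):
        # Fuse an intensity edge with a planar region: intersect each edge
        # pixel's viewing ray with the fitted plane.  edge_pixels is Nx2 (u, v),
        # K is the 3x3 camera intrinsics matrix (assumed known).
        K_inv = np.linalg.inv(K)
        ones = np.ones(len(edge_pixels))
        rays = (K_inv @ np.column_stack([edge_pixels[:, 0],
                                         edge_pixels[:, 1], ones]).T).T
        t = d / (rays @ n)                # ray-plane intersection parameter
        return rays * t[:, None]          # 3D edge points in the camera frame

    # Usage (placeholders): n, d = fit_plane(range_points)
    #                       edge_3d = edge_pixels_to_3d(edges, K, n, d)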