Research on Real-time Visual Information Processing
Project/Area Number | 04044197 |
Research Category | Grant-in-Aid for Overseas Scientific Survey |
Allocation Type | Single-year Grants |
Research Institution | Osaka University |
Principal Investigator | YACHIDA Masahiko, Faculty of Engineering Science, Osaka University, Professor (20029531) |
Co-Investigator (Kenkyū-buntansha) |
YANG Hyun Seung, Korea Advanced Institute of Science and Technology, Department of Computer Engineering, Associate Professor
YAGI Yasushi, Faculty of Engineering Science, Osaka University, Research Associate (60231643)
|
Project Period (FY) | 1992 |
Project Status | Completed (Fiscal Year 1992) |
Keywords | Mobile Robot / Image Understanding / Visual Navigation / Real Time / Omnidirectional Image Sensor / Stereo Camera |
Research Abstract |
In this research, we developed methods for navigating a mobile robot and understanding objects of interest in its environment using an omnidirectional image sensor (COPIS) and binocular vision. Because COPIS observes a 360-degree view around the robot, the robot can continuously and precisely estimate its own location and motion. Under the assumption that the robot and the objects move with constant linear velocity, objects moving along collision paths are detected by monitoring changes in their azimuth: an object on a collision course keeps a nearly constant bearing in the omnidirectional image (a sketch of this check follows the abstract). The mobile robot also carries binocular vision and obtains a sequence of stereo images of the environment. We extend the principle of trinocular vision to establish correspondences across the sequence of binocular images and build a 3-D wireframe model of the environment. From this wireframe model, we recognize surfaces and the spatial relationships between them, and thereby understand the spatial configuration of the environment. We plan to combine the omnidirectional image sensor COPIS and binocular vision.
|
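The azimuth-based collision test mentioned in the abstract rests on a standard bearing argument: if both robot and object move at constant linear velocity, an object whose azimuth stays (nearly) constant over successive omnidirectional frames is on a collision path, while a steadily drifting azimuth means it will pass clear. The sketch below is only an illustration of that geometric check under stated assumptions; the function names, the degree threshold, and the per-frame azimuth input are hypothetical and not part of the original COPIS implementation.

```python
def unwrap_deg(angles):
    """Unwrap a sequence of azimuths (degrees) so jumps across 0/360 become continuous."""
    out = [angles[0]]
    for a in angles[1:]:
        step = (a - out[-1] + 180.0) % 360.0 - 180.0  # shortest signed change
        out.append(out[-1] + step)
    return out


def on_collision_course(azimuths_deg, tol_deg=1.0):
    """Return True if an object's bearing stays nearly constant over the observed frames.

    Assumes constant linear motion of both robot and object (as in the report);
    tol_deg is an illustrative threshold, not a value from the original work.
    """
    unwrapped = unwrap_deg(list(azimuths_deg))
    return max(unwrapped) - min(unwrapped) < tol_deg


# Example: an almost-constant bearing flags a collision path; a drifting one does not.
print(on_collision_course([42.1, 42.2, 42.3, 42.4, 42.5]))   # True
print(on_collision_course([42.0, 45.0, 49.0, 54.0, 60.0]))   # False
```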
Report (1 result)
Research Products (11 results)