1996 Fiscal Year Final Research Report Summary
Visual Navigation Technology for Autonomous Vehicle
Project/Area Number | 07555430
Research Category | Grant-in-Aid for Scientific Research (B)
Allocation Type | Single-year Grants
Section | Test
Research Field | Measurement and Control Engineering
Research Institution | Osaka University
Principal Investigator | YACHIDA Masahiko, Osaka University, Faculty of Engineering Science, Professor (20029531)
Co-Investigators |
SUMI Kazuhiko, Mitsubishi Electric Corp., Industry & System Laboratory, Sensing Systems Development Dept., Researcher
OHSAWA Yukio, Osaka University, Faculty of Engineering Science, Research Associate (20273609)
YAGI Yasushi, Osaka University, Faculty of Engineering Science, Associate Professor (60231643)
Project Period (FY) | 1995 – 1996
Keywords | Omnidirectional Vision / Multiple Image Sensing / Path Planning / Potential Field / Navigation / Autonomous Vehicle
Research Abstract |
We have developed a new multiple image sensing system (MISS), which combines an omnidirectional image sensor, COPIS (COnic Projection Image Sensor), with binocular vision, and which navigates a robot and recognizes objects of interest in the environment by integrating both kinds of sensory data.

Since COPIS observes a 360-degree view around the robot, an environmental map of the scene can be generated, assuming the robot's motion is known, by monitoring the azimuth change of features in the image. We carried out several experiments in outdoor environments, and the precision of the resulting environmental maps was sufficient for robot navigation there. In this work we extracted vertical edges under the assumption of perfect ground-plane motion of the robot. If the robot shakes on uneven ground, however, the rolling and swaying motions must be estimated so that the image can be stabilized. The radial component of the optical flow in the omnidirectional image has a periodic characteristic and is invariant to the ground-plane motion of the robot. Exploiting these characteristics, our method estimates the rolling and swaying motions by fitting the radial component of the optical flow to a periodic functional model and by evaluating the symmetry of the circumferential component of the optical flow.

The mobile robot also carries binocular vision and obtains a sequence of stereo images of the environment. Although the binocular field of view is limited by the visual angle of the lens, binocular vision is useful for understanding the spatial configuration of the environment. We therefore integrate the merits of both sensors into an efficient sensing system, which has been evaluated with a prototype sensor in a real environment. Furthermore, we have proposed a real-time path-planning algorithm for a dynamically changing 3-D environment in which the motion of the obstacles cannot be predicted. (Illustrative sketches of the azimuth-based mapping, the periodic flow fit, and a potential-field planning step follow this abstract.)
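The azimuth-based mapping reduces, per vertical edge, to bearing-only triangulation: with the robot's motion known, two azimuth observations of the same edge fix its position on the ground plane. The following is a minimal sketch of that geometry, assuming 2-D positions and absolute bearings; the function name and interface are hypothetical, not the report's implementation.

```python
import numpy as np

def triangulate_edge(p1, a1, p2, a2):
    """Estimate the ground-plane position of a vertical edge from two
    azimuth (bearing) observations taken at known robot positions.

    p1, p2 : (x, y) robot positions (known-motion assumption)
    a1, a2 : absolute azimuths [rad] to the edge at each position
    Returns the intersection of the two bearing rays.
    """
    d1 = np.array([np.cos(a1), np.sin(a1)])   # unit ray from pose 1
    d2 = np.array([np.cos(a2), np.sin(a2)])   # unit ray from pose 2
    # Solve p1 + t1*d1 = p2 + t2*d2 for the ray parameters t1, t2.
    A = np.column_stack([d1, -d2])
    b = np.asarray(p2, float) - np.asarray(p1, float)
    t1, _ = np.linalg.solve(A, b)
    return np.asarray(p1, float) + t1 * d1

# The robot advances 1 m along x; the edge's azimuth shifts accordingly.
edge = triangulate_edge((0.0, 0.0), np.deg2rad(45.0),
                        (1.0, 0.0), np.deg2rad(63.43))
print(edge)   # ~ (2.0, 2.0)
```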
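The roll/sway estimation fits the radial flow, sampled around the omnidirectional image circle, to a periodic model, and checks the symmetry of the circumferential flow. A minimal least-squares sketch under an assumed one-cycle sinusoidal model (the report does not specify its exact functional form) might look like this; both function names are hypothetical.

```python
import numpy as np

def fit_periodic_flow(theta, radial_flow):
    """Least-squares fit of the radial optical-flow component, sampled
    at image azimuths theta, to a one-cycle sinusoid
        r(theta) ~ a*cos(theta) + b*sin(theta) + c.
    Deviations from pure ground-plane motion (rolling, swaying) show up
    in the recovered amplitude and phase.  Illustrative model only.
    """
    M = np.column_stack([np.cos(theta), np.sin(theta), np.ones_like(theta)])
    (a, b, c), *_ = np.linalg.lstsq(M, radial_flow, rcond=None)
    return np.hypot(a, b), np.arctan2(b, a), c   # amplitude, phase, bias

def circumferential_asymmetry(theta, circ_flow):
    """Asymmetry score of the circumferential flow about the heading
    axis (theta = 0): zero when the flow is symmetric in theta -> -theta."""
    mirrored = np.interp((-theta) % (2 * np.pi), theta, circ_flow,
                         period=2 * np.pi)
    return float(np.mean(np.abs(circ_flow - mirrored)))

# Synthetic check: a 0.3-amplitude, 0.5-rad-phase sinusoid is recovered.
theta = np.linspace(0.0, 2 * np.pi, 72, endpoint=False)
amp, phase, bias = fit_periodic_flow(theta, 0.3 * np.cos(theta - 0.5) + 0.1)
print(round(amp, 3), round(phase, 3), round(bias, 3))   # 0.3 0.5 0.1
```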
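For the real-time planner, the project keywords point to a potential-field formulation: an attractive potential at the goal, repulsive potentials around obstacles, and continuous replanning so that unpredicted obstacle motion is absorbed step by step rather than forecast. Below is a textbook potential-field step in that spirit, not the project's actual algorithm; all names and gains are illustrative.

```python
import numpy as np

def potential_step(pos, goal, obstacles, k_att=1.0, k_rep=0.5,
                   influence=2.0, step=0.05):
    """One gradient-descent step on an artificial potential field:
    an attractive well at the goal plus repulsive barriers around
    obstacles.  Obstacle positions are re-read every call, so a moving,
    unpredicted obstacle is handled by replanning each cycle.
    """
    pos, goal = np.asarray(pos, float), np.asarray(goal, float)
    force = k_att * (goal - pos)                   # attractive force
    for ob in obstacles:
        diff = pos - np.asarray(ob, float)
        d = np.linalg.norm(diff)
        if 1e-9 < d < influence:                   # inside influence radius
            force += k_rep * (1.0/d - 1.0/influence) / d**2 * (diff / d)
    return pos + step * force / max(np.linalg.norm(force), 1e-9)

# Replan each cycle against the obstacle's current (drifting) position.
pos, goal = np.array([0.0, 0.0, 0.0]), np.array([5.0, 5.0, 0.0])
obstacle = np.array([2.5, 2.7, 0.0])
for _ in range(400):
    obstacle += np.array([0.0, -0.01, 0.0])       # unpredicted drift
    pos = potential_step(pos, goal, [obstacle])
    if np.linalg.norm(goal - pos) < 0.1:
        break
print(pos)   # near the goal, having skirted the moving obstacle
```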
Research Products (14 results)