
1993 Fiscal Year Final Research Report Summary

Active Recognition of Environment by Omni-Directional Vision

Research Project

Project/Area Number 03452168
Research Category

Grant-in-Aid for General Scientific Research (B)

Allocation Type Single-year Grants
Research Field Information Engineering
Research Institution Osaka University

Principal Investigator

TSUJI Saburo  Osaka Univ., Faculty of Engineering Science, Professor (60029527)

Co-Investigator(Kenkyū-buntansha) IMAI Masakazu  Nara Graduate School of Science and Technology, Associate Professor (60193653)
YAMADA Seiji  Osaka University, Institute of Industrial Science, Assistant Professor (50220380)
ISHIGURO Hiroshi  Osaka Univ., Faculty of Engineering Science, Research Associate (10232282)
XU Gang  Osaka Univ., Faculty of Engineering Science, Assistant Professor (90226374)
Project Period (FY) 1991 – 1993
Keywords computer vision / omni-directional vision / stereo / motion vision / autonomous robot / planning / world model / artificial intelligence
Research Abstract

The objective of this research is to develop omni-directional vision and its active use for modeling the environment of an autonomous robot.
(1) Omni-directional stereo
We can acquire an omni-directional view from an image sequence taken while the camera swivels; however, it does not contain range information. We develop an omni-directional stereo method based on two omni-directional views, sampled at two vertical slits in the imagery taken while the camera moves along a circular path.
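As an illustration of the slit-sampling idea, the sketch below (Python; the parameter names and the triangulation formula are simplified assumptions, not the report's actual method) stacks two vertical image columns over one circular sweep into a pair of panoramas and converts the angular disparity of a matched feature into a range estimate.

```python
import numpy as np

# Minimal sketch of omni-directional stereo from a swiveling camera.
# Assumptions (not from the report): frames are image arrays taken at
# uniform angular steps along a circular path of radius R, and
# left_col / right_col are the two vertical slits being sampled.

def build_slit_panoramas(frames, left_col, right_col):
    """Stack one column per frame into two omni-directional views."""
    left = np.stack([f[:, left_col] for f in frames], axis=1)
    right = np.stack([f[:, right_col] for f in frames], axis=1)
    return left, right

def range_from_disparity(delta_phi, R, slit_angle):
    """Very simplified triangulation: the two slits act roughly like a
    stereo pair with effective baseline 2*R*sin(slit_angle); delta_phi
    is the angular disparity (radians) of the same feature in the two
    panoramic views."""
    baseline = 2.0 * R * np.sin(slit_angle)
    return baseline / (2.0 * np.sin(delta_phi / 2.0))
```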
(2) Environment map from omni-directional stereo
The robot plans the next observation point from the coarse environment map produced by the omni-directional stereo and iterates a plan-move-observe cycle. By fusing the coarse environment maps obtained at different points, a more precise model is obtained.
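A minimal sketch of such a plan-move-observe cycle follows (Python; the map representation, planner, and fusion rule are illustrative assumptions, not the project's actual algorithms): maps are dictionaries of landmark positions with variances, the planner heads toward the most uncertain landmark, and fusion is a variance-weighted average.

```python
import numpy as np

# Sketch of a plan-move-observe exploration loop.
# Assumed robot interface (hypothetical): robot.observe() returns a
# coarse map {landmark_id: (position, variance)} from omni stereo,
# and robot.move_to(goal) drives toward a 2D goal position.

def fuse_maps(global_map, local_map):
    """Fuse two coarse maps by variance-weighted averaging of landmarks."""
    fused = dict(global_map)
    for lm, (p, var) in local_map.items():
        if lm in fused:
            p0, var0 = fused[lm]
            w = var / (var + var0)                  # weight by relative uncertainty
            fused[lm] = (w * p0 + (1 - w) * p, var0 * var / (var0 + var))
        else:
            fused[lm] = (p, var)
    return fused

def plan_next_viewpoint(global_map):
    """Pick the most uncertain landmark as the next observation target."""
    if not global_map:
        return None
    lm = max(global_map, key=lambda k: global_map[k][1])
    return global_map[lm][0]

def explore(robot, n_cycles=5):
    """Iterate plan-move-observe and fuse the coarse maps."""
    world = robot.observe()
    for _ in range(n_cycles):
        goal = plan_next_viewpoint(world)
        if goal is None:
            break
        robot.move_to(goal)
        world = fuse_maps(world, robot.observe())
    return world
```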
(3) Environment map from active observation
Since the baseline of the omni-directional stereo is short, its range estimates are not precise. Two omni-directional views taken at different points provide more precise range information, but this requires estimating both the distance between the two points and the robot's rotation. We develop a new active vision method that guides the robot so as to keep the disparity between two feature points at 360 degrees; the robot's rotation is then zero, and we can estimate the environment geometry precisely.
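The sketch below illustrates the two ingredients of this idea in Python (the controller gain, feature interface, and geometry are assumptions for illustration only): a correction term that drives the observed azimuth disparity between two tracked features back to its target value, and a triangulation that recovers a feature's position from two views related by a known pure translation.

```python
import numpy as np

# Sketch of active observation with zero rotation (illustrative only).

def heading_correction(az_a, az_b, target_disparity, gain=0.5):
    """Proportional steering correction that drives the observed azimuth
    disparity between two tracked features back to the target value."""
    error = (az_b - az_a) - target_disparity
    return -gain * error

def triangulate(az1, az2, baseline):
    """Position of a feature seen at azimuths az1, az2 (radians) from two
    viewpoints separated by a known translation 'baseline' along x,
    assuming zero rotation between the views."""
    # Rays: p = t1*[cos az1, sin az1] and p = [baseline, 0] + t2*[cos az2, sin az2]
    A = np.array([[np.cos(az1), -np.cos(az2)],
                  [np.sin(az1), -np.sin(az2)]])
    b = np.array([baseline, 0.0])
    t1, _ = np.linalg.solve(A, b)
    return t1 * np.array([np.cos(az1), np.sin(az1)])
```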
(4) Qualitative map
A qualitative environment map is useful for robot navigation. Our robot autonomously plans the environment observation and takes route-panorama views and omni-directional views. By fusing these data, the robot builds a qualitative environment map. We verify the method by experiments in real indoor environments containing many complex objects.
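As a rough illustration of what such a qualitative map could look like (an assumption for illustration, not the report's actual representation), the sketch below stores places as graph nodes holding their observations, with edges recording only qualitative adjacency rather than metric coordinates.

```python
# Illustrative qualitative environment map as an adjacency structure;
# the field names and notion of "observation" are assumptions, not the
# project's actual data structure.

class QualitativeMap:
    def __init__(self):
        self.places = {}        # place_id -> list of observations (omni / route-panorama views)
        self.adjacent = {}      # place_id -> set of neighboring place_ids

    def add_place(self, place_id, observations):
        """Register a place with the views taken there."""
        self.places[place_id] = list(observations)
        self.adjacent.setdefault(place_id, set())

    def connect(self, a, b):
        """Record only that two places are mutually reachable."""
        self.adjacent.setdefault(a, set()).add(b)
        self.adjacent.setdefault(b, set()).add(a)

    def route(self, start, goal):
        """Breadth-first search over the qualitative adjacency graph."""
        frontier, seen = [[start]], {start}
        while frontier:
            path = frontier.pop(0)
            if path[-1] == goal:
                return path
            for nxt in self.adjacent.get(path[-1], set()) - seen:
                seen.add(nxt)
                frontier.append(path + [nxt])
        return None
```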

  • Research Products (16 results)

  • Publications (16 results)

  • [Publications] Ishiguro, H. et al.: "Acquiring range information for an omni-directional view" Trans. IEICE. J74-D-II. 500-508 (1991)

    • Description
      From the Final Research Report Summary (Japanese)
  • [Publications] Tsuji, S.: "Panoramic representation of environments" Journal of IEICE. 74. 354-359 (1991)

    • Description
      From the Final Research Report Summary (Japanese)
  • [Publications] Ishiguro, H. et al.: "Reconstruction of environment structure using active visual representations" Journal of the Robotics Society of Japan. 9. 541-550 (1991)

    • Description
      From the Final Research Report Summary (Japanese)
  • [Publications] Ishiguro, H.: "Omni-directional stereo" IEEE Trans. PAMI-14. 257-262 (1992)

    • Description
      From the Final Research Report Summary (Japanese)
  • [Publications] Ueda, K. et al.: "A method for positioning observation points using panoramic representations" Trans. IEICE. J75-D-II. 1809-1817 (1992)

    • Description
      From the Final Research Report Summary (Japanese)
  • [Publications] Li, S. et al.: "Describing route scenes by 3D objects in panoramic representations" Trans. IEICE. J76-D-II. 2177-2184 (1993)

    • Description
      From the Final Research Report Summary (Japanese)
  • [Publications] Ishiguro, H., Yamamoto, M. and Tsuji, S.: "Omni-directional stereo" IEEE Trans. PAMI-14, 2. 257-262 (1992)

    • Description
      From the Final Research Report Summary (English)
  • [Publications] Zheng, J. Y. and Tsuji, S.: "Panoramic representation for route recognition by a mobile robot" Int. J. Computer Vision. 9, 1. 55-76 (1992)

    • Description
      From the Final Research Report Summary (English)
  • [Publications] Ishiguro, H., Yamamoto, M. and Tsuji, S.: "Acquiring precise range estimation from camera motion" Proc. IEEE Conf. Robotics & Automation. 2300-2305 (1991)

    • Description
      From the Final Research Report Summary (English)
  • [Publications] Barth, M., Ishiguro, H. and Tsuji, S.: "Computationally inexpensive egomotion determination for a mobile robot using an active camera" Proc. IEEE Conf. Robotics & Automation. 2792-2305 (1991)

    • Description
      From the Final Research Report Summary (English)
  • [Publications] Stelmaszyk, P., Ishiguro, H. and Tsuji, S.: "Mobile robot navigation by an active control of the vision system" Proc. 12th Int. Joint Conf. Artificial Intelligence. 1241-1253 (1991)

    • Description
      From the Final Research Report Summary (English)
  • [Publications] Barth, M., Ishiguro, H. and Tsuji, S.: "Determining robot egomotion from motion parallax observed by an active camera" Proc. 12th Int. Joint Conf. Artificial Intelligence. 1247-1253 (1991)

    • Description
      From the Final Research Report Summary (English)
  • [Publications] Li, S., Miyawaki, I., Ishiguro, H. and Tsuji, S.: "Finding of 3D structure by an active-vision-based mobile robot" Proc. IEEE Conf. Robotics & Automation. 1812-1817 (1992)

    • Description
      From the Final Research Report Summary (English)
  • [Publications] Li, S. and Tsuji, S.: "Finding landmarks autonomously along a route" Proc. 11th Int. Conf. Pattern Recognition. Vol. A. 316-319 (1992)

    • Description
      From the Final Research Report Summary (English)
  • [Publications] Ishiguro, H., Ueda, K. and Tsuji, S.: "Omnidirectional visual information for navigating a mobile robot" Proc. IEEE Conf. Robotics & Automation. Vol. 1. 799-804 (1993)

    • Description
      From the Final Research Report Summary (English)
  • [Publications] Tsuji, S. and Li, S.: "Making cognitive map of outdoor environment" Proc. 13th Int. Joint Conf. Artificial Intelligence. 1632-1638 (1993)

    • Description
      From the Final Research Report Summary (English)


Published: 1995-03-27  
