
1997 Fiscal Year Final Research Report Summary

Tightly Coupled Sensor-Behavior Approaches for Real World Recognition

Research Project

Project/Area Number 07245105
Research Category

Grant-in-Aid for Scientific Research on Priority Areas

Allocation Type Single-year Grants
Research Institution Osaka University

Principal Investigator

SHIRAI Yoshiaki  Osaka University, Faculty of Engineering, Professor (50206273)

Co-Investigator (Kenkyū-buntansha) KAKIKURA Masayoshi  Tokyo Denki University, Faculty of Engineering, Professor (90224344)
HASEGAWA Tsutomu  Kyushu University, Faculty of Engineering, Professor (00243890)
MORI Hideo  Yamanashi University, Faculty of Engineering, Professor (40020383)
INOUE Hiromitsu  The University of Tokyo, Faculty of Engineering, Professor (50111464)
Project Period (FY) 1995 – 1997
Keywords Environment Recognition / Mobile Robot / Work Planning / Optimal Planning / Realtime Vision / Sensor Fusion / Learning
Research Abstract

Tightly coupled sensor-behavior is important for intelligent systems that act in the real world. Such systems must deal with uncertainty in sensor information, lack of information about the environment, uncertainty of motion, and uncertainty in a priori knowledge. Tightly coupled sensor-behavior is a promising approach to these problems. The results of this research are the following:
・ Interpretation of Sensor Information for Real World Understanding : We studied the fusion of various kinds of visual information to recognize environments, and implemented the methods on J.Rob.Vision (developed in this research) to realize realtime processing. Considering the reliability of recognition results, we developed an optimal plan generation method which minimizes the cost of sensing and that of motions. Finally, we conducted experiments on understanding real environments through sensing and actions.
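As a rough illustration of the cost-minimizing idea above (a sketch, not the project's actual algorithm), a candidate plan can be scored by its sensing cost, its motion cost, and the expected penalty of a recognition failure, with the least-cost plan selected. All names and numbers below are invented for illustration:

```python
# Hypothetical sketch in the spirit of "optimal plan generation": each
# candidate plan pairs a sensing action (with a cost and a recognition
# reliability) with a motion cost; the expected cost adds a recovery
# penalty weighted by the probability of misrecognition.

def expected_cost(plan, recovery_penalty=10.0):
    """Sensing cost + motion cost + expected recovery cost."""
    sensing_cost, motion_cost, reliability = plan
    return sensing_cost + motion_cost + (1.0 - reliability) * recovery_penalty

def best_plan(plans):
    """Pick the candidate plan with minimal expected cost."""
    return min(plans, key=expected_cost)

# Candidate plans: (sensing cost, motion cost, recognition reliability).
candidates = [
    (1.0, 5.0, 0.70),   # cheap sensing, unreliable recognition
    (3.0, 5.0, 0.95),   # careful sensing, reliable recognition
    (0.0, 7.0, 0.50),   # no sensing, long detour
]
print(best_plan(candidates))   # careful sensing wins: (3.0, 5.0, 0.95)
```

The point of the trade-off is that cheap sensing can be more expensive in expectation once the cost of recovering from a misrecognition is counted.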
・ Sensing Mechanism for Behaviors Adaptive to the Real World : In order to move in a changing environment such as a road, we integrated sonar-based obstacle detection, sign-pattern-based recognition of roads and cars, and sign-pattern learning during the robot's motion, and developed a guide robot for blind people.
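A minimal sketch of how such an integration might combine the two sensing channels, loosely following the guide-robot description above; the thresholds and the priority rule are assumptions, not the project's design:

```python
# Toy fusion rule: sonar-based obstacle detection has priority (stop
# for near obstacles); otherwise the recognized road center from the
# sign-pattern channel drives the steering command.

def steer(sonar_m, road_offset_m):
    """Return a command from sonar range (m) and lateral offset from
    the recognized road center (m, positive = road center to the left)."""
    if sonar_m < 1.0:              # obstacle within 1 m: stop
        return "stop"
    if abs(road_offset_m) < 0.2:   # roughly centered: keep going
        return "forward"
    return "left" if road_offset_m > 0 else "right"

print(steer(0.6, 0.0))   # obstacle close -> "stop"
print(steer(3.0, 0.5))   # clear, drifted right -> "left"
```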
・ Behavior Control Based on Environment Model and Sensory Information : By developing a three-finger manipulation system, we realized realtime detection of object posture by stereo vision and detection of grasping force on the fingers. We proposed error avoidance by monitoring task states and verified it in assembly experiments. Moreover, we realized a system which selects a suitable cloth in a pile of clothes by a recursive region-split method and picks it up with a hand equipped with a special rotor.
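The task-state monitoring idea can be sketched as follows (an invented toy example, not the project's monitor): each step of an assembly task carries an expected sensor condition, and a mismatch triggers recovery instead of blindly continuing:

```python
# Toy task-state monitor: run through task steps, checking each step's
# expected sensor condition; on a mismatch, stop and report recovery.
# The states, checks, and readings are illustrative assumptions.

def monitor(steps, readings):
    """steps: list of (state name, check function); readings: one
    sensor value per step. Returns a log of check results."""
    log = []
    for (state, check), value in zip(steps, readings):
        if check(value):
            log.append(f"{state}: ok")
        else:
            log.append(f"{state}: mismatch -> recover")
            break
    return log

steps = [
    ("approach", lambda d: d < 0.05),        # gripper within 5 cm
    ("grasp",    lambda f: 0.5 < f < 5.0),   # grasp force in range (N)
    ("insert",   lambda t: t < 0.3),         # insertion torque low (Nm)
]
readings = [0.03, 2.0, 0.9]                  # torque too high at "insert"
print(monitor(steps, readings))
```

Checking state-by-state lets an error be caught at the step where it occurs, before it propagates into later assembly steps.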
・ Adaptive Functions in Perception-Action Systems : We proposed a "behavior network" which represents and controls the concurrent processes of selecting sensor-behaviors and of predicting the environment from memory, and studied how to represent adaptive behaviors of a robot. We built a four-legged robot with vision on J.Rob.Leg (developed in this research). Through experiments on adaptive behaviors in complex environments, we verified the performance of the behavior network.
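One way to picture a behavior-network-style selector (structure and names are assumptions, not the project's actual formulation): each behavior scores its applicability from the current sensor readings and a memory-based prediction of the environment, and the controller activates the highest-scoring one:

```python
# Illustrative behavior selector: activation depends on both sensors
# and a predicted environment, so selection and prediction can run as
# concurrent processes feeding the same network.

class Behavior:
    def __init__(self, name, score_fn):
        self.name = name
        self.score = score_fn   # maps (sensors, prediction) -> activation

def select_behavior(behaviors, sensors, prediction):
    """Return the behavior with the highest activation."""
    return max(behaviors, key=lambda b: b.score(sensors, prediction))

behaviors = [
    Behavior("avoid", lambda s, p: 1.0 if s["obstacle"] < 0.5 else 0.0),
    Behavior("walk",  lambda s, p: 0.6 if p["terrain"] == "flat" else 0.2),
    Behavior("climb", lambda s, p: 0.8 if p["terrain"] == "step" else 0.0),
]

sensors = {"obstacle": 2.0}           # nearest obstacle 2 m away
prediction = {"terrain": "step"}      # memory predicts a step ahead
print(select_behavior(behaviors, sensors, prediction).name)   # "climb"
```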

  • Research Products

    (14 results)


  • [Publications] J.Miura and Y.Shirai: "Vision and Motion Planning for a Mobile Robot under Uncertainty"Int. J. of Robotics Research. 16-6. 806-825 (1997)

    • Description
      From the Final Research Report Summary (in Japanese)
  • [Publications] R.Okada, Y.Shirai, J.Miura and Y.Kuno: "Moving Object Tracking Based on Optical Flow and Depth Information"IEICE Transactions (in Japanese). J80-D-II-6. 1530-1538 (1997)

    • Description
      From the Final Research Report Summary (in Japanese)
  • [Publications] S.Kotani, N.Kiyohiro and H.Mori: "A Walking Guide Robot for the Visually Impaired"Journal of the Institute of Image Information and Television Engineers (in Japanese). 51-6. 878-885 (1997)

    • Description
      From the Final Research Report Summary (in Japanese)
  • [Publications] K.Hamajima and M.Kakikura: "Planning Strategy for Task of Unfolding Clothes - Total Scheme and Isolation of One Cloth -"3rd Asian Conference on Robotics and Its Application. 33-40 (1997)

    • Description
      From the Final Research Report Summary (in Japanese)
  • [Publications] Nobuhiro Okada, Tadashi Nagata, and Tsutomu Hasegawa: "A Reliable Parts-Picking System with an Active and Multi-Sensor Visual System"ROBOTICA. 15-6. 693-700 (1997)

    • Description
      From the Final Research Report Summary (in Japanese)
  • [Publications] T.Oka, M.Inaba and H.Inoue: "A Parallel Information Processing Model for Describing Information Systems of Autonomous Robots"Journal of the Robotics Society of Japan (in Japanese). 15-6. 878-885 (1997)

    • Description
      From the Final Research Report Summary (in Japanese)
  • [Publications] Y.Matsumoto, M.Inaba and H.Inoue: "Navigation Based on Route Representation Using View Image Sequences"Journal of the Robotics Society of Japan (in Japanese). 15-2. 236-242 (1997)

    • Description
      From the Final Research Report Summary (in Japanese)
  • [Publications] J. Miura and Y. Shirai: "Vision and Motion Planning for a Mobile Robot under Uncertainty"Int. J. of Robotics Research. 16-6. 806-825 (1997)

    • Description
      From the Final Research Report Summary (in English)
  • [Publications] R. Okada, Y. Shirai, J. Miura and Y. Kuno: "Object Tracking Based on Optical Flow and Depth"IEICE Trans.. D-II, J80-D-II-6. 1530-1538 (1997)

    • Description
      From the Final Research Report Summary (in English)
  • [Publications] S. Kotani, N. Kiyohiro and H. Mori: "Development of the Robotic Travel Aid for the Visually Impaired"Journal of the Institute of Image Information and Television Engineers. 51-6. 878-885 (1997)

    • Description
      From the Final Research Report Summary (in English)
  • [Publications] K. Hamajima and M. Kakikura: "Planning Strategy for Task of Unfolding Clothes - Total Scheme and Isolation of One Cloth -"3rd Asian Conference on Robotics and Its Application. 33-40 (1997)

    • Description
      From the Final Research Report Summary (in English)
  • [Publications] N. Okada, T. Nagata, and T. Hasegawa: "A Reliable Parts-Picking System with an Active and Multi-Sensor Visual System"ROBOTICA. 15-6. 693-700 (1997)

    • Description
      From the Final Research Report Summary (in English)
  • [Publications] T. Oka, M. Inaba, and H. Inoue: "A Parallel Process Model for Describing Autonomous Robot Brain"Journal of Robotics Society of Japan. 16-6. 878-885 (1997)

    • Description
      From the Final Research Report Summary (in English)
  • [Publications] Y. Matsumoto, M. Inaba, and H. Inoue: "Visual Navigation Based on View Sequenced Route Representation"Journal of Robotics Society of Japan. 15-2. 236-242 (1997)

    • Description
      From the Final Research Report Summary (in English)


Published: 2001-10-23  
