
Teaching assembly tasks based on semi-autonomous teleoperation for robots with vision and multi-fingered hands

Research Project

Project/Area Number 19K12170
Research Category

Grant-in-Aid for Scientific Research (C)

Allocation Type Multi-year Fund
Section General
Review Section Basic Section 61050: Intelligent robotics-related
Research Institution Wakayama University

Principal Investigator

Ogawara Koichi  Wakayama University, Faculty of Systems Engineering, Associate Professor (70452810)

Project Period (FY) 2019-04-01 – 2022-03-31
Project Status Completed (Fiscal Year 2021)
Budget Amount
¥4,420,000 (Direct Cost: ¥3,400,000; Indirect Cost: ¥1,020,000)
Fiscal Year 2021: ¥1,040,000 (Direct Cost: ¥800,000; Indirect Cost: ¥240,000)
Fiscal Year 2020: ¥1,690,000 (Direct Cost: ¥1,300,000; Indirect Cost: ¥390,000)
Fiscal Year 2019: ¥1,690,000 (Direct Cost: ¥1,300,000; Indirect Cost: ¥390,000)
Keywords Motion teaching / Teleoperation / Assembly tasks
Outline of Research at the Start

This research project develops a motion teaching method based on semi-autonomous teleoperation that makes it easy to teach accurate assembly tasks, such as machine assembly, to a robot equipped with vision, a multi-fingered hand, and a robot arm. To this end, we develop a multi-fingered hand that realizes accurate object manipulation and techniques for autonomously grasping and moving objects based on augmenting three-dimensional information from vision and tactile sensors, and we use these to work toward automating robotic assembly tasks through motion teaching.

Outline of Final Research Achievements

In order to develop a motion teaching method that can easily teach assembly tasks to robots equipped with vision, multi-fingered hands, and robot arms, this study carried out the following three research tasks. (1) To realize accurate object manipulation with a multi-fingered hand, we developed a technique for manipulating a grasped object that actively exploits the deformation of elastic elements, and a technique for presenting an image from an arbitrary viewpoint by integrating information from multiple sensors. (2) To realize grasping and moving of objects based on three-dimensional information from vision and tactile sensors, we developed techniques for recognizing rigid and non-rigid objects and for motion planning of the robot arm. (3) To enable learning of taught motions, we developed a technique for simultaneous grasping and object recognition.
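One component of task (1), presenting an image from an arbitrary viewpoint, amounts in general terms to fusing calibrated depth measurements from several sensors into a common frame and reprojecting them into a virtual camera. The following is a minimal generic sketch of that idea (Python with NumPy), not the project's implementation; the intrinsics, sensor poses, and depth maps are hypothetical placeholder values.

    import numpy as np

    def backproject(depth, K):
        """Convert a depth map (H x W, metres) into an N x 3 point cloud in the camera frame."""
        h, w = depth.shape
        u, v = np.meshgrid(np.arange(w), np.arange(h))
        z = depth.ravel()
        x = (u.ravel() - K[0, 2]) * z / K[0, 0]
        y = (v.ravel() - K[1, 2]) * z / K[1, 1]
        pts = np.stack([x, y, z], axis=1)
        return pts[z > 0]                    # drop pixels with no depth reading

    def transform(pts, T):
        """Apply a 4 x 4 rigid transform to an N x 3 point cloud."""
        return pts @ T[:3, :3].T + T[:3, 3]

    def render_depth(pts_world, K, T_world_to_cam, shape):
        """Project world-frame points into a virtual pinhole camera; z-buffer into a depth image."""
        cam = transform(pts_world, T_world_to_cam)
        cam = cam[cam[:, 2] > 0]                        # keep points in front of the camera
        u = np.round(K[0, 0] * cam[:, 0] / cam[:, 2] + K[0, 2]).astype(int)
        v = np.round(K[1, 1] * cam[:, 1] / cam[:, 2] + K[1, 2]).astype(int)
        ok = (u >= 0) & (u < shape[1]) & (v >= 0) & (v < shape[0])
        img = np.full(shape, np.inf)
        np.minimum.at(img, (v[ok], u[ok]), cam[ok, 2])  # nearest point wins per pixel
        return img

    # Hypothetical calibration: identical intrinsics, second sensor offset 10 cm along x.
    K = np.array([[525.0, 0.0, 320.0], [0.0, 525.0, 240.0], [0.0, 0.0, 1.0]])
    T1 = np.eye(4)
    T2 = np.eye(4); T2[0, 3] = 0.10
    depth1 = np.full((480, 640), 1.0)        # dummy 1 m planes standing in for real sensor data
    depth2 = np.full((480, 640), 1.0)

    pts = np.vstack([transform(backproject(depth1, K), T1),
                     transform(backproject(depth2, K), T2)])
    T_virtual = np.eye(4)                    # pose of the operator-chosen virtual viewpoint
    novel_view = render_depth(pts, K, np.linalg.inv(T_virtual), (480, 640))

In a real interface the fused points would typically also carry color, and the virtual camera pose would follow the viewpoint chosen by the operator.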

Academic Significance and Societal Importance of the Research Achievements

Because the information available from sensors mounted on a robot is limited, it is difficult to teach tasks that require accurate object manipulation by teleoperation. In this research, we developed technology that can greatly reduce the operator's burden by giving the robot autonomous functions such as path planning based on object recognition and stable grasping of objects. This technology can be applied, for example, to motion teaching of multi-skilled robots for high-mix, low-volume production.

Report (4 results)

  • 2021 Annual Research Report   Final Research Report (PDF)
  • 2020 Research-status Report
  • 2019 Research-status Report

Research Products (6 results)

Presentation (6 results) (of which Int'l Joint Research: 2 results)

  • [Presentation] Face alignment by learning from small real datasets and large synthetic datasets (2022)

    • Author(s)
      Haoqi Gao, Koichi Ogawara
    • Organizer
      2022 Asia Conference on Cloud Computing, Computer Vision and Image Processing (3CVIP 2022)
    • Related Report
      2021 Annual Research Report
    • Int'l Joint Research
  • [Presentation] Development of a compact proximity sensor using multiple LEDs with different wavelengths that can be mounted on a robot finger (2022)

    • Author(s)
      加藤 颯, 小川原 光一
    • Organizer
The Japan Society of Mechanical Engineers (JSME) Kansai Branch, 97th Regular General Meeting
    • Related Report
      2021 Annual Research Report
  • [Presentation] A full-circumference 3D shape measurement method based on regrasping with an RGB-D camera and two robot arms, taking the number of measurement viewpoints into account (2022)

    • Author(s)
      曽我 幸慧, 小川原 光一
    • Organizer
The Japan Society of Mechanical Engineers (JSME) Kansai Branch, 97th Regular General Meeting
    • Related Report
      2021 Annual Research Report
  • [Presentation] Separation and shape estimation of multiple rectangular cloths based on deep learning (2021)

    • Author(s)
      曽根川 大輝, 小川原 光一
    • Organizer
The 22nd Annual Conference of the SICE System Integration Division
    • Related Report
      2021 Annual Research Report
  • [Presentation] Goal-conditioned reinforcement learning in a latent representation space using contrastive learning (2021)

    • Author(s)
      山田 貴哉, 小川原 光一
    • Organizer
Robotics and Mechatronics Conference 2021
    • Related Report
      2021 Annual Research Report
  • [Presentation] Estimation of object class and orientation from multiple viewpoints and relative camera orientation constraints (2020)

    • Author(s)
      Koichi Ogawara, Keita Iseki
    • Organizer
      IEEE/RSJ 2020 International Conference on Intelligent Robots and Systems (IROS)
    • Related Report
      2020 Research-status Report
    • Int'l Joint Research

Published: 2019-04-18   Modified: 2023-01-30  
