
2020 Fiscal Year Final Research Report

Deep learning for action

Research Project

Project/Area Number 17K18420
Research Category

Grant-in-Aid for Young Scientists (B)

Allocation Type Multi-year Fund
Research Field Intelligent informatics
Intelligent robotics
Research Institution National Institute of Advanced Industrial Science and Technology

Principal Investigator

Yoshiyasu Yusuke, National Institute of Advanced Industrial Science and Technology (AIST), Department of Information Technology and Human Factors, Senior Researcher (10712234)

Project Period (FY) 2017-04-01 – 2021-03-31
Keywords Deep learning / Action
Outline of Final Research Achievements

The aim of this project was to develop recognition technology for producing actions based on deep learning, which has mainly been used for image recognition. The project was divided into (1) navigation using language and vision, (2) environment recognition for generating robot trajectories, and (3) 6-DoF object pose recognition and its application to robotic object manipulation. Several types of input, such as 2D images, bird's-eye-view images, and 3D images, were used to recognize objects and environments and then to produce actions. In addition, we devised a learning-based navigation technique that combines visual and semantic information. Comparisons with previous approaches showed that the proposed techniques outperform them in terms of both speed and accuracy. Finally, we implemented a deep-learning-based 6-DoF object pose recognition technique on a robotic arm system and achieved object grasping.
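As a rough illustration of the kind of deep-learning-based 6-DoF pose recognition described above, the following minimal sketch (in PyTorch) regresses a 3-D translation and a unit quaternion from an RGB crop of the target object. The class name PoseRegressor, the layer sizes, and the input resolution are assumptions for illustration only, not the project's actual model.

    # Minimal sketch (not the project's actual architecture): a small CNN that
    # regresses a 6-DoF object pose -- 3-D translation plus a unit quaternion
    # for rotation -- from an RGB crop, as one plausible way to realize
    # deep-learning-based pose recognition for grasping.
    import torch
    import torch.nn as nn

    class PoseRegressor(nn.Module):
        def __init__(self):
            super().__init__()
            # Small convolutional backbone; a real system would likely use a
            # pretrained network and depth / point-cloud input as well.
            self.backbone = nn.Sequential(
                nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
                nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
                nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1),
            )
            self.fc_trans = nn.Linear(128, 3)   # x, y, z translation
            self.fc_rot = nn.Linear(128, 4)     # quaternion (w, x, y, z)

        def forward(self, img):
            feat = self.backbone(img).flatten(1)
            trans = self.fc_trans(feat)
            quat = self.fc_rot(feat)
            quat = quat / quat.norm(dim=1, keepdim=True)  # keep rotation on the unit sphere
            return trans, quat

    # Usage: predict a pose for one 128x128 RGB crop of the target object.
    model = PoseRegressor()
    trans, quat = model(torch.randn(1, 3, 128, 128))
    print(trans.shape, quat.shape)  # torch.Size([1, 3]) torch.Size([1, 4])

Normalizing the quaternion output keeps the predicted rotation valid; in an actual grasping pipeline the predicted pose would then be passed to the robot arm's motion planner to compute a grasp.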

Free Research Field

Intelligent robotics, artificial intelligence, computer vision

Academic Significance and Societal Importance of the Research Achievements

In this project, we built deep-learning-based object recognition technologies and, on top of them, object manipulation and navigation technologies. In particular, the navigation technique we developed, which combines visual and linguistic information, integrates knowledge-based and data-driven methods; it offers many possibilities for future development and, we believe, has high academic value. At the same time, it has also become clear that for a problem as complex as generating actions, further improvement is needed to reach a human-equivalent level. The results and insights obtained in this project have important academic significance for advancing the research field of action intelligence and should contribute to further progress in this field.


Published: 2022-01-27  
