Learning 3D information and ego-motion from acoustic video in extreme underwater environment

Research Project

Project/Area Number 23K19993
Research Category

Grant-in-Aid for Research Activity Start-up

Allocation Type Multi-year Fund
Review Section 1002: Human informatics, applied informatics and related fields
Research Institution The University of Tokyo

Principal Investigator

オウ ギョクセイ  The University of Tokyo, Graduate School of Engineering (Faculty of Engineering), Assistant Professor (00984270)

Project Period (FY) 2023-08-31 – 2025-03-31
Project Status Granted (Fiscal Year 2023)
Budget Amount
¥2,860,000 (Direct Cost: ¥2,200,000, Indirect Cost: ¥660,000)
Fiscal Year 2024: ¥1,430,000 (Direct Cost: ¥1,100,000, Indirect Cost: ¥330,000)
Fiscal Year 2023: ¥1,430,000 (Direct Cost: ¥1,100,000, Indirect Cost: ¥330,000)
Keywords Acoustic camera / 2D forward looking sonar / Deep learning / Self-supervised learning / 3D reconstruction
Outline of Research at the Start

This research aims to develop a novel method that simultaneously understands the surrounding environment and estimates the ego-motion of an underwater robot using an acoustic camera, by exploiting state-of-the-art deep learning techniques. Both simulation and field experiments will be carried out.
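
For context, the difficulty stems from the acoustic camera's imaging model (the standard formulation for a 2D forward-looking sonar, stated here for reference rather than as anything specific to this project): a scene point at range r, azimuth θ, and elevation φ in the sonar frame,

\[
\mathbf{p} = \begin{bmatrix} X \\ Y \\ Z \end{bmatrix}
           = \begin{bmatrix} r\cos\phi\cos\theta \\ r\cos\phi\sin\theta \\ r\sin\phi \end{bmatrix},
\qquad
\pi(\mathbf{p}) = \begin{bmatrix} r \\ \theta \end{bmatrix},
\]

is recorded in the acoustic image only through (r, θ); the elevation angle φ is discarded at image formation. Recovering φ (and hence 3D structure) from image sequences, and the ego-motion relating consecutive frames, are the two quantities the learning method must supply.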

Outline of Annual Research Achievements

Our goal is to estimate 3D information and motion from acoustic video in a self-supervised manner. During this fiscal year, 3D information was successfully derived from acoustic image sequences using self-supervised learning, and the results met our expectations; our first paper was presented at the IROS 2023 robotics conference. The next step is to estimate motion from acoustic video. To this end, we explored the theoretical side, developing a geometric model and verifying its feasibility, and early results have demonstrated the viability of the method; a detailed paper on this work is being drafted. We also conducted an experiment in a large-scale water tank and successfully gathered data that will further support our findings. In addition, we updated our simulator to produce more realistic acoustic images for better evaluation, and a corresponding paper on the simulator is being written; this simulator work should significantly strengthen our ability to evaluate the proposed techniques in practical scenarios.
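
As a rough, generic illustration of the self-supervised principle (a sketch under stated assumptions, not the published method: the elevation map, relative pose, bin spacings, nearest-bin sampling, and L1 loss below are all placeholder choices), a per-pixel elevation prediction lets one acoustic frame be back-projected to 3D, moved by the relative pose, re-projected, and compared against the neighbouring frame:

    import numpy as np

    def backproject(r, theta, phi):
        # spherical (range, azimuth, elevation) -> Cartesian point in the sonar frame
        return np.stack([r * np.cos(phi) * np.cos(theta),
                         r * np.cos(phi) * np.sin(theta),
                         r * np.sin(phi)], axis=-1)

    def project(points):
        # Cartesian -> (range, azimuth); the sonar discards the elevation angle
        rng = np.linalg.norm(points, axis=-1)
        azi = np.arctan2(points[..., 1], points[..., 0])
        return rng, azi

    def warp_loss(img_t, img_t1, phi_map, R, t, r_bins, th_bins):
        # Self-supervision by view synthesis: warp frame t toward frame t+1 using
        # the predicted elevation map and the relative pose, then compare intensities.
        rr, tt = np.meshgrid(r_bins, th_bins, indexing="ij")   # pixel grid of frame t
        pts = backproject(rr, tt, phi_map)                     # (H, W, 3) in frame t
        pts1 = pts @ R.T + t                                   # express in frame t+1
        r1, a1 = project(pts1)
        # crude nearest-bin lookup into frame t+1 (placeholder for proper sampling)
        ri = np.clip(np.searchsorted(r_bins, r1), 0, len(r_bins) - 1)
        ai = np.clip(np.searchsorted(th_bins, a1), 0, len(th_bins) - 1)
        warped = img_t1[ri, ai]
        return np.mean(np.abs(img_t - warped))                 # L1 photometric loss

In practice the elevation map would come from a network trained end-to-end against such a reconstruction loss; the sketch only shows why an elevation estimate and a relative pose together suffice to synthesize one acoustic view from another without ground-truth 3D labels.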

Current Status of Research Progress

2: Research has progressed on the whole more than it was originally planned.

Reason

This project aims to address two challenges simultaneously using acoustic video: first, estimating 3D information, and second, deriving motion data. The first challenge has been successfully resolved. We have now shifted our focus to the second challenge. Theoretical analyses and preliminary results have demonstrated the feasibility of our approach. This represents a significant breakthrough in the field, and we are confident that it will lead to a noteworthy publication.

Strategy for Future Research Activity

In the next phase, this project will concentrate on estimating motion information from acoustic video using a self-supervised method. Initially, we will revisit and refine the theoretical foundation. Following that, numerical calculations will be performed for verification purposes. Subsequently, we will conduct simulation experiments using synthetic images to evaluate our approach. Finally, we plan to carry out experiments in a water tank to collect a real dataset. After analyzing these results, we aim to compile and submit our findings to a top-tier robotics journal.

Report

(1 result)

  • 2023 Research-status Report

Research Products

(2 results)

Journal Article (1 result) (of which Peer Reviewed: 1) / Presentation (1 result) (of which Int'l Joint Research: 1)

  • [Journal Article] Acoustic-N-Point for Solving 2D Forward Looking Sonar Pose Estimation (2024)

    • Author(s)
      Wang Yusheng, Ji Yonghoon, Tsuchiya Hiroshi, Ota Jun, Asama Hajime, Yamashita Atsushi
    • Journal Title

      IEEE Robotics and Automation Letters

      Volume: 9 Issue: 2 Pages: 1652-1659

    • DOI

      10.1109/lra.2024.3349968

    • Related Report
      2023 Research-status Report
    • Peer Reviewed
  • [Presentation] Motion Degeneracy in Self-supervised Learning of Elevation Angle Estimation for 2D Forward-Looking Sonar (2023)

    • Author(s)
      Yusheng Wang, Yonghoon Ji, Chujie Wu, Hiroshi Tsuchiya, Hajime Asama and Atsushi Yamashita
    • Organizer
      2023 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2023)
    • Related Report
      2023 Research-status Report
    • Int'l Joint Research


Published: 2023-09-11   Modified: 2024-12-25  
