
Analysis of Object Shape and Illumination from Image Sequence Using Unconventional Models

Research Project

Project/Area Number 20K11866
Research Category

Grant-in-Aid for Scientific Research (C)

Allocation Type: Multi-year Fund
Section: General
Review Section: Basic Section 61010: Perceptual information processing-related
Research Institution: Okayama University

Principal Investigator

Migita Tsuyoshi  Okayama University, Faculty of Environmental, Life, Natural Science and Technology, Assistant Professor (90362954)

Project Period (FY) 2020-04-01 – 2024-03-31
Project Status Completed (Fiscal Year 2023)
Budget Amount
¥3,510,000 (Direct Cost: ¥2,700,000, Indirect Cost: ¥810,000)
Fiscal Year 2022: ¥1,170,000 (Direct Cost: ¥900,000, Indirect Cost: ¥270,000)
Fiscal Year 2021: ¥1,300,000 (Direct Cost: ¥1,000,000, Indirect Cost: ¥300,000)
Fiscal Year 2020: ¥1,040,000 (Direct Cost: ¥800,000, Indirect Cost: ¥240,000)
Keywords: parameter estimation / image sequence / nonlinear optimization / Jacobian matrix / ray tracing / photometric stereo / neural fields / Structure from Motion / GPU / inverse problem / uncalibrated photometric stereo / 3D shape reconstruction / object shape reconstruction / image generation model
Outline of Research at the Start

This project aims to advance shape-recovery methods (uncalibrated photometric stereo) that estimate the 3D shape of objects captured in an image sequence. The basic approach is to estimate the parameters of a computer-generated (CG) image so that it matches a physically captured image, thereby separating the object's shape and color from environmental factors such as light sources. We investigate a framework that can incorporate recent advances in CG technology and deep learning. Obtaining stable object information independent of the environment is useful in a wide range of applications.
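The "match CG to photograph" idea above (analysis-by-synthesis) can be illustrated with a minimal sketch: a toy Lambertian renderer synthesizes an image from known per-pixel surface normals, and gradient descent on the photometric error recovers an unknown light direction. The shading model and all names here are illustrative assumptions, not the project's actual formulation.

```python
import numpy as np

def render(normals, light):
    """Lambertian shading: intensity = max(0, n . l) per pixel."""
    return np.clip(normals @ light, 0.0, None)

rng = np.random.default_rng(0)

# 100 random unit surface normals stand in for a known object shape.
normals = rng.normal(size=(100, 3))
normals /= np.linalg.norm(normals, axis=1, keepdims=True)

# The "physically captured" image is synthesized from a hidden light.
true_light = np.array([0.3, 0.5, 0.8])
true_light /= np.linalg.norm(true_light)
observed = render(normals, true_light)

# Gradient descent on the squared photometric error. The gradient of
# the clipped shading is n where the pixel is lit and 0 in shadow.
light = np.array([0.0, 0.0, 1.0])
for _ in range(500):
    shaded = render(normals, light)
    residual = shaded - observed
    lit = (normals @ light) > 0
    grad = normals[lit].T @ residual[lit]
    light -= 0.05 * grad
light /= np.linalg.norm(light)
```

With the synthesized image exactly reproducible, the recovered direction converges to the hidden one; in the actual project the renderer is a ray tracer and the unknowns include shape and texture as well.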

Outline of Final Research Achievements

In this research, we developed a method for estimating parameters such as object shape and illumination from a given set of images. The problem is formulated as
minimization of the difference between the given and synthesized images. The synthesis employs a ray-tracing algorithm, which allows reflections and shadows, often considered harmful to estimation, to be exploited when estimating shape and/or illumination parameters. The method also makes use of multiple viewpoints. To this end, a new image representation model is introduced that efficiently computes the image and its gradient with respect to locally-affected parameters, i.e., texture intensity values. This is combined with a difference-quotient-based approximation with respect to globally-affected parameters, i.e., object shape and light positions, to construct a joint estimation method.
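The split between analytic gradients for locally-affected parameters (texture intensities) and difference quotients for globally-affected parameters can be sketched with a deliberately simplified image model standing in for the ray tracer; the scalar "gain" parameter and all names are assumptions for illustration.

```python
import numpy as np

def synthesize(texture, gain):
    """Toy image model: pixel i = gain * texture[i].
    A stand-in for the ray tracer; 'gain' plays the role of a
    globally-affected parameter such as a light position."""
    return gain * texture

def loss(texture, gain, observed):
    return 0.5 * np.sum((synthesize(texture, gain) - observed) ** 2)

rng = np.random.default_rng(1)
true_tex = rng.uniform(0.2, 1.0, size=64)
true_gain = 1.7
observed = synthesize(true_tex, true_gain)

tex = np.full(64, 0.5)
gain = 1.0
eps = 1e-5
for _ in range(2000):
    residual = synthesize(tex, gain) - observed
    # Analytic gradient w.r.t. the locally-affected texture values:
    # cheap, one term per pixel.
    g_tex = gain * residual
    # Forward difference quotient w.r.t. the globally-affected
    # parameter: two loss evaluations, no analytic derivative needed.
    g_gain = (loss(tex, gain + eps, observed) - loss(tex, gain, observed)) / eps
    tex -= 0.1 * g_tex
    gain -= 0.001 * g_gain
```

Note the gauge ambiguity of this toy model: only the product gain * texture is identifiable, mirroring the scale ambiguities familiar from uncalibrated photometric stereo; the joint descent still drives the synthesized image to match the observed one.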

Academic Significance and Societal Importance of the Research Achievements

The appearance of an object in an image varies greatly with the positions of the camera and light sources, so removing these influences is essential for accurately estimating the object's shape and color. Once this is possible, it can be applied to generating 3D models from images (photogrammetry) and to analyzing the state of the human body, and it is also useful as a foundation for HCI, personal authentication, autonomous driving, and so on.
In this project, we investigated image analysis that incorporates hardware-accelerated ray tracing, which has become widespread in recent years, and deep-learning-based methods (both of which exploit GPUs), and established techniques that will serve as a basis for further research.

Report

(5 results)
  • 2023 Annual Research Report   Final Research Report (PDF)
  • 2022 Research-status Report
  • 2021 Research-status Report
  • 2020 Research-status Report
  • Research Products

    (4 results)


Presentation (4 results) (of which Int'l Joint Research: 3 results)

  • [Presentation] Estimation of Neural Field Texture and Scene Parameters using Differentiable Raytracing (2024)

    • Author(s)
      Tsuyoshi MIGITA, Norikazu TAKAHASHI
    • Organizer
      The 30th International Workshop on Frontiers of Computer Vision (IW-FCV2024)
    • Related Report
      2023 Annual Research Report
    • Int'l Joint Research
  • [Presentation] Estimation of Object Shape, Texture, and Light Sources by Differentiable Ray Tracing (2023)

    • Author(s)
      Tsuyoshi Migita, Norikazu Takahashi
    • Organizer
      IEICE Technical Committee on Pattern Recognition and Media Understanding (PRMU)
    • Related Report
      2023 Annual Research Report
  • [Presentation] Uncalibrated Photometric Stereo using Superquadrics with Texture Estimation (2022)

    • Author(s)
      Tsuyoshi Migita, Ayane Okada and Norikazu Takahashi
    • Organizer
      IW-FCV2022, The 28th International Workshop on Frontiers of Computer Vision
    • Related Report
      2021 Research-status Report
    • Int'l Joint Research
  • [Presentation] Uncalibrated Photometric Stereo using Superquadrics with Cast Shadow (2021)

    • Author(s)
      T. Nasu, T. Migita and N. Takahashi
    • Organizer
      The 27th International Workshop on Frontiers of Computer Vision (IW-FCV2021)
    • Related Report
      2020 Research-status Report
    • Int'l Joint Research


Published: 2020-04-28   Modified: 2025-01-30  
