Project/Area Number | 20K22383 |
Research Category | Grant-in-Aid for Research Activity Start-up |
Allocation Type | Multi-year Fund |
Review Section | 0301: Mechanics of materials, production engineering, design engineering, fluid engineering, thermal engineering, mechanical dynamics, robotics, aerospace engineering, marine and maritime engineering, and related fields |
Research Institution | Chuo University (2021-2023); The University of Tokyo (2020) |
Principal Investigator | |
Project Period (FY) | 2020-09-11 – 2024-03-31 |
Project Status | Completed (Fiscal Year 2023) |
Budget Amount | ¥2,860,000 (Direct Cost: ¥2,200,000, Indirect Cost: ¥660,000)
Fiscal Year 2021: ¥1,430,000 (Direct Cost: ¥1,100,000, Indirect Cost: ¥330,000)
Fiscal Year 2020: ¥1,430,000 (Direct Cost: ¥1,100,000, Indirect Cost: ¥330,000) |
Keywords | 360-degree images / localization / image processing / Localization / Spherical Camera / 360 degree Camera / 360 camera / image gradients / self-localization / 360-degree sensing / spherical camera |
Outline of Research at the Start | A novel 6-DoF indoor localization method using a 360-degree camera is proposed. Because it operates directly on spherical image gradients, the method is very fast. It uses an architectural map containing 3D line information: 2D lines are detected in the images and matched against the 3D map lines to localize the camera. |
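The gradient computation underlying the approach can be sketched as follows. This is an illustrative example only: it assumes an equirectangular projection and simple central differences, with wrap-around at the longitude seam; the project's actual formulation is not given in this record, and the function name `spherical_gradients` is hypothetical.

```python
import numpy as np

def spherical_gradients(equirect):
    """Per-pixel image gradients on an equirectangular 360-degree image.

    Illustrative sketch (not the project's algorithm): central differences,
    with circular padding horizontally because longitude wraps around.
    """
    img = equirect.astype(np.float64)
    # Horizontal gradient with wrap-around at the 0/360-degree seam.
    gx = (np.roll(img, -1, axis=1) - np.roll(img, 1, axis=1)) / 2.0
    # Vertical gradient with edge replication at the poles.
    padded = np.pad(img, ((1, 1), (0, 0)), mode="edge")
    gy = (padded[2:, :] - padded[:-2, :]) / 2.0
    return gx, gy

if __name__ == "__main__":
    # Synthetic image with a single vertical edge at column 8.
    img = np.zeros((8, 16))
    img[:, 8:] = 1.0
    gx, gy = spherical_gradients(img)
    print(gx[4, 8], gy[4, 8])  # prints: 0.5 0.0
```

Note that the edge also produces a gradient response at column 0 (through the seam), which a naive non-wrapping filter would miss.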
Outline of Final Research Achievements | In this research, a novel method was proposed that uses 3D line information to perform 6-degree-of-freedom localization in indoor environments. The method uses 360-degree image gradient information to compute 2D image lines and match them to the 3D lines of the environmental model. |
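The geometric core of matching 2D image lines to 3D model lines can be illustrated with a small sketch: seen from the camera, a 3D line projects onto the unit sphere as a great circle, and a spherical bearing supports that line when it lies on the circle. The function names and the residual definition below are assumptions for illustration, not the project's actual implementation or scoring function.

```python
import numpy as np

def bearing(lon, lat):
    """Unit bearing vector on the sphere from longitude/latitude (radians)."""
    return np.array([np.cos(lat) * np.cos(lon),
                     np.cos(lat) * np.sin(lon),
                     np.sin(lat)])

def line_residual(p0, p1, pixel_bearing):
    """Residual between a pixel bearing and the great circle through the
    projections of 3D points p0, p1 (camera at the origin).

    A 3D line projects to a great circle whose unit normal is the cross
    product of the two projected endpoints; a bearing lies on that circle
    exactly when its dot product with the normal is zero.
    """
    n = np.cross(p0 / np.linalg.norm(p0), p1 / np.linalg.norm(p1))
    n /= np.linalg.norm(n)
    return abs(np.dot(pixel_bearing, n))
```

Under a candidate 6-DoF pose, model lines are transformed into the camera frame before this check; aggregating the residual over image evidence then gives a pose score that can be minimized.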
Academic Significance and Societal Importance of the Research Achievements | This research made it possible to estimate the position and orientation of a 360-degree camera at high speed using only image gradients. It showed that image gradients can be used in textureless environments without relying on explicit detection of lines or other features. |