Accurate Estimation of Orientation and Location by Sensor Fusion
Project/Area Number | 15300053 |
Research Category | Grant-in-Aid for Scientific Research (B) |
Allocation Type | Single-year Grants |
Section | General |
Research Field | Perception information processing / Intelligent robotics |
Research Institution | Chiba University |
Principal Investigator | ZEN Heitoh, Chiba University, Institute of Media and Information Technology, Professor (20216568) |
Co-Investigator (Kenkyū-buntansha) |
IMIYA Jun, Chiba University, Institute of Media and Information Technology, Professor (10176505)
IMAIZUMI Takashi, Chiba University, Institute of Media and Information Technology, Associate Professor (70242287)
HASEGAWA Tameharu, Chiba Institute of Technology, Faculty of Information and Computer Science, Assistant Professor (40348363) |
Project Period (FY) | 2003 – 2004 |
Project Status | Completed (Fiscal Year 2004) |
Budget Amount | ¥10,900,000 (Direct Cost: ¥10,900,000) |
Fiscal Year 2004: ¥5,400,000 (Direct Cost: ¥5,400,000)
Fiscal Year 2003: ¥5,500,000 (Direct Cost: ¥5,500,000)
Keywords | sensor fusion / augmented reality / three dimensional map / image processing |
Research Abstract |
In this research, a sensor fusion method for estimating the pose and location of a camera in outdoor environments is developed. The pose and location of the camera (viewer) are indispensable in Augmented Reality (AR) systems. AR is a growing area of virtual reality research that generates a composite view for the user: a combination of the real scene viewed by the user and a virtual scene generated by the computer, which augments the scene with additional information. This research proposes a hybrid sensing system for outdoor AR that enables accurate positioning and pose estimation of the viewer. GPS and gyro sensing provide the initial estimate, and a vision technique is then employed to estimate the position and pose accurately. The boundary edges between buildings and the sky (i.e., skylines) are chosen as the image-derived feature for the fine search. Matching is performed between the image-derived boundary edges and the skyline synthesized from a 3D map, and the similarity is defined as a DP matching score. The method is implemented on a laptop PC and evaluated on a test site, the campus of Chiba University, where a three-dimensional model of the buildings was pre-built using surveying instruments. The estimated pose and position appear to be sufficient for annotating the buildings in typical campus scenes. Outdoor pictures are often blurred or out of focus; several image processing techniques were tried to extract the skylines, and improvements in the estimates were demonstrated.
|
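The abstract describes the fine search as matching an image-derived skyline against skylines synthesized from a 3D building map, scored by DP matching. The following is a minimal sketch of that idea, not the project's actual implementation: it assumes skylines are represented as per-column height profiles, uses a simple DTW-style DP cost, and the function names, gap penalty, and candidate-pose generation are illustrative assumptions.

```python
import numpy as np


def dp_matching_score(skyline_a, skyline_b, gap_cost=5.0):
    """DP matching cost between two per-column skyline height profiles.

    Lower cost means the two skylines are more similar. The gap cost is an
    assumed penalty for skipping a column (e.g. where a building edge is
    missing because the image is blurred).
    """
    n, m = len(skyline_a), len(skyline_b)
    dp = np.full((n + 1, m + 1), np.inf)
    dp[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            diff = abs(skyline_a[i - 1] - skyline_b[j - 1])
            dp[i, j] = min(
                dp[i - 1, j - 1] + diff,   # match the two columns
                dp[i - 1, j] + gap_cost,   # skip a column of skyline_a
                dp[i, j - 1] + gap_cost,   # skip a column of skyline_b
            )
    return dp[n, m]


def refine_pose(image_skyline, render_skyline, candidate_poses):
    """Fine search: keep the candidate pose whose synthesized skyline
    (rendered from the 3D building map) best matches the image skyline.
    """
    best_pose, best_cost = None, float("inf")
    for pose in candidate_poses:
        cost = dp_matching_score(image_skyline, render_skyline(pose))
        if cost < best_cost:
            best_pose, best_cost = pose, cost
    return best_pose, best_cost
```

In use, `render_skyline` would project the pre-built building model for a given position and orientation and return its skyline profile, and the candidate poses would be sampled in a small neighborhood around the GPS/gyro initial estimate, which keeps the exhaustive search tractable on a laptop PC.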
Report (3 results)
Research Products (69 results)
[Journal Article] Traffic Monitoring Using Multi-Viewpoint Video (2005)
Author(s): 久保田英和, 全 へい東, 長谷川為春
Journal Title: IPSJ SIG Technical Report (Computer Vision and Image Media), 2005-CVIM-148
Pages: 175-180
Description: From "Summary of Research Results Report (Japanese)"