
A Study on Automatic Construction of Virtual Space from Natural Scenes and Non-Contact Access

Research Project

Project/Area Number: 07555119
Research Category: Grant-in-Aid for Scientific Research (A)
Allocation Type: Single-year Grants
Section: Developmental Research (試験)
Research Field: Information and Communication Engineering
Research Institution: Nagoya University

Principal Investigator

TANIMOTO Masayuki, Nagoya University, Faculty of Engineering, Professor (30109293)

Co-Investigator (Kenkyū-buntansha): MATSUDA Kiichi, Fujitsu Laboratories Ltd., Media Processing Lab., Manager, Researcher
Project Period (FY): 1995 – 1996
Project Status: Completed (Fiscal Year 1996)
Budget Amount: ¥7,100,000 (Direct Cost: ¥7,100,000)
Fiscal Year 1996: ¥1,700,000 (Direct Cost: ¥1,700,000)
Fiscal Year 1995: ¥5,400,000 (Direct Cost: ¥5,400,000)
Keywords: three-dimensional scene / virtual space / multi-viewpoint image set / depth information / non-contact man-machine interface / virtual space access / gesture interface / analysis of three-dimensional motion
Research Abstract

We proposed and constructed a "3D editor" with which users can access a virtual space constructed from a natural scene inside the computer.
We developed the following three algorithms as the key technologies of this system.
(1) Construction of virtual space from natural scenes
We developed a system that acquires depth information of three-dimensional natural scenes from a multi-viewpoint image set captured by several cameras placed on a straight line (or by one camera moving along it).
In the proposed system, depth information is obtained accurately by dividing the three-dimensional structure into multiple layers of different disparity and by correcting wrong disparities caused by occlusion, working from near layers to far ones.
In addition, we developed an algorithm that composes the virtual three-dimensional space by interpolating the acquired depth information, as sketched below.
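As a rough illustration only (not the authors' implementation), the following Python sketch shows the layered idea under simplifying assumptions: equally spaced views along the baseline, a fixed set of candidate disparity layers, and a crude reliability test standing in for the report's near-to-far occlusion correction. All names and thresholds are hypothetical.

import numpy as np

def layered_disparity(images, num_layers):
    # images: list of equally spaced grayscale views (H x W arrays)
    # taken along a straight line; a point on disparity layer d shifts
    # by d*k pixels in the k-th view.
    ref = images[0].astype(np.float64)
    h, w = ref.shape
    cost = np.empty((num_layers, h, w))
    for d in range(num_layers):
        err = np.zeros((h, w))
        for k, img in enumerate(images[1:], start=1):
            shifted = np.roll(img.astype(np.float64), -d * k, axis=1)
            err += (ref - shifted) ** 2
        cost[d] = err
    disparity = cost.argmin(axis=0).astype(np.float64)
    best_cost = cost.min(axis=0)
    # Crude stand-in for the occlusion correction: pixels that match
    # badly even at their best layer are treated as occluded and
    # inherit the nearest reliable disparity to their left.
    occluded = best_cost > np.percentile(best_cost, 90)
    for y in range(h):
        last_good = 0.0
        for x in range(w):
            if occluded[y, x]:
                disparity[y, x] = last_good
            else:
                last_good = disparity[y, x]
    return disparity

The interpolation step that composes the virtual space from the resulting depth map is omitted from this sketch.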
(2) Construction of gesture-based man-machine interface
We constructed a compact gesture-based man-machine interface in which hand gestures are used to access the virtual three-dimensional space.
Two cameras are set on top of the monitor, and users input information by moving a hand in front of it.
The stereo image sequence of the moving hand is analyzed on a workstation, and the hand's motion in three-dimensional space is detected.
In the proposed system, the motion and shape of the hand are detected independently of the user's clothing and the background; a sketch of the stereo measurement step follows.
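As a minimal sketch of the stereo measurement step only, the code below triangulates a 3-D hand position from already-segmented binary hand masks of the two views, assuming rectified, parallel cameras; the clothing- and background-independent hand detection itself is not reproduced here, and all names and parameters are hypothetical.

import numpy as np

def hand_position_3d(mask_left, mask_right, focal_px, baseline_m, cx, cy):
    # mask_left / mask_right: binary hand masks (H x W) from the two
    # cameras on top of the monitor, assumed rectified and parallel.
    # focal_px: focal length in pixels; baseline_m: camera separation
    # in metres; (cx, cy): principal point in pixels.
    def centroid(mask):
        ys, xs = np.nonzero(mask)
        return xs.mean(), ys.mean()

    xl, yl = centroid(mask_left)
    xr, _ = centroid(mask_right)
    disparity = xl - xr  # assumed positive for a hand in front of the cameras
    z = focal_px * baseline_m / disparity  # depth in metres
    x = (xl - cx) * z / focal_px           # lateral position
    y = (yl - cy) * z / focal_px           # vertical position
    return np.array([x, y, z])

Tracking this position frame by frame yields the three-dimensional hand trajectory consumed by the command-recognition step below.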
(3) Command input by gesture
We developed an algorithm to recognize commands from the detected three-dimensional motion of the hand. In the proposed system, a gesture is recognized by decomposing a continuous operation into individual operation units and matching each unit against a model.
Various three-dimensional commands can be interpreted by this algorithm; a sketch of the decomposition-and-matching idea follows.
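The report does not specify how operation units are delimited or how the models are represented. The sketch below assumes units are cut wherever the hand nearly stops, and matches each unit's overall motion direction against a few hypothetical command models by cosine similarity; the model set, threshold, and function names are all assumptions.

import numpy as np

# Hypothetical command models: one dominant 3-D motion direction each.
MODELS = {
    "push":  np.array([0.0, 0.0, -1.0]),
    "pull":  np.array([0.0, 0.0, 1.0]),
    "swipe": np.array([1.0, 0.0, 0.0]),
}

def segment_units(trajectory, pause_speed=0.02):
    # trajectory: (N, 3) array of hand positions; cut a unit wherever
    # the frame-to-frame speed drops below pause_speed.
    speed = np.linalg.norm(np.diff(trajectory, axis=0), axis=1)
    units, start = [], None
    for i, moving in enumerate(speed > pause_speed):
        if moving and start is None:
            start = i
        elif not moving and start is not None:
            units.append(trajectory[start:i + 1])
            start = None
    if start is not None:
        units.append(trajectory[start:])
    return units

def classify_unit(unit):
    # Compare the unit's net motion direction with each model direction.
    motion = unit[-1] - unit[0]
    direction = motion / (np.linalg.norm(motion) + 1e-9)
    return max(MODELS, key=lambda name: float(direction @ MODELS[name]))

# e.g. commands = [classify_unit(u) for u in segment_units(trajectory)]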

Report (3 results)

  • 1996 Annual Research Report
  • 1996 Final Research Report Summary
  • 1995 Annual Research Report

Research Products (4 results)

  • [Publications] Takashi Imori: "Acquisition and Error Correction of Depth Information Using a Multi-Viewpoint Image Set" HDTV'95 Proceedings. 7B-9 - 7B-16 (1995)
    • Description: From the Final Research Report Summary (Japanese)
    • Related Report: 1996 Final Research Report Summary
  • [Publications] T. Imori, H. Morinaga, T. Fujii, T. Kimoto, M. Tanimoto: "Acquisition and Error Correction of Depth Information Using a Multi-Viewpoint Image Set" HDTV'95 Proceedings. 7B-9 - 7B-16 (1995)
    • Description: From the Final Research Report Summary (English)
    • Related Report: 1996 Final Research Report Summary
  • [Publications] Takashi Imori: "Acquisition and Error Correction of Depth Information Using a Multi-Viewpoint Image Set" HDTV'95 Proceedings. 7B-9 - 7B-16 (1995)
    • Related Report: 1996 Annual Research Report
  • [Publications] Takashi Imori: "Acquisition and Error Correction of Depth Information Using a Multi-Viewpoint Image Set" HDTV'95 Proceedings. 7B-9 - 7B-16 (1995)
    • Related Report: 1995 Annual Research Report

Published: 1995-04-01   Modified: 2016-04-21  
