
A study of 3D data of the dynamic human body obtained by photogrammetry

Research Project

Project/Area Number 18K00211
Research Category

Grant-in-Aid for Scientific Research (C)

Allocation Type Multi-year Fund
Section General
Review Section Basic Section 01070: Theory of art practice-related
Research Institution Kyoto City University of Arts

Principal Investigator

YOSHIOKA TOSHINAO  Kyoto City University of Arts, Faculty of Fine Arts / Graduate School of Fine Arts, Associate Professor (80329870)

Co-Investigator (Kenkyū-buntansha) 村上 史明  University of Tsukuba, Faculty of Art and Design, Assistant Professor (30512884)
Project Period (FY) 2018-04-01 – 2022-03-31
Project Status Completed (Fiscal Year 2021)
Budget Amount
¥4,420,000 (Direct Cost: ¥3,400,000, Indirect Cost: ¥1,020,000)
Fiscal Year 2020: ¥390,000 (Direct Cost: ¥300,000, Indirect Cost: ¥90,000)
Fiscal Year 2019: ¥780,000 (Direct Cost: ¥600,000, Indirect Cost: ¥180,000)
Fiscal Year 2018: ¥3,250,000 (Direct Cost: ¥2,500,000, Indirect Cost: ¥750,000)
Keywords photogrammetry / 3DCG / photographic surveying / human body / 4D / artistic expression / animation
Outline of Final Research Achievements

Photogrammetry reconstructs three-dimensional shape from photographs of an object. By testing shooting methods, camera settings, lighting, and related conditions, we derived baseline guidelines for taking photographs optimally suited to photogrammetry.
The success or failure of photogrammetry depends not only on the software used but also on the photographs supplied to it, so these findings are useful for improving photogrammetric capture. To put the acquired 3D data of the human body to use in creative expression, we judged it necessary to evaluate data viewing from the creator's point of view: the optimal speed of rotation and movement under mouse control, and the wide-angle and telephoto distortion characteristic of camera lenses. We also verified the handling of 4D data, that is, 3D data with a time axis added, and derived an optimal viewing environment.
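The "4D data" described above is a sequence of 3D captures indexed by time. How a viewer picks which captured mesh to show at a given playback moment can be sketched as follows; this is a hypothetical illustration, not code from the project, and the `Frame4D` type and `meshUrl` field are assumptions for the example.

```typescript
// Hypothetical sketch of 4D playback: a sequence of 3D captures, each
// stamped with its time offset from the start of the motion.
interface Frame4D {
  time: number;    // seconds from the start of the capture
  meshUrl: string; // location of the 3D mesh for this instant (assumed field)
}

// Return the index of the frame to display at playback time t (seconds),
// looping so the captured motion repeats.
function frameIndexAt(t: number, frames: Frame4D[]): number {
  if (frames.length === 0) throw new Error("no frames");
  const duration = frames[frames.length - 1].time;
  const looped = duration > 0 ? t % duration : 0;
  // A linear scan is fine for short captures; use binary search for long ones.
  for (let i = frames.length - 1; i >= 0; i--) {
    if (frames[i].time <= looped) return i;
  }
  return 0;
}

const frames: Frame4D[] = [0, 0.1, 0.2, 0.3].map((time, i) => ({
  time,
  meshUrl: `mesh_${i}.glb`,
}));
console.log(frameIndexAt(0.25, frames)); // → 2
```

Holding the last frame whose timestamp has passed (rather than interpolating) matches captures taken at a fixed shutter interval, where each mesh represents one frozen instant of the motion.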

Academic Significance and Societal Importance of the Research Achievements

Capturing an instant of human movement, replacing it with an original character, exaggerating it, and drawing it as a still image: such creative activity has become commonplace among professionals and amateurs alike. However, the techniques and environments for capturing an instant of human motion as a three-dimensional form are still developing. In this study, we first verified which photographic methods yield the best survey results when acquiring an instant of human motion as 3D data with photogrammetry, and derived a correlation between the photographs and the measurement results that does not depend on any particular application.
We also developed a web-browser-based viewer in which that data can be adjusted freely without specialist knowledge, and examined a 3D data viewing environment that creators can use without stress.
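The viewing-environment study compares wide-angle and telephoto distortion; in a web viewer the corresponding control is the virtual camera's field of view. A minimal sketch of that conversion follows; the full-frame sensor width and the helper name are assumptions for illustration, not details from the report.

```typescript
// Hypothetical helper: convert a focal length (mm, 35mm-equivalent) to the
// horizontal field-of-view angle a virtual camera would need, in degrees.
// Short focal lengths (wide-angle) exaggerate perspective; long focal
// lengths (telephoto) flatten it.
const SENSOR_WIDTH_MM = 36; // width of a full-frame (35mm) sensor, assumed

function focalLengthToFovDeg(focalMm: number): number {
  return (2 * Math.atan(SENSOR_WIDTH_MM / (2 * focalMm)) * 180) / Math.PI;
}

console.log(focalLengthToFovDeg(18)); // → 90 (wide-angle)
console.log(focalLengthToFovDeg(100).toFixed(1)); // ≈ 20.4 (telephoto)
```

Exposing focal length rather than a raw angle lets a creator reason in familiar photographic terms when choosing how much perspective distortion the viewer applies.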

Report

(5 results)
  • 2021 Annual Research Report / Final Research Report (PDF)
  • 2020 Research-status Report
  • 2019 Research-status Report
  • 2018 Research-status Report
  • Research Products

    (8 results)


Presentation (1 result) (of which Int'l Joint Research: 1) / Remarks (7 results)

  • [Presentation] Process for creating a lithograph by laser and drawing machine (2021)

    • Author(s)
      吉岡俊直
    • Organizer
      INPACT11 international printmaking conference in Hong Kong
    • Related Report
      2019 Research-status Report
    • Int'l Joint Research
  • [Remarks] Research on the acquisition and enjoyment of 3D data of the dynamic human body by photogrammetry

    • URL

      http://photogrammetry.work/

    • Related Report
      2021 Annual Research Report
  • [Remarks] Yoshioka Toshinao research portal: 4D data browser

    • URL

      http://photogrammetry.work/web-3d-viewer/4d-browser.html

    • Related Report
      2021 Annual Research Report
  • [Remarks] Yoshioka Toshinao research portal: two-dimensional works using photogrammetry technology

    • URL

      http://photogrammetry.work/yosioka%20HP/home.html

    • Related Report
      2021 Annual Research Report
  • [Remarks] Yoshioka Toshinao research portal: 3D data browsing page

    • URL

      http://photogrammetry.work/web-3d-viewer/3d-browser.html

    • Related Report
      2021 Annual Research Report
  • [Remarks] Research on the acquisition and enjoyment of 3D data of the dynamic human body by photogrammetry

    • URL

      http://photogrammetry.work/

    • Related Report
      2020 Research-status Report
  • [Remarks] Research on the acquisition and enjoyment of 3D data of the dynamic human body by photogrammetry

    • URL

      http://www.photogrammetry.work/

    • Related Report
      2019 Research-status Report
  • [Remarks] Research on the acquisition and enjoyment of 3D data of the dynamic human body by photogrammetry

    • URL

      http://www.photogrammetry.work

    • Related Report
      2018 Research-status Report


Published: 2018-04-23   Modified: 2023-01-30  
