
2019 Fiscal Year Final Research Report

Modeling of "grace" characteristics based on dance motion -Motion and Form-

Research Project

Project/Area Number 17K00393
Research Category

Grant-in-Aid for Scientific Research (C)

Allocation Type Multi-year Fund
Section General
Research Field Kansei informatics
Research Institution Osaka Institute of Technology

Principal Investigator

UEDA Etsuko  Osaka Institute of Technology, Faculty of Robotics & Design, Professor (90379529)

Co-Investigator (Kenkyū-buntansha) KOEDA Masanao (小枝 正直)  Osaka Electro-Communication University, Faculty of Information Science and Arts, Associate Professor (10411232)
TAKEMURA Kentaro (竹村 憲太郎)  Tokai University, School of Information Science and Technology, Associate Professor (30435440)
NAKAMURA Takayuki (中村 恭之)  Wakayama University, Faculty of Systems Engineering, Professor (50291969)
IIDA Kenichi (飯田 賢一)  National Institute of Technology, Nara College, Department of Electronic Control Engineering, Professor (70290773)
Project Period (FY) 2017-04-01 – 2020-03-31
Keywords Graceful motion / Motion capture / Motion analysis / Classical dance
Outline of Final Research Achievements

We aim to induce a favorable impression by extracting "graceful" features from dance movements and implementing them in a robot. During the research period we obtained the following two results. (1) We proposed a method for extracting the curvature change of the surface swept by the arm in space, and characterized it using a Gaussian map. (2) We performed an analysis that considers both "form" and "motion". Because the impression a movement makes is influenced by the movement of the limbs, we developed a method to extract limb movements from dance motions and to analyze the S-shaped trajectory using limb movements alone. In addition, in order to handle "motion" and "form" simultaneously, we developed a representation of motion as a spatio-temporal volume, which allows us to analyze simple motions.
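As an illustration of the Gauss-map idea mentioned in point (1), the sketch below maps the unit normals of the surface swept by a shoulder-to-wrist segment onto the unit sphere and measures their angular spread. This is a minimal sketch under assumed inputs (per-frame shoulder and wrist marker positions); the function names and the spread measure are illustrative, not the authors' implementation.

```python
import numpy as np

def gauss_map_normals(shoulder, wrist):
    """Unit normals of the ruled surface swept by the shoulder-wrist
    segment over time. Inputs are (T, 3) arrays of marker positions.

    For each pair of consecutive frames, the facet between them is
    spanned by the arm direction and the shoulder's displacement; the
    normalized facet normal is a point on the unit sphere (the Gauss
    map of the swept surface).
    """
    normals = []
    for t in range(len(shoulder) - 1):
        e1 = wrist[t] - shoulder[t]          # along the arm at frame t
        e2 = shoulder[t + 1] - shoulder[t]   # shoulder motion to frame t+1
        n = np.cross(e1, e2)
        norm = np.linalg.norm(n)
        if norm > 1e-12:                     # skip degenerate facets
            normals.append(n / norm)
    return np.array(normals)

def gauss_map_spread(normals):
    """Angular spread of the Gauss-map image: mean angle (radians)
    between each normal and the mean normal direction. A planar sweep
    gives 0; larger values indicate more curvature change."""
    mean = normals.mean(axis=0)
    mean /= np.linalg.norm(mean)
    cos = np.clip(normals @ mean, -1.0, 1.0)
    return float(np.mean(np.arccos(cos)))
```

With a flat sweep (wrist held at a constant offset while the shoulder translates) the normals coincide and the spread is zero; when the arm twists during the sweep, the Gauss-map image spreads over the sphere and the measure grows.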

Free Research Field

Robotics

Academic Significance and Societal Importance of the Research Achievements

In the research field of generating robot motions based on human movement, no other study has focused on "grace" across the domains of art and aesthetics and proposed a method for quantifying and modeling it; this is an unprecedented and original approach.
If gracefulness can be modeled, it can be used not only in applications to communication, such as robot motion generation, but also as a general criterion for evaluating motion. Applications such as measuring the effectiveness of rehabilitation can also be expected, contributing to improving human quality of life (QOL).

Published: 2021-02-19  
