
2023 Fiscal Year Final Research Report

Interactive humanoid robot design using Bunraku puppet structure and Jo-ha-kyu Principle

Research Project

Project/Area Number 21K17833
Research Category

Grant-in-Aid for Early-Career Scientists

Allocation Type Multi-year Fund
Review Section Basic Section 61050: Intelligent robotics-related
Research Institution Chukyo University (2023)
Tokyo University of Technology (2021-2022)

Principal Investigator

Dong Ran  Chukyo University, School of Engineering, Lecturer (80879891)

Project Period (FY) 2021-04-01 – 2024-03-31
Keywords Robotics / Frequency analysis / Deep learning / Ningyo Joruri (Bunraku puppetry) / Humanoid / Interaction design / Traditional performing arts / Jo-ha-kyu
Outline of Final Research Achievements

This study aims to apply the movements and emotional expressions of the Japanese traditional puppet theater "Ningyo Joruri Bunraku" to robotics. Using motion capture, we analyzed the puppets' movements and identified techniques for emotional expression. Specifically, we confirmed that the puppet's stretching and chest-joint movements synchronize with the voice of the Gidayu (narrative chanting) and the sound of the shamisen (a three-stringed instrument), and that they relate to the mechanism of Jo-ha-kyu (a traditional Japanese concept of modulation). We implemented these Bunraku puppet mechanisms in a robot, developing a robot capable of expressing the stretching techniques. Furthermore, we proposed a method combining frequency-domain motion analysis using the Hilbert-Huang Transform with deep learning, enabling the robot to perform the delicate movements of Bunraku puppets while remaining optimized for its motor performance.
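The frequency-domain analysis mentioned above can be illustrated with a minimal sketch. The full Hilbert-Huang Transform first decomposes a signal into intrinsic mode functions via empirical mode decomposition (available in third-party packages such as PyEMD) and then applies the Hilbert transform to each mode; the sketch below omits the decomposition step and shows only the core Hilbert-transform stage, extracting instantaneous amplitude and frequency from a synthetic joint-angle signal. The sampling rate, signal shape, and the "jo"-to-"kyu" interpretation are illustrative assumptions, not the project's actual data.

```python
import numpy as np
from scipy.signal import hilbert

fs = 120.0  # hypothetical motion-capture sampling rate (Hz)
t = np.arange(0, 2.0, 1.0 / fs)
# Synthetic joint-angle trace: a chirp whose tempo builds up,
# loosely evoking a jo-ha-kyu-style acceleration
signal = np.sin(2 * np.pi * (1.0 + 2.0 * t) * t)

# The Hilbert transform yields the analytic signal; its magnitude is the
# instantaneous amplitude and its unwrapped phase derivative is the
# instantaneous frequency used in Hilbert-Huang-style analysis.
analytic = hilbert(signal)
amplitude = np.abs(analytic)
phase = np.unwrap(np.angle(analytic))
inst_freq = np.diff(phase) / (2 * np.pi) * fs  # Hz

print(f"mean instantaneous frequency: {inst_freq.mean():.1f} Hz")
```

In a pipeline like the one described, such instantaneous-frequency profiles could serve as a compact motion representation for a deep-learning model, which is one plausible reading of the combination the report proposes.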

Free Research Field

Intelligent robotics

Academic Significance and Societal Importance of the Research Achievements

By incorporating the karakuri (internal mechanisms) of Bunraku puppets into a robot's structure, this study attempted, for the first time, to develop a robot capable of expressing stretching techniques in accordance with the Jo-ha-kyu mechanism. The findings can provide a new framework for robot interaction design from the perspective of Japanese traditional performing arts. Furthermore, the fusion of frequency-domain representation and deep learning in motion generation contributes to autonomous interaction generation for humanoids, and is expected to support the future adoption of AI assistants in human society.

Published: 2025-01-30  
