
2023 Fiscal Year Final Research Report

A time advance estimation method for human motions using stochastic resonance and multiple EMG sensors

Research Project

Project/Area Number 18K04071
Research Category

Grant-in-Aid for Scientific Research (C)

Allocation Type Multi-year Fund
Section General
Review Section Basic Section 20020:Robotics and intelligent system-related
Research Institution Oita University (2019-2023)
Sasebo National College of Technology (2018)

Principal Investigator

Sadahiro Teruyoshi  Oita University, Faculty of Science and Technology, Associate Professor (40424676)

Project Period (FY) 2018-04-01 – 2024-03-31
Keywords Multi-sensor / Stochastic resonance / Electromyography (EMG) / Electromechanical delay / Feature extraction / Operational skill proficiency
Outline of Final Research Achievements

Electromyography (EMG) can be an efficient carrier of human intentions to machines when it is used as a machine input signal. In conventional research, the time lag from the onset of EMG activity to the actual human joint movement, called the electromechanical delay (EMD), has been ignored. Low-pass filters are widely used to preprocess EMG signals, although their phase lag shortens the exploitable EMD. In this study, we proposed a preprocessing method based on stochastic resonance and multiple sensors that does not shorten the EMD. By combining several estimation models, we proposed methods that estimate human movements in advance of the EMD, which is approximately 100 to 200 ms, and experimentally validated their usefulness.
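
The report itself contains no implementation, but the core idea of noise-enhanced, multi-sensor preprocessing can be illustrated with a minimal sketch. The following Python example shows suprathreshold stochastic resonance in general terms; the function name, parameter values, and toy signal are illustrative assumptions, not the authors' actual method.

```python
import numpy as np

def sr_multisensor_envelope(emg, n_sensors=16, noise_std=0.1,
                            threshold=0.08, rng=None):
    """Hypothetical sketch: each of n_sensors adds independent Gaussian
    noise to a weak EMG signal and applies a hard threshold; averaging
    the binary outputs recovers sub-threshold activity without the
    phase lag a low-pass filter would introduce."""
    rng = np.random.default_rng() if rng is None else rng
    emg = np.asarray(emg, dtype=float)
    # Independent noise per sensor lets weak activity cross the threshold.
    noisy = emg[None, :] + rng.normal(0.0, noise_std,
                                      size=(n_sensors, emg.size))
    fired = (noisy > threshold).astype(float)
    return fired.mean(axis=0)  # population rate ~ activation envelope

# Toy usage: a burst that never crosses the threshold on its own
# becomes visible in the averaged multi-sensor output.
t = np.linspace(0.0, 1.0, 1000)
weak_burst = 0.03 * np.exp(-((t - 0.5) ** 2) / 0.005)
envelope = sr_multisensor_envelope(weak_burst)
```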

Free Research Field

Mechatronics

Academic Significance and Societal Importance of the Research Achievements

Although we had previously studied advance estimation of human motion with a focus on the EMD, no similar research existed, and this project was carried out as a development of that earlier work. The proposed method that was the objective at the time of application was experimentally validated. As the research progressed, we also found that the keys to solving the problem lie in how the input is preprocessed and in the choice of estimation model. We therefore proposed, beyond the original method, two additional input preprocessing methods and an approach using three estimation models, yielding techniques that estimate motion earlier and with better accuracy. These proposed methods are applicable, for example, to reducing VR sickness during gaze shifts in increasingly important VR environments.
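
To make the idea of combining estimation models for advance prediction concrete, here is a minimal sketch under assumed conditions (1 kHz sampling, a 150-sample ≈ 150 ms horizon, toy data). The lag sets, the least-squares regressors, and the simple averaging are illustrative assumptions, not the specific models used in this project.

```python
import numpy as np

def lagged_features(envelope, lags):
    """Stack delayed copies of the EMG envelope; row i corresponds to
    sample time max(lags) + i."""
    m, n = max(lags), len(envelope) - max(lags)
    return np.column_stack([envelope[m - lag : m - lag + n] for lag in lags])

def fit_advance_model(envelope, angle, horizon, lags):
    """Least-squares map from current EMG features to the joint angle
    `horizon` samples ahead (~the 100-200 ms EMD at 1 kHz sampling)."""
    X, y = lagged_features(envelope, lags), angle[max(lags) + horizon:]
    m = min(len(X), len(y))
    A = np.c_[X[:m], np.ones(m)]                  # bias column
    w, *_ = np.linalg.lstsq(A, y[:m], rcond=None)
    return w

def predict(envelope, w, lags):
    X = lagged_features(envelope, lags)
    return np.c_[X, np.ones(len(X))] @ w

# Toy data: the joint angle follows the EMG envelope after a 150-sample EMD.
rng = np.random.default_rng(0)
env = np.convolve(np.abs(rng.normal(size=2000)), np.ones(50) / 50, "same")
ang = np.concatenate([np.zeros(150), env[:-150]])

# Combine several models (different lag sets) by averaging, echoing the
# report's use of multiple estimation models.
lag_sets = [(0, 5, 10), (0, 10, 20), (0, 20, 40)]
preds = [predict(env, fit_advance_model(env, ang, 150, ls), ls)
         for ls in lag_sets]
n = min(map(len, preds))
combined = np.mean([p[-n:] for p in preds], axis=0)  # align recent samples
```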

Published: 2025-01-30  
