
Feature extraction from vibrotactile data using autoencoder

Research Project

Project/Area Number 21K18680
Research Category

Grant-in-Aid for Challenging Research (Exploratory)

Allocation Type Multi-year Fund
Review Section Medium-sized Section 18: Mechanics of materials, production engineering, design engineering, and related fields
Research Institution Keio University

Principal Investigator

Takemura Kenjiro  Keio University, Faculty of Science and Technology (Yagami), Professor (90348821)

Project Period (FY) 2021-07-09 – 2023-03-31
Project Status Completed (Fiscal Year 2022)
Budget Amount
¥6,370,000 (Direct Cost: ¥4,900,000, Indirect Cost: ¥1,470,000)
Fiscal Year 2022: ¥2,340,000 (Direct Cost: ¥1,800,000, Indirect Cost: ¥540,000)
Fiscal Year 2021: ¥4,030,000 (Direct Cost: ¥3,100,000, Indirect Cost: ¥930,000)
Keywords Tactile sensing / Machine learning / Autoencoder / Convolutional neural network / Wavelet transform / Tactile sensation / Feature extraction / Vibration / Tactile perception / Features
Outline of Research at the Start

This study aims to establish a method for accurately measuring the tactile sensation perceived by humans by using machine learning with an autoencoder to extract features that represent differences in tactile sensation from vibration data obtained when an object is touched. First, vibration data are acquired while a tactile sensor traces an object, and an autoencoder extracts features representing differences in the vibration data. Next, a sensory evaluation experiment on the objects is conducted, and the sensory evaluation results are dimensionally reduced by linear multivariate analysis or a nonlinear autoencoder. Finally, a tactile estimation model is constructed that explains the dimensionally reduced sensory evaluation results in terms of the features representing differences in the vibration data.
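The first step of the pipeline above, compressing vibration data into a low-dimensional feature vector with an autoencoder, can be sketched as follows. This is an illustrative minimal example only, not the project's actual implementation: the synthetic data, layer sizes (64-sample windows compressed to 8 features), and learning rate are all assumptions chosen for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "vibration" windows: sinusoids plus noise (stand-in data only).
def make_windows(n, length=64):
    t = np.linspace(0, 1, length)
    freqs = rng.uniform(2, 10, size=(n, 1))
    return np.sin(2 * np.pi * freqs * t) + 0.05 * rng.standard_normal((n, length))

X = make_windows(200)                # (200, 64)

n_in, n_hid = X.shape[1], 8          # compress 64 samples -> 8 features
W1 = rng.standard_normal((n_in, n_hid)) * 0.1
b1 = np.zeros(n_hid)
W2 = rng.standard_normal((n_hid, n_in)) * 0.1
b2 = np.zeros(n_in)

lr = 0.01
for epoch in range(500):
    H = np.tanh(X @ W1 + b1)         # encoder: hidden feature layer
    Y = H @ W2 + b2                  # linear decoder: reconstruction
    err = Y - X                      # reconstruction error
    # Backpropagate the mean-squared reconstruction loss.
    gW2 = H.T @ err / len(X)
    gb2 = err.mean(axis=0)
    dH = (err @ W2.T) * (1 - H**2)   # tanh derivative
    gW1 = X.T @ dH / len(X)
    gb1 = dH.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

features = np.tanh(X @ W1 + b1)      # learned 8-D feature vectors
print(features.shape)                # (200, 8)
```

In the study itself a deep (multi-layer) autoencoder is used; this single-hidden-layer version only illustrates the principle that the bottleneck activations serve as the extracted features.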

Outline of Final Research Achievements

Vibration information was obtained while a tactile sensor traced a sample. Features were then extracted from the obtained vibration data using a deep autoencoder. Next, we obtained tactile scores when a subject touched the same samples in a sensory evaluation experiment with a seven-point semantic differential method, and constructed a tactile estimation model with the acquired features as input and the sensory evaluation scores as output. In addition to this model, feature vectors were extracted from the vibration data using a convolutional neural network. In this case, we introduced the wavelet transform to obtain a scalogram, a two-dimensional image, of the vibration data; a convolutional neural network is well suited to extracting features from two-dimensional images. As a result, we showed that the tactile score can be predicted with an error of less than one point from the average sensory evaluation value.
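The wavelet-transform step mentioned above turns a one-dimensional vibration trace into a scalogram, a time-frequency image a CNN can consume. A minimal sketch of that conversion, using a hand-rolled Morlet-like continuous wavelet transform in NumPy (the project's actual wavelet, scales, and tooling are not stated in the report, so all of those choices here are assumptions):

```python
import numpy as np

def morlet_scalogram(signal, scales, w0=5.0):
    """Magnitude of a simple Morlet-like CWT.

    Returns a (len(scales), len(signal)) 2-D array usable as a CNN input
    image. Normalization constants are omitted; this is an illustrative
    sketch, not a calibrated wavelet transform.
    """
    out = np.empty((len(scales), len(signal)))
    for i, s in enumerate(scales):
        # Discretized wavelet: Gaussian envelope times a cosine carrier,
        # stretched by scale s and truncated at +/- 4 standard deviations.
        t = np.arange(-4 * s, 4 * s + 1)
        wavelet = np.exp(-(t / s) ** 2 / 2) * np.cos(w0 * t / s)
        wavelet /= np.sqrt(s)
        out[i] = np.abs(np.convolve(signal, wavelet, mode="same"))
    return out

# Example: a chirp-like stand-in for a vibration trace.
fs = 1000
t = np.arange(0, 1, 1 / fs)
sig = np.sin(2 * np.pi * (20 + 40 * t) * t)

scalogram = morlet_scalogram(sig, scales=np.arange(2, 32))
print(scalogram.shape)  # (30, 1000)
```

Each row of the resulting image corresponds to one wavelet scale (roughly, one frequency band), each column to a time step, which is what lets a 2-D convolutional network pick up the temporal changes in vibration spectra.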

Academic Significance and Societal Importance of the Research Achievements

This study demonstrated that, with the aid of machine learning, extracting features suited to the mechanical data obtained from a tactile sensor is effective for tactile estimation. This makes it possible to reveal latent features inherent in vibration data that many previous studies have overlooked, and it has the potential to transform tactile research. In other words, the work is highly significant in enabling quantitative tactile measurement, which has been studied since the 1990s and forms the foundation of tactile research, yet has still not been established. It is also meaningful as a challenge to broaden the scope of machine learning, whose social implementation is accelerating: the results show the possibility of extending machine learning, widely applied to image data, to tactile data.

Report

(3 results)
  • 2022 Annual Research Report
  • Final Research Report (PDF)
  • 2021 Research-status Report

Research Products (3 results)

Journal Article (2 results) (of which Peer Reviewed: 2 results, Open Access: 2 results); Presentation (1 result) (of which Int'l Joint Research: 1 result)

  • [Journal Article] Nonlinear Tactile Estimation Model Based on Perceptibility of Mechanoreceptors Improves Quantitative Tactile Sensing (2022)

    • Author(s)
      Sagara Momoko、Nobuyama Lisako、Takemura Kenjiro
    • Journal Title

      Sensors

      Volume: 22 Issue: 17 Pages: 6697-6697

    • DOI

      10.3390/s22176697

    • Related Report
      2022 Annual Research Report
    • Peer Reviewed / Open Access
  • [Journal Article] A Model for Estimating Tactile Sensation by Machine Learning Based on Vibration Information Obtained while Touching an Object (2021)

    • Author(s)
      Fumiya Ito, Kenjiro Takemura
    • Journal Title

      Sensors

      Volume: 21 Issue: 23 Pages: 7772-7772

    • DOI

      10.3390/s21237772

    • Related Report
      2021 Research-status Report
    • Peer Reviewed / Open Access
  • [Presentation] Texture classification model based on temporal changes in vibration using wavelet transform (2022)

    • Author(s)
      Momoko Sagara, Kenjiro Takemura
    • Organizer
      IEEE SENSORS 2022
    • Related Report
      2022 Annual Research Report
    • Int'l Joint Research

Published: 2021-07-13   Modified: 2024-01-30  
