
Emotional communication between human and machine using motion of joints of body

Research Project

Project/Area Number 17500140
Research Category

Grant-in-Aid for Scientific Research (C)

Allocation Type Single-year Grants
Section General
Research Field Sensitivity informatics/Soft computing
Research Institution University of Toyama

Principal Investigator

ISHII Masahiro  University of Toyama, Graduate school of science and engineering, Associate Professor (10272717)

Co-Investigator (Kenkyū-buntansha) SATO Makoto  Tokyo Institute of Technology, Precision and Intelligence Laboratory, Professor (50114872)
KOIKE Yasuharu  Tokyo Institute of Technology, Precision and Intelligence Laboratory, Associate (10302978)
HASEGAWA Syoichi  The University of Electro-Communications, Department of Mechanical Engineering and Intelligent Systems, Associate Professor (10323833)
MARUTA Hidenori  Nagasaki University, Information Media Center, Research Associate (00363474)
Project Period (FY) 2005 – 2006
Project Status Completed (Fiscal Year 2006)
Budget Amount
¥2,700,000 (Direct Cost: ¥2,700,000)
Fiscal Year 2006: ¥1,000,000 (Direct Cost: ¥1,000,000)
Fiscal Year 2005: ¥1,700,000 (Direct Cost: ¥1,700,000)
Keywords emotion / gait / biological motion / body movement / man-machine interaction / pattern recognition
Research Abstract

The purpose of this research was to achieve emotional communication between humans and robots. Numerous attempts have been made to show that facial expressions and voices are effective for conveying and recognizing emotion between humans and robots. Instead of faces and voices, we use body movement as a cue to emotion. We obtained the following results.
We built a movie database for emotion research. A motion-capture system was used to record human gait information: 12 sensors were attached to each walker's principal joints, and actors were asked to express emotion while walking. The database includes gait information from 30 actors.
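As a concrete illustration of such a data layout (the sensor count of 12 and actor count of 30 come from the summary above; the frame count, sampling rate, and recording structure below are assumptions, not details from the project), each gait recording can be viewed as a frames-by-coordinates matrix, with one recording per actor and expressed emotion:

```python
import numpy as np

# Illustrative layout only: 12 joints and 30 actors are from the abstract;
# frame count, sampling rate, and one-recording-per-emotion are assumptions.
N_JOINTS, N_AXES, N_FRAMES = 12, 3, 120   # e.g. ~2 s of gait at 60 Hz
EMOTIONS = ["anger", "grief", "disgust", "joy", "fear"]

# One trial: an N_FRAMES x 36 matrix of joint coordinates per frame.
trial = np.zeros((N_FRAMES, N_JOINTS * N_AXES))

# The database: one recording per (actor, emotion) pair.
database = {
    (actor, emotion): np.zeros((N_FRAMES, N_JOINTS * N_AXES))
    for actor in range(30)
    for emotion in EMOTIONS
}
print(trial.shape, len(database))   # (120, 36) 150
```

Flattening each frame to a 36-dimensional vector is what makes the "high-dimensional biological motion data" amenable to the feature extraction described below.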
We investigated human perception of emotion from gait displays. The gait data were presented as biological motion, and human subjects were asked to report the emotion they perceived from the display. The rate of correct answers was 40%.
We constructed an emotion-recognition system. We studied a method of feature extraction for discriminating human emotions from sensed human gait patterns. We assume that the high-dimensional biological motion data are generated by low-dimensional features whose components are statistically independent. The extracted features were evaluated by how well they discriminated the given biological motion data among five categories: "Anger", "Grief", "Disgust", "Joy", and "Fear". We achieved 40% accuracy for 5-class emotion discrimination with biological motion data from 3 actors.
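The independence assumption above is the standard independent component analysis (ICA) model. A minimal NumPy sketch of that idea follows; the project's actual implementation is not given in this summary, so the algorithm choice (symmetric FastICA with a tanh nonlinearity), the toy data, and all dimensions are illustrative assumptions. Observed high-dimensional gait vectors x are modeled as linear mixtures x = As of a few statistically independent sources s, and the unmixing yields low-dimensional features for classification:

```python
import numpy as np

def fastica_features(X, n_components, n_iter=200, seed=0):
    """Recover statistically independent low-dimensional features from
    high-dimensional observations (symmetric FastICA, tanh nonlinearity)."""
    X = X - X.mean(axis=0)                           # center
    d, E = np.linalg.eigh(np.cov(X, rowvar=False))   # PCA of the data
    idx = np.argsort(d)[::-1][:n_components]         # top components
    K = E[:, idx] / np.sqrt(d[idx])                  # whitening matrix
    Z = X @ K                                        # whitened data
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(n_components, n_components))
    for _ in range(n_iter):
        G = np.tanh(Z @ W.T)                         # g(Wz)
        # Fixed-point update: E[g(Wz) z^T] - diag(E[g'(Wz)]) W
        W_new = (G.T @ Z) / len(Z) - np.diag((1 - G**2).mean(axis=0)) @ W
        U, _, Vt = np.linalg.svd(W_new)              # symmetric decorrelation
        W = U @ Vt
    return Z @ W.T                                   # independent components

# Toy stand-in for gait data: 200 frames of 36-dim joint coordinates
# (12 joints x 3 axes) mixed from 5 independent non-Gaussian sources.
rng = np.random.default_rng(1)
sources = rng.laplace(size=(200, 5))
observed = sources @ rng.normal(size=(5, 36))
features = fastica_features(observed, n_components=5)
print(features.shape)   # (200, 5)
```

The recovered 5-dimensional feature vectors would then feed an ordinary classifier for the five emotion categories; the summary does not say which classifier the project used.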

Report

(3 results)
  • 2006 Annual Research Report
  • Final Research Report Summary
  • 2005 Annual Research Report
  • Research Products

    (11 results)


Journal Article (10 results) Book (1 result)

  • [Journal Article] Feature extraction from Biological motion of human gait patterns for emotion discrimination2007

    • Author(s)
      Hidenori Maruta, Masahiro Ishii
    • Journal Title

      IAPR 10th Conference on Machine Vision Applications

    • Description
      From "Summary of Research Results Report (Japanese text)"
    • Related Report
      2006 Final Research Report Summary
  • [Journal Article] Construction of a visual attention model for moving images2007

    • Author(s)
      釣大輔,石井雅博,唐政,山下和也
    • Journal Title

      Technical Meeting on Computer Vision and Image Media (CVIM)

    • Description
      From "Summary of Research Results Report (Japanese text)"
    • Related Report
      2006 Final Research Report Summary
  • [Journal Article] Living-body detection using animacy perception2007

    • Author(s)
      松田恭平,石井雅博,唐政,山下和也
    • Journal Title

      Technical Meeting on Computer Vision and Image Media (CVIM)

    • NAID

      110006403814

    • Description
      From "Summary of Research Results Report (Japanese text)"
    • Related Report
      2006 Final Research Report Summary
  • [Journal Article] Feature extraction from Biological motion of human gait patterns for emotion discrimination2007

    • Author(s)
      Hidenori Maruta, Masahiro Ishii
    • Journal Title

      IAPR 10th Conference on Machine Vision Applications

    • Related Report
      2006 Annual Research Report
  • [Journal Article] Feature extraction from biological motion data by independent component analysis for emotion discrimination2006

    • Author(s)
      Hidenori Maruta, Masahiro Ishii
    • Journal Title

      IEICE Technical Report Vol.105, No.534

      Pages: 135-139

    • Description
      From "Summary of Research Results Report (Japanese text)"
    • Related Report
      2006 Final Research Report Summary
  • [Journal Article] A study on emotion perception from biological motion2006

    • Author(s)
      Masahiro Ishii, Hidenori Maruta
    • Journal Title

      IEICE Technical Report Vol.105, No.165

      Pages: 123-126

    • NAID

      110003272667

    • Description
      From "Summary of Research Results Report (Japanese text)"
    • Related Report
      2006 Final Research Report Summary
  • [Journal Article] Feature extraction from biological motion data by independent component analysis for emotion discrimination2006

    • Author(s)
      Hidenori Maruta, Masahiro Ishii
    • Journal Title

      IEICE Technical Report Vol.105, No.534

      Pages: 135-139

    • Related Report
      2006 Annual Research Report
  • [Journal Article] On feature extraction from biological motion data by ICA for emotion discrimination2006

    • Author(s)
      Hidenori Maruta, Masahiro Ishii
    • Journal Title

      IEICE Technical Report PRMU2005-172

      Pages: 135-140

    • NAID

      110004075758

    • Related Report
      2005 Annual Research Report
  • [Journal Article] A study on emotion perception from biological motion2005

    • Author(s)
      Masahiro Ishii, Hidenori Maruta
    • Journal Title

      IEICE Technical Report HIP2005-42

      Pages: 123-126

    • NAID

      110003272667

    • Related Report
      2005 Annual Research Report
  • [Journal Article] Feature extraction from Biological motion of human gait patterns for emotion discrimination

    • Author(s)
      Hidenori Maruta, Masahiro Ishii
    • Journal Title

      IAPR 10th Conference on Machine Vision Applications
    • Description
      From "Summary of Research Results Report (English text)"
    • Related Report
      2006 Final Research Report Summary
  • [Book] The Deceived Brain (Damasareru Nō)2006

    • Author(s)
      Edited by the VR Psychology Research Group, Virtual Reality Society of Japan
    • Total Pages
      200
    • Publisher
      Kodansha
    • Description
      From "Summary of Research Results Report (Japanese text)"
    • Related Report
      2006 Final Research Report Summary

Published: 2005-04-01   Modified: 2016-04-21  

