
2001 Fiscal Year Final Research Report Summary

A Study of Quantitative Evaluation Method of Emotion using Facial Expression and Electroencephalogram

Research Project

Project/Area Number 09650495
Research Category

Grant-in-Aid for Scientific Research (C)

Allocation Type Single-year Grants
Section General
Research Field Measurement and Control Engineering
Research Institution Toyota National College of Technology

Principal Investigator

OZEKI Osamu  Toyota National College of Technology, Department of Electric and Electronic Engineering, Professor (50280392)

Co-Investigator (Kenkyū-buntansha) KATO Shohei  Toyota National College of Technology, Department of Electric and Electronic Engineering, Lecturer (70311032)
YOKOYAMA Kiyoko  Nagoya City University, School of Design and Architecture, Department of Visual and Urban Design, Assistant Professor (50174868)
WATANABE Yosaku  Toyota National College of Technology, Department of General Education, Professor (00043191)
Project Period (FY) 1997 – 2000
Keywords facial expression / heart rate / intensity of human emotion / electroencephalogram / comfortable and uncomfortable emotion / power flow / 1/f^m fluctuations / autoregressive model
Research Abstract

The first purpose of this study is to propose a method of estimating the intensity of the human emotions of fun and pleasure using facial expression and physiological data. The index of facial expression is the transverse width of the mouth, and the indices of physiological data are heart rate and skin conductance level. The three indices were measured while subjects were watching comedy movies. After each movie, the subjects reported their evaluation of fun and pleasure on four intensity levels (0, 1, 2, and 3). Using the three indices, an estimation formula based on a multiple regression model was derived, and the estimated intensity was compared with the reported one. The mean error of estimation was about 0.3. It was concluded that the proposed method would be able to estimate the intensity of the human emotions of fun and pleasure.

The second purpose of this study is to evaluate the difference in the α wave (8 Hz to 13 Hz) of the electroencephalogram (EEG) between comfortable and uncomfortable emotion using three indices: 1/f^m fluctuations, power of the α wave, and power flow. The subjects were sixteen adults (20 to 23 years old). The electrodes were attached to F_<p1>, F_<p2>, O_1, and O_2 in the international 10/20 EEG system. In the 1/f^m fluctuations analysis of the F_<p1>, F_<p2>, O_1, and O_2 electrodes, the mean value of m in the comfortable emotion was larger than in the uncomfortable one; this result agrees with the preceding research. In the α wave power analysis, there was no significant difference between comfortable and uncomfortable emotion at any of the four electrodes. The power flow from one electrode to the other three electrodes was calculated by an autoregressive model. Experimental results showed that the mean value of power flow in the uncomfortable emotion was larger than in the comfortable one at F_<p1>, F_<p2>, O_1, and O_2.
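The multiple-regression estimator described above can be sketched as follows. This is a minimal illustration only: the index values and reported intensities are invented, and the report does not specify its exact fitting procedure.

```python
import numpy as np

# Hypothetical data for the sketch: one row per trial, with the three
# indices from the report (transverse mouth width, heart rate, skin
# conductance level). All numbers are invented for illustration.
X = np.array([
    [0.2, 72.0, 4.1],
    [0.5, 78.0, 5.0],
    [0.1, 70.0, 3.8],
    [0.8, 84.0, 6.2],
    [0.4, 75.0, 4.6],
    [0.7, 82.0, 5.9],
])
y = np.array([1, 2, 0, 3, 1, 3], dtype=float)  # self-reported intensity (0-3)

# Add an intercept column and fit the multiple regression by least squares.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Estimated intensity and mean absolute error against the reported values.
y_hat = A @ coef
mean_abs_error = np.mean(np.abs(y_hat - y))
print("coefficients:", coef)
print("mean absolute error:", round(float(mean_abs_error), 3))
```

In the study, the analogous error between estimated and reported intensity was about 0.3 on the four-level scale.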
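The 1/f^m fluctuation index amounts to estimating the spectral exponent m from the slope of the power spectrum on log-log axes. A minimal sketch, using surrogate 1/f noise rather than real EEG; the sampling rate and fitting band are assumptions, not values from the report:

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 200.0   # sampling rate in Hz (assumed)
n = 4000     # 20 s of surrogate data

def pink_noise(n, fs, rng):
    """Surrogate 1/f noise: shape white noise so that power ~ 1/f."""
    spec = np.fft.rfft(rng.standard_normal(n))
    freqs = np.fft.rfftfreq(n, d=1 / fs)
    scale = np.ones_like(freqs)
    scale[1:] = freqs[1:] ** -0.5        # amplitude ~ f^(-1/2) => power ~ 1/f
    return np.fft.irfft(spec * scale, n)

def spectral_exponent(signal, fs, f_lo=1.0, f_hi=30.0):
    """Estimate m in P(f) ~ 1/f^m by a linear fit on log-log axes."""
    freqs = np.fft.rfftfreq(signal.size, d=1 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2
    band = (freqs >= f_lo) & (freqs <= f_hi)
    slope, _ = np.polyfit(np.log(freqs[band]), np.log(power[band]), 1)
    return -slope                        # power ~ f^(-m), so m = -slope

x = pink_noise(n, fs, rng)
m = spectral_exponent(x, fs)
print(f"estimated m: {m:.2f}")           # should land near 1 for 1/f noise
```

In the study, this exponent m was compared between comfortable and uncomfortable conditions, with larger m found in the comfortable condition.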

  • Research Products

    (4 results)

All Other

All Publications (4 results)

  • [Publications] Hori, Ozeki, Yokoyama, Watanabe: "Estimation of Intensity of Human Emotion using Facial Expression and Physiological Data - Case of Fun and Pleasure -" (in Japanese). Journal of The Institute of Electrical Engineers of Japan. Vol.119-C. 668-675 (1999)

    • Description
      From the "Final Research Report Summary (Japanese)"
  • [Publications] Ozeki, Kato, Yamaguchi, Shimizu: "A Study of Relative Power Contribution Analysis of Electroencephalogram Using Multi Dimensional Autoregressive Model" (in Japanese). Bulletin of Toyota National College of Technology. Vol.34. 19-24 (2001)

    • Description
      From the "Final Research Report Summary (Japanese)"
  • [Publications] "Estimation of Intensity of Human Emotion using Facial Expression and Physiological Data - Case of Fun and Pleasure -". Journal of The Institute of Electrical Engineers of Japan. Vol.119-C, No.6. 668-675 (1999)

    • Description
      From the "Final Research Report Summary (English)"
  • [Publications] Osamu OZEKI, Syohei KATO, Nobuhisa YAMAGUCHI and Norihiro SHIMIZU: "A Study of Relative Power Contribution Analysis of Electroencephalogram Using Multi Dimensional Autoregressive Model". Bulletin of Toyota National College of Technology. Vol.34. 19-24 (2001)

    • Description
      From the "Final Research Report Summary (English)"

Published: 2003-09-17  
