
2004 Fiscal Year Final Research Report Summary

Experimental Research on Cognitive Mechanism of Facial Expression and Gaze

Research Project

Project/Area Number 13610085
Research Category

Grant-in-Aid for Scientific Research (C)

Allocation Type Single-year Grants
Section General
Research Field Experimental Psychology
Research Institution Kyoto University

Principal Investigator

YOSHIKAWA Sakiko, Kyoto University, Graduate School of Education, Professor (40158407)

Project Period (FY) 2001 – 2004
Keywords facial expression of emotion / gaze / emotion recognition / reflexive shift of attention / cuing paradigm / static vs. dynamic stimuli
Research Abstract

The main purpose of this research project was to elucidate the psychological mechanisms by which social signals conveyed by the face are processed. In particular, we focused on (1) the interaction between facial expression and face/gaze processing, and (2) the psychological mechanism of processing dynamic facial expressions of emotion. Behavioral studies were carried out using matching accuracy, reaction time, ratings, and free responses as dependent variables. The main results are as follows.
1. Using a match-to-sample paradigm, we examined whether the interaction between facial expression and face/gaze direction occurs at early perceptual stages. Participants viewed a target face briefly presented in peripheral vision and were asked to choose the target face from a response panel. The results revealed that threatening faces looking toward the perceiver were processed more accurately than those looking away from the perceiver. Thus the perceptual accuracy for a face carrying a negative emotional signal depends on where that person's attention is directed.
2. We examined whether the reflexive shift of attention in response to face/gaze direction is accentuated or suppressed by the emotional signal shown on the face. The results showed that the emotion revealed by the facial expression affected the occurrence of the reflexive shift of attention triggered by the face/gaze cue, and that the exact pattern of the effect differed from one emotional expression to another. Together, these experiments suggest that facial expression and face/gaze direction interact in early perceptual processing.
3. Using computer-morphed animation stimuli of facial expressions of emotion, we examined how the rate at which a face changes affects the processing of the emotion it reveals. Across experimental paradigms including representational momentum, naturalness ratings, and free descriptions, we found that the velocity of a dynamic facial expression affected its perception, the recognition of its emotion category, and inferences about the other person's psychological states.

  • Research Products

    (12 results)

All 12 results are journal articles.

  • [Journal Article] The amygdala processes the emotional significance of facial expressions: an fMRI investigation using the interaction between expression and face direction (2004)

    • Author(s)
      Sato, W., Yoshikawa, S., Kochiyama, T., Matsumura, M.
    • Journal Title

      NeuroImage 22

      Pages: 1006-1013

    • Description
      From the Final Research Report Summary (Japanese)
  • [Journal Article] The dynamic aspects of emotional facial expressions (2004)

    • Author(s)
      Sato, W., Yoshikawa, S.
    • Journal Title

      Cognition and Emotion 18

      Pages: 701-710

    • Description
      From the Final Research Report Summary (Japanese)
  • [Journal Article] The dynamic aspects of emotional facial expressions (2004)

    • Author(s)
      Sato, W., Yoshikawa, S.
    • Journal Title

      Cognition and Emotion 18

      Pages: 701-710

    • Description
      From the Final Research Report Summary (English)
  • [Journal Article] The amygdala processes the emotional significance of facial expressions: an fMRI investigation using the interaction between expression and face direction (2004)

    • Author(s)
      Sato, W., Yoshikawa, S., Kochiyama, T., Matsumura, M.
    • Journal Title

      NeuroImage 22

      Pages: 1006-1013

    • Description
      From the Final Research Report Summary (English)
  • [Journal Article] Processing motion information in recognizing facial expressions of emotion (in Japanese) (2003)

    • Author(s)
      Sakiko Yoshikawa
    • Journal Title

      The Japanese Journal of Psychonomic Science 22

      Pages: 76-83

    • Description
      From the Final Research Report Summary (Japanese)
  • [Journal Article] Processing motion information in recognizing facial expressions of emotion (2003)

    • Author(s)
      Sakiko Yoshikawa
    • Journal Title

      The Japanese Journal of Psychonomic Science 22

      Pages: 76-83

    • Description
      From the Final Research Report Summary (English)
  • [Journal Article] Spatiotemporal characteristics of perceiving facial expression and gaze: the psychological basis of communication (in Japanese) (2002)

    • Author(s)
      Sakiko Yoshikawa
    • Journal Title

      Psychiatria et Neurologia Japonica 104

      Pages: 125-132

    • Description
      From the Final Research Report Summary (Japanese)
  • [Journal Article] Processing of facial expression and gaze: Psychological mechanisms of nonverbal communication (2002)

    • Author(s)
      Sakiko Yoshikawa
    • Journal Title

      Psychiatria et Neurologia Japonica 104

      Pages: 125-132

    • Description
      From the Final Research Report Summary (English)
  • [Journal Article] Analysis of free descriptions of dynamic facial expressions: differences by emotion category and velocity (in Japanese) (2001)

    • Author(s)
      Sakiko Yoshikawa, Wataru Sato
    • Journal Title

      Kyoto University Research Studies in Education 47

      Pages: 51-68

    • Description
      From the Final Research Report Summary (Japanese)
  • [Journal Article] Neuro-cognitive processes of social attention and emotion: Interaction between gaze and facial expression (in Japanese) (2001)

    • Author(s)
      Sakiko Yoshikawa
    • Journal Title

      Higher Brain Function Research 21

      Pages: 103-112

    • Description
      From the Final Research Report Summary (Japanese)

  • [Journal Article] Analysis of free-response data for dynamic facial expressions: The effects of emotion category and velocity (2001)

    • Author(s)
      Sakiko Yoshikawa, Wataru Sato
    • Journal Title

      Kyoto University Research Studies in Education 47

      Pages: 51-68

    • Description
      From the Final Research Report Summary (English)
  • [Journal Article] Neuro-cognitive processes of social attention and emotion: Interaction between gaze and facial expression (2001)

    • Author(s)
      Sakiko Yoshikawa
    • Journal Title

      Higher Brain Function Research 21

      Pages: 103-112

    • Description
      From the Final Research Report Summary (English)

Published: 2006-07-11  


Powered by NII kakenhi