2005 Fiscal Year Final Research Report Summary
Research on a Systematic Analysis Method on Individual Identification and Facial Expression Analysis Using Three-dimensional Facial Images
Project/Area Number | 16500103 |
Research Category | Grant-in-Aid for Scientific Research (C) |
Allocation Type | Single-year Grants |
Section | General |
Research Field | Perception information processing/Intelligent robotics |
Research Institution | University of Toyama |
Principal Investigator | YONEDA Masaaki, University of Toyama, Faculty of Engineering, Professor (30019210) |
Project Period (FY) | 2004 – 2005 |
Keywords | face plane / distance image / least variance criterion / least distance criterion / facial expression differential map / facial expression classification / non-contact 3-D digitizer / facial coordinate system |
Research Abstract |
In this research, we define a face plane for a three-dimensional data set of a facial surface and propose a method that uses this plane to capture changes of facial expression effectively as two-dimensional data. We consider data obtained by mapping a three-dimensional face data set onto a plane that is located uniquely and stably to be more useful for facial expression analysis than the raw three-dimensional face data. The face plane defined in this research is computed uniquely and stably from a three-dimensional face data set and lies at the position inside the head where the normal lines to the face surface converge. First, we give two definitions of the face plane and present a derivation method for each. Second, as an analysis method for facial expression, we propose a facial expression differential map that visualizes the change from a neutral face to a smiling or wry face. Finally, through a facial expression recognition experiment, we demonstrate the effectiveness of the face plane and the facial expression differential map for facial expression analysis. Furthermore, using the face plane, we propose a new method that corrects the inclination of the facial attitude in a three-dimensional facial image. Such correction is important as pre-processing for facial expression analysis and individual identification, and our method does not require facial parts such as the eyes, nose, or mouth to be recognized in advance. In an experiment on correcting the inclination of the facial attitude, we compare the proposed coordinate system with an ordinary coordinate system obtained from facial parts and show that ours is more effective. In an experiment on recognizing changes of facial expression, we also show that the proposed coordinate system is more effective than the camera coordinate system as pre-processing for a facial expression recognition system.
|
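The report gives no formulas or code, so the following is only a rough sketch of how the face plane and the facial expression differential map might be computed, assuming that the "least distance criterion" refers to the least-squares point of convergence of the surface normals and that the plane is oriented along the mean surface normal; the function names, grid size, and extent used here are illustrative placeholders, not the project's actual implementation (Python/NumPy).

    import numpy as np

    def convergence_point(points, normals):
        """Least-squares point closest to all surface-normal lines.

        points  : (N, 3) array of 3-D face-surface samples
        normals : (N, 3) array of unit normals at those samples
        Returns the point x that minimizes the sum of squared distances
        to the lines {p_i + t * n_i}.
        """
        I = np.eye(3)
        A = np.zeros((3, 3))
        b = np.zeros(3)
        for p, n in zip(points, normals):
            P = I - np.outer(n, n)   # projector onto the plane orthogonal to n
            A += P
            b += P @ p
        return np.linalg.solve(A, b)

    def face_plane_frame(points, normals):
        """Facial coordinate frame built from the convergence point.

        The face plane passes through the convergence point; its normal is
        taken here as the mean surface normal (an illustrative choice, not
        necessarily the criterion used in the report).
        """
        origin = convergence_point(points, normals)
        z = normals.mean(axis=0)
        z /= np.linalg.norm(z)
        seed = np.array([0.0, 1.0, 0.0])     # any vector not parallel to z
        x = np.cross(seed, z)
        x /= np.linalg.norm(x)
        y = np.cross(z, x)
        return origin, np.stack([x, y, z])   # rows are the frame axes

    def distance_image(points, origin, frame, grid=64, extent=100.0):
        """Project the face data onto the face plane and rasterize the signed
        distance from the plane into a 2-D grid (a distance image)."""
        local = (points - origin) @ frame.T  # coordinates in the facial frame
        u, v, w = local[:, 0], local[:, 1], local[:, 2]
        img = np.full((grid, grid), np.nan)
        iu = np.clip(((u + extent) / (2 * extent) * grid).astype(int), 0, grid - 1)
        iv = np.clip(((v + extent) / (2 * extent) * grid).astype(int), 0, grid - 1)
        for r, c, d in zip(iv, iu, w):
            # keep the sample farthest from the plane if a cell receives several points
            if np.isnan(img[r, c]) or d > img[r, c]:
                img[r, c] = d
        return img

    # Facial expression differential map: cell-wise difference of the distance
    # images of a neutral face and an expressive face taken in the same frame.
    # diff_map = distance_image(smile_pts, origin, frame) - distance_image(neutral_pts, origin, frame)

The rows of the returned frame also define a facial coordinate system, so rotating the raw data into that frame corresponds to the kind of facial-attitude correction described in the abstract; the derivation actually used in the project may differ.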
Research Products (7 results)