Study on Control of Facial Expressions of Face Robot and Their Informatic Characteristics
Project/Area Number | 05452172
Research Category | Grant-in-Aid for General Scientific Research (B)
Allocation Type | Single-year Grants
Research Field | Intelligent mechanics/Mechanical systems
Research Institution | Science University of Tokyo
Principal Investigator | HARA Fumio, Science University of Tokyo, Department of Mechanical Engineering (Faculty of Engineering, Division I), Professor (90084376)
Co-Investigator (Kenkyū-buntansha) | HOSOKAI Hidemi, Science University of Tokyo, Suwa College, Professor (30084396)
Project Period (FY) | 1993 – 1994
Project Status | Completed (Fiscal Year 1994)
Budget Amount | ¥7,400,000 (Direct Cost: ¥7,400,000)
Fiscal Year 1994: ¥1,200,000 (Direct Cost: ¥1,200,000)
Fiscal Year 1993: ¥6,200,000 (Direct Cost: ¥6,200,000)
Keywords | Face robot / 6 basic facial expressions / Action units / Real-time control / Dynamic facial expressions / Real-time measurement of motions in the eyes, eyebrows and mouth / Real-time recognition of facial expressions / Dynamic facial expression recognition / Basic facial expressions / Facial expression control / Feature points / Action units / Facial expression recognition rate / Static facial expression display
Research Abstract |
(1) The face robot consists of a head-frame structure, 18 facial-muscle FMA actuators, silicone-rubber face skin, eyes, mouth, nose and a neck structure, and has 18 degrees of freedom to mimic the movement of the human face. The face robot is 20% larger than a normal human face. The motions of the face components for expressing the 6 basic facial expressions were determined according to the kind and strength of the Action Units used in psychology, and a control algorithm for the facial expressions was established.
(2) The 6 basic facial expressions displayed on the face robot were evaluated in a visual test and achieved about 90% correct recognition, except for the "Fear" expression.
(3) A small air-cylinder-piston actuator was developed for real-time facial expression motion; 18 of these actuators were installed in the face robot, and a control algorithm using feedback control in each actuator was established to realize real-time motion of facial expressions on the face robot (an illustrative sketch of this Action-Unit-to-actuator control idea follows this abstract).
(4) By using the transputer system and 2 CCD cameras, real-time measurement of the motions of the eyes, eyebrows and mouth was successfully performed. The time needed for one dynamic facial image measurement was about 100 ms, which is sufficiently short for dynamic recognition of facial expressions.
(5) The dynamic facial image data obtained by the transputer system were applied to a neural network trained on typical facial expressions, and recognition tests were undertaken to evaluate the performance of the face robot's facial expressions, obtaining a rather high rate (about 80% correct); a second illustrative sketch of such a classifier also follows this abstract.
(6) Future work will include investigating the interactive characteristics of psychological information exchange between human and face robot through facial expressions.
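
The report does not describe the project's actual control software, so the following Python sketch is only an illustration of the mechanism stated in items (1) and (3): an expression is specified as a set of Action Units with strengths, each Action Unit pulls on some of the 18 actuators, and each actuator closes its own feedback loop toward the commanded displacement. The AU_TO_ACTUATOR table, the Actuator class and the gain kp are assumptions made for illustration, not the robot's real parameters.

# Illustrative sketch only -- assumed names, mappings and gains, not the
# project's real control law.

from dataclasses import dataclass

# Hypothetical mapping: Action Unit number -> {actuator index: gain}
AU_TO_ACTUATOR = {
    1:  {0: 1.0, 1: 1.0},    # inner brow raiser -> eyebrow actuators
    12: {10: 0.8, 11: 0.8},  # lip corner puller -> mouth-corner actuators
    26: {16: 1.0},           # jaw drop -> jaw actuator
}

@dataclass
class Actuator:
    position: float = 0.0  # measured displacement (arbitrary units)
    target: float = 0.0    # commanded displacement

    def step(self, kp: float = 0.5) -> None:
        # Simple proportional feedback toward the commanded displacement.
        self.position += kp * (self.target - self.position)

def set_expression(actuators, action_units):
    """Convert Action Unit strengths (0..1) into per-actuator targets."""
    targets = [0.0] * len(actuators)
    for au, strength in action_units.items():
        for idx, gain in AU_TO_ACTUATOR.get(au, {}).items():
            targets[idx] += gain * strength
    for act, t in zip(actuators, targets):
        act.target = t

if __name__ == "__main__":
    robot = [Actuator() for _ in range(18)]   # 18 degrees of freedom
    set_expression(robot, {1: 0.6, 12: 1.0})  # e.g. a "Happiness"-like AU set
    for _ in range(20):                       # run the per-actuator loops
        for act in robot:
            act.step()
    print([round(a.position, 2) for a in robot])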
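
For item (5), the report states only that a neural network was trained on typical facial expressions; the actual architecture, software and feature dimensions are not given. The sketch below uses scikit-learn's MLPClassifier purely as a stand-in, with random placeholder data instead of the measured eye, eyebrow and mouth feature points.

# Illustrative stand-in only -- placeholder data and an assumed network shape,
# not the project's actual recognition system.

import numpy as np
from sklearn.neural_network import MLPClassifier

EXPRESSIONS = ["Happiness", "Surprise", "Fear", "Anger", "Disgust", "Sadness"]

rng = np.random.default_rng(0)
X_train = rng.normal(size=(300, 12))                   # placeholder feature vectors
y_train = rng.integers(0, len(EXPRESSIONS), size=300)  # placeholder labels

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=0)
clf.fit(X_train, y_train)

# Classify one new measurement frame (again a placeholder vector).
frame = rng.normal(size=(1, 12))
print(EXPRESSIONS[int(clf.predict(frame)[0])])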