Study on Human-Machine Interface utilizing eye and lip movement
Grant-in-Aid for General Scientific Research (B)
Allocation Type: Single-year Grants
Research Institution: Kanazawa Institute of Technology
KAKITA Yuki, Kanazawa Institute of Technology, Faculty of Engineering, Professor (00098823)
Project Period (FY): 1991 – 1993
Project Status: Completed (Fiscal Year 1993)
Budget Amount: ¥3,700,000 (Direct Cost: ¥3,700,000)
Fiscal Year 1993: ¥600,000 (Direct Cost: ¥600,000)
Fiscal Year 1992: ¥600,000 (Direct Cost: ¥600,000)
Fiscal Year 1991: ¥2,500,000 (Direct Cost: ¥2,500,000)
Keywords: Human Interface / Communication / Eye movement / Lip movement / Image Processing / Face posture / Language / Speech
Development of a feature-extraction system for lip and eye movements is proposed as a human-machine interface. To apply it to multimedia communication, including for the handicapped, feature extraction from the face posture is needed. We had already developed hardware/software systems for image processing and for extracting features of the face, particularly the lips and eyes, from ordinary videotaped images. Part of the funding for our previous research project was granted by Monbusho (#63460129).
In this project, the primary concern was to extract features from images for application to verbal and/or non-verbal communication aids.
(1) Eye movements
Experiments on detecting the object pointed at by the eyes were conducted. The imaging system estimates the direction of gaze from an image of the iris of one eye while the subject looks at a keyboard-like object configured as a matrix of alphabet characters. A new software algorithm to compute the gaze direction was developed, and a real-time object-detection system was built that works even under head-free conditions. A demonstration videotape was presented at the defense of an MS thesis at KIT. The device and methods are promising for application to the handicapped.
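The final step of such a system can be illustrated with a minimal sketch. This is not the authors' algorithm; it assumes the iris-tracking stage has already produced a normalized gaze offset (dx, dy) relative to a calibrated neutral position, and simply quantizes that offset onto an assumed 5 × 6 alphabet matrix:

```python
# Hypothetical sketch: quantize a normalized gaze offset onto a
# keyboard-like alphabet matrix. Grid size, offset range, and the linear
# mapping are all assumptions for illustration, not the authors' design.

GRID_ROWS, GRID_COLS = 5, 6          # assumed 5x6 layout (26 letters + 4 blanks)
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"

def gaze_to_cell(dx, dy, max_dx=1.0, max_dy=1.0):
    """Map a normalized iris offset (dx, dy) in [-max, +max] to (row, col)."""
    col = int((dx + max_dx) / (2 * max_dx) * GRID_COLS)
    row = int((dy + max_dy) / (2 * max_dy) * GRID_ROWS)
    # Clamp to the grid so boundary gazes still select an edge cell.
    col = min(max(col, 0), GRID_COLS - 1)
    row = min(max(row, 0), GRID_ROWS - 1)
    return row, col

def cell_to_char(row, col):
    """Return the character in a cell, or None for an unused (blank) cell."""
    idx = row * GRID_COLS + col
    return ALPHABET[idx] if idx < len(ALPHABET) else None

# Example: gaze shifted fully to the upper-left selects the first cell.
row, col = gaze_to_cell(-1.0, -1.0)
print(cell_to_char(row, col))  # 'A'
```

In a real head-free system the offset would first be corrected for head pose before quantization; the sketch deliberately omits that stage.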
(2) Lip movements
The production mechanism of bilabial stops was modeled, and a conceptual system was designed for automatically producing the shapes and movements of the lips for speech from muscle-contraction input. The relation between lip shapes/movements and feature extraction in the auditory perceptual process was examined experimentally. A number of features of lip shapes and movements were found to help recognize the sound features of speech. These findings will be useful for lip-reading systems.
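The idea of driving lip shape from muscle-contraction input can be illustrated with a toy sketch. The muscle choice, weights, and linear combination below are invented for illustration and are not the authors' model; the only point carried over from the text is that a bilabial stop corresponds to the lip aperture reaching full closure:

```python
# Toy illustration (assumed, not the authors' model): lip aperture as a
# function of two antagonistic muscle activations, with a bilabial stop
# (/p/, /b/) defined as the aperture reaching zero (full closure).

def lip_aperture(oo_activation, dli_activation, rest_aperture=1.0):
    """Aperture from orbicularis oris (closes the lips) vs. depressor labii
    inferioris (opens the lower lip) activations, each in [0, 1]."""
    aperture = rest_aperture - oo_activation + 0.5 * dli_activation
    return max(aperture, 0.0)  # the lips cannot interpenetrate

def is_bilabial_closure(oo, dli):
    """True when the modeled activations drive the lips fully shut."""
    return lip_aperture(oo, dli) == 0.0

print(is_bilabial_closure(1.0, 0.0))  # True: full closure
print(is_bilabial_closure(0.2, 0.5))  # False: lips remain open
```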
Research Output (26 results)