2014 Fiscal Year Final Research Report
Pronunciation gestures based on articulatory features extracted from speech
Project/Area Number | 24720254 |
Research Category | Grant-in-Aid for Young Scientists (B) |
Allocation Type | Multi-year Fund |
Research Field | Foreign language education |
Research Institution | Aichi Prefectural University (2013-2014), Toyohashi University of Technology (2012) |
Principal Investigator | IRIBE Yurie, Aichi Prefectural University, School of Information Science and Technology, Assistant Professor (40397500) |
Project Period (FY) | 2012-04-01 – 2015-03-31 |
Keywords | Pronunciation training / Foreign language education / Articulatory movement |
Outline of Final Research Achievements |
This report describes computer-assisted pronunciation training (CAPT) through the visualization of articulatory gestures estimated from a learner's speech. Typical CAPT systems cannot indicate how the learner should correct his or her articulation. The proposed system enables learners to study how to correct their pronunciation by comparing a wrongly pronounced gesture with a correctly pronounced one. In this system, a multi-layer neural network (MLN) converts the learner's speech into vocal-tract coordinates derived from Magnetic Resonance Imaging (MRI) data, and an animation is then generated from those coordinate values. We further improved the animations by introducing per-phoneme anchor points into the MLN training. In our experiments, the improved system generated accurate CG animations even from English speech produced by Japanese learners. (A minimal illustrative sketch of the speech-to-articulation mapping appears after this record.)
|
Free Research Field | Speech information processing, user interfaces |
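The pipeline summarized in the achievements outline (acoustic features in, vocal-tract contour coordinates out, one contour per speech frame for animation) can be illustrated with a small regression network. The following is a minimal sketch, not the project's implementation: the feature dimension, layer sizes, contour-point count, and the randomly generated stand-ins for MRI-traced training pairs are all assumptions, and the per-phoneme anchor points used in the actual MLN training are omitted.

```python
# Minimal sketch (numpy only) of a multi-layer network mapping per-frame
# acoustic features to vocal-tract coordinates. All sizes and the synthetic
# data below are hypothetical placeholders, not the report's actual setup.
import numpy as np

rng = np.random.default_rng(0)

N_FRAMES, N_ACOUSTIC, N_HIDDEN, N_POINTS = 200, 39, 64, 12  # assumed sizes

# Hypothetical training data: acoustic features (e.g., MFCCs plus deltas)
# per frame, paired with 2D midsagittal contour points traced from MRI.
X = rng.normal(size=(N_FRAMES, N_ACOUSTIC))
Y = rng.normal(size=(N_FRAMES, N_POINTS * 2))  # (x, y) per contour point

# One tanh hidden layer; linear output for coordinate regression.
W1 = rng.normal(scale=0.1, size=(N_ACOUSTIC, N_HIDDEN)); b1 = np.zeros(N_HIDDEN)
W2 = rng.normal(scale=0.1, size=(N_HIDDEN, N_POINTS * 2)); b2 = np.zeros(N_POINTS * 2)

def forward(feats):
    hidden = np.tanh(feats @ W1 + b1)
    return hidden, hidden @ W2 + b2

lr = 1e-3
for step in range(500):
    H, pred = forward(X)
    err = pred - Y                       # gradient of mean-squared error
    gW2 = H.T @ err / len(X); gb2 = err.mean(0)
    dH = (err @ W2.T) * (1 - H**2)       # backpropagate through tanh
    gW1 = X.T @ dH / len(X); gb1 = dH.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

# At inference, each speech frame yields one contour; a sequence of
# contours, drawn frame by frame, forms the articulatory animation.
_, coords = forward(X[:5])
frames = coords.reshape(-1, N_POINTS, 2)  # (frame, point, x/y) for rendering
print(frames.shape)  # (5, 12, 2)
```

Each output frame is a set of (x, y) contour points; drawing the contour for every frame and interpolating between frames yields the kind of CG animation of the vocal tract that the report describes.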