
2014 Fiscal Year Final Research Report

Pronunciation gestures based on articulatory features extracted from speech

Research Project

Project/Area Number 24720254
Research Category

Grant-in-Aid for Young Scientists (B)

Allocation Type Multi-year Fund
Research Field Foreign language education
Research Institution Aichi Prefectural University (2013-2014)
Toyohashi University of Technology (2012)

Principal Investigator

IRIBE Yurie  Aichi Prefectural University, School of Information Science and Technology, Assistant Professor (40397500)

Project Period (FY) 2012-04-01 – 2015-03-31
Keywords Pronunciation training / Foreign language education / Articulatory movement
Outline of Final Research Achievements

We describe computer-assisted pronunciation training (CAPT) through the visualization of articulatory gestures estimated from a learner's speech. Typical CAPT systems cannot show the learner how to correct his or her articulation. The proposed system lets the learner study how to correct a mispronunciation by comparing the incorrectly pronounced gesture with a correctly pronounced one. In this system, a multi-layer neural network (MLN) converts the learner's speech into vocal tract coordinates derived from Magnetic Resonance Imaging (MRI) data, and an animation is then generated from those coordinate values. We further improved the animations by introducing a per-phoneme anchor point into MLN training. In our experiments, the new system generated accurate CG animations even from English speech produced by Japanese learners.
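The core speech-to-articulation step described above can be sketched as a small regression network. The following is a minimal illustration only, not the authors' actual model: the feature dimensions (13 acoustic features per frame, 16 vocal tract coordinate values) and the synthetic training data are assumptions standing in for the MRI-derived corpus used in the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: 13 MFCC-like acoustic features per frame,
# 8 articulator points x (x, y) = 16 vocal tract coordinates.
N_IN, N_HID, N_OUT = 13, 32, 16

# Synthetic stand-in data (the study trained on MRI-derived coordinates).
X = rng.normal(size=(200, N_IN))
W_true = rng.normal(size=(N_IN, N_OUT))
Y = np.tanh(X @ W_true) * 0.1  # target coordinates, arbitrary units

# One hidden layer, i.e. a small "multi-layer neural network" (MLN).
W1 = rng.normal(scale=0.1, size=(N_IN, N_HID)); b1 = np.zeros(N_HID)
W2 = rng.normal(scale=0.1, size=(N_HID, N_OUT)); b2 = np.zeros(N_OUT)

def forward(x):
    """Map acoustic features to predicted vocal tract coordinates."""
    h = np.tanh(x @ W1 + b1)
    return h, h @ W2 + b2

def mse(pred, y):
    return float(np.mean((pred - y) ** 2))

lr = 0.05
_, pred0 = forward(X)
loss0 = mse(pred0, Y)  # error before training

for _ in range(500):
    h, pred = forward(X)
    g = 2.0 * (pred - Y) / len(X)      # gradient of MSE w.r.t. output
    gW2 = h.T @ g; gb2 = g.sum(0)
    gh = g @ W2.T * (1.0 - h ** 2)     # backprop through tanh
    gW1 = X.T @ gh; gb1 = gh.sum(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

_, pred = forward(X)
loss1 = mse(pred, Y)  # error after training
```

Each predicted frame yields one set of articulator point positions; rendering those frames in sequence gives the CG animation of the vocal tract. The per-phoneme anchor points mentioned in the report would enter as additional training targets constraining the coordinates at phoneme centers.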

Free Research Field

Speech information processing, user interfaces


Published: 2016-06-03  

