
2020 Fiscal Year Final Research Report

Self-explainable and fast-to-train example-based machine translation using neural networks

Research Project

Project/Area Number 18K11447
Research Category

Grant-in-Aid for Scientific Research (C)

Allocation Type: Multi-year Fund
Section: General
Review Section: Basic Section 61030: Intelligent informatics-related
Research Institution: Waseda University

Principal Investigator

LEPAGE YVES  Waseda University, Faculty of Science and Engineering (Graduate School of Information, Production and Systems), Professor (70573608)

Project Period (FY) 2018-04-01 – 2021-03-31
Keywords: Natural language processing / Machine translation
Outline of Final Research Achievements

This research introduced self-explanation into example-based machine translation (EBMT) by analogy; it is thus positioned within explainable artificial intelligence (XAI). Self-explanation was implemented by tracing the analogies verified or solved during translation. The direct and indirect approaches to EBMT by analogy were merged into a system that uses an original neural network. A method was developed to retrieve sentences that cover a given sentence both semantically and formally. It was studied how dense corpora are with respect to analogies. Datasets of analogies between sentences were released.

Free Research Field

Natural language processing

Academic Significance and Societal Importance of the Research Achievements

The method for solving analogical equations was improved by using sub-sentential alignment, a technique from statistical machine translation (SMT), together with vector representations of words and sentences, a technique from neural natural language processing (NMT). The direct and indirect approaches to example-based machine translation by analogy were merged, and a system using an original neural network was built. Its input consists of monolingual or bilingual soft alignments based on vector representations of words.
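To illustrate what solving an analogical equation with vector representations looks like, here is a minimal sketch of the standard vector-offset method for a : b :: c : x. The embedding values and vocabulary below are toy assumptions for illustration only, not data from the project; the project's actual system combines this kind of vector representation with sub-sentential alignments and a neural network.

```python
import numpy as np

# Toy word embeddings (hypothetical values for illustration only).
emb = {
    "walk":   np.array([1.0, 0.0, 0.2]),
    "walked": np.array([1.0, 1.0, 0.2]),
    "sing":   np.array([0.0, 0.0, 0.9]),
    "sang":   np.array([0.0, 1.0, 0.9]),
    "song":   np.array([0.1, 0.0, 1.0]),
}

def solve_analogy(a, b, c, vocab):
    """Solve a : b :: c : x by the vector-offset method:
    return the vocabulary word whose vector is closest to c + (b - a)."""
    target = vocab[c] + (vocab[b] - vocab[a])
    candidates = [w for w in vocab if w not in (a, b, c)]
    return min(candidates,
               key=lambda w: np.linalg.norm(vocab[w] - target))

print(solve_analogy("walk", "walked", "sing", emb))  # prints "sang"
```

Here the offset walked − walk captures the past-tense relation, and adding it to sing lands closest to sang among the remaining vocabulary.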

Published: 2022-01-27  
