
Research on Multilingual Processing Based on Latent Information Estimation Models

Research Project

Project/Area Number 15J12597
Research Category

Grant-in-Aid for JSPS Fellows

Allocation Type Single-year Grants
Section Domestic
Research Field Intelligent informatics
Research Institution The University of Tokyo

Principal Investigator

Akiko Eriguchi, The University of Tokyo, Graduate School of Engineering, JSPS Research Fellow (DC1)

Project Period (FY) 2015-04-24 – 2018-03-31
Project Status Completed (Fiscal Year 2017)
Budget Amount
¥2,500,000 (Direct Cost: ¥2,500,000)
Fiscal Year 2017: ¥800,000 (Direct Cost: ¥800,000)
Fiscal Year 2016: ¥800,000 (Direct Cost: ¥800,000)
Fiscal Year 2015: ¥900,000 (Direct Cost: ¥900,000)
Keywords Machine Translation / Natural Language Processing / Machine Learning
Outline of Annual Research Achievements

In this fiscal year, we analyzed the neural machine translation model proposed in the previous year, which exploits syntactic information on the target-language side. Our evaluation so far had consisted of generating translations with the proposed model and measuring translation quality with automatic evaluation metrics. In addition, we jointly estimated the syntactic tree structures of the generated translations and analyzed the structures predicted by the proposed model. Focusing on dependency structures, we confirmed that the model can indeed recover word-to-word dependency relations, although they contain some errors. These results were compiled into a paper and presented at an international conference. We also applied the source-side syntactic tree structures studied two fiscal years earlier to another translation language pair and carried out further analysis; these results have been compiled into a paper that is currently under submission. An overview of each achievement follows.
(1) Incorporating target-side syntactic information: We conducted additional experiments on the model proposed in the previous year, which learns target-side syntactic tree information. We also used the proposed model to estimate the syntactic trees of the generated translations, confirming that the model accomplishes both translation generation and parsing of its own output. These results were accepted at ACL, the premier international conference in natural language processing (acceptance rate 22%), and presented orally in August 2017. This work is the outcome of a collaboration with New York University in the United States.
(2) We carried out further analysis of the neural machine translation model proposed two fiscal years earlier, which uses syntactic tree structures obtained on the source-language side. We newly applied it to a Chinese-to-Japanese translation task, using phrase-structure trees as the source-side syntax, and analyzed, among other things, whether Chinese phrase-level or word-level information is more useful when translating into Japanese. (A minimal illustrative sketch of this phrase-level attention idea follows this summary.)
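
To make the phrase-level attention idea concrete, here is a minimal, self-contained sketch in Python/NumPy. It is not the released tree2seq C++ implementation: the toy sentence, the toy tree, the hidden size, the single feed-forward layer used to compose child nodes (standing in for the Tree-LSTM composition), and all variable names are illustrative assumptions. The sketch only shows how one decoder state can attend over word-level and phrase-level source representations at the same time.

    import numpy as np

    rng = np.random.default_rng(0)
    d = 8  # hidden size (illustrative)

    # Toy source sentence and a toy binary phrase-structure tree over it.
    words = ["this", "is", "a", "pen"]
    # Leaf nodes are 0..3; phrase nodes are built bottom-up:
    # node 4 = ("a", "pen"), node 5 = ("is", node 4), node 6 = ("this", node 5).
    phrases = [(2, 3), (1, 4), (0, 5)]

    # Word-level states: random vectors standing in for sequential encoder states.
    nodes = [rng.normal(size=d) for _ in words]

    # Phrase-level states: combine the two children with one feed-forward layer
    # (a stand-in for the Tree-LSTM composition used in tree-to-sequence NMT).
    W = rng.normal(scale=0.1, size=(d, 2 * d))
    for left, right in phrases:
        nodes.append(np.tanh(W @ np.concatenate([nodes[left], nodes[right]])))

    H = np.stack(nodes)  # (num_words + num_phrases, d) source memory

    def attend(decoder_state, H):
        """Soft attention over word and phrase nodes alike (dot-product scores)."""
        scores = H @ decoder_state
        weights = np.exp(scores - scores.max())
        weights /= weights.sum()
        context = weights @ H  # weighted sum of all source nodes
        return context, weights

    # One illustrative decoder step with a random decoder state.
    s_t = rng.normal(size=d)
    context, weights = attend(s_t, H)
    print("attention over words  :", np.round(weights[:len(words)], 3))
    print("attention over phrases:", np.round(weights[len(words):], 3))

Because the attention distribution ranges over both granularities, the decoder can draw on a single word or on a whole phrase at each step; everything beyond that mechanism (the recurrent encoder and decoder, the Tree-LSTM composition, training) is omitted here.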

Research Progress Status

Not entered, as FY2017 is the final year of the project.

Strategy for Future Research Activity

Not entered, as FY2017 is the final year of the project.

Report

(3 results)
  • 2017 Annual Research Report
  • 2016 Annual Research Report
  • 2015 Annual Research Report
Research Products

(27 results)


Int'l Joint Research (3 results), Journal Article (8 results; of which Int'l Joint Research: 4, Peer Reviewed: 5, Open Access: 7, Acknowledgement Compliant: 3), Presentation (12 results; of which Int'l Joint Research: 7, Invited: 2), Remarks (4 results)

  • [Int'l Joint Research] New York University (USA)

    • Related Report
      2017 Annual Research Report
  • [Int'l Joint Research] Tsinghua University (China)

    • Related Report
      2017 Annual Research Report
  • [Int'l Joint Research] New York University (USA)

    • Related Report
      2016 Annual Research Report
  • [Journal Article] Adversarial Neural Machine Translation with Gumbel Sampling2018

    • Author(s)
      Keisuke Shirai, Akiko Eriguchi, Kazuma Hashimoto, Shinsuke Mori, and Takashi Ninomiya
    • Journal Title

      The 24th Annual Meeting of the Association for Natural Language Processing (NLP 2018)

      Volume: -- Pages: 296-299

    • Related Report
      2017 Annual Research Report
  • [Journal Article] Learning to Parse and Translate Improves Neural Machine Translation2017

    • Author(s)
      Akiko Eriguchi, Yoshimasa Tsuruoka, and Kyunghyun Cho
    • Journal Title

      The 55th Annual Meeting of the Association for Computational Linguistics (ACL 2017)

      Volume: 2 Pages: 72-78

    • Related Report
      2017 Annual Research Report
    • Peer Reviewed / Open Access / Int'l Joint Research
  • [Journal Article] Cache Friendly Parallelization of Neural Encoder-Decoder Models without Padding on Multi-core Architecture2017

    • Author(s)
      Yuchen Qiao, Kazuma Hashimoto, Akiko Eriguchi, Haixia Wang, Dongsheng Wang, Yoshimasa Tsuruoka, and Kenjiro Taura
    • Journal Title

      The 6th International Workshop on Parallel and Distributed Computing for Large Scale Machine Learning and Big Data Analytics

      Volume: - Pages: 437-440

    • DOI

      10.1109/ipdpsw.2017.165

    • Related Report
      2017 Annual Research Report
    • Peer Reviewed / Open Access / Int'l Joint Research
  • [Journal Article] Cache Friendly Parallelization of Neural Encoder-Decoder Models without Padding on Multi-core Architecture2017

    • Author(s)
      Yuchen Qiao, Kazuma Hashimoto, Akiko Eriguchi, Haixia Wang, Dongsheng Wang, Yoshimasa Tsuruoka, and Kenjiro Taura
    • Journal Title

      Proceedings of the 6th International Workshop on Parallel and Distributed Computing for Large Scale Machine Learning and Big Data Analytics

      Volume: in press

    • Related Report
      2016 Annual Research Report
    • Peer Reviewed / Open Access / Int'l Joint Research
  • [Journal Article] Learning to Parse and Translate Improves Neural Machine Translation2017

    • Author(s)
      Akiko Eriguchi, Yoshimasa Tsuruoka, and Kyunghyun Cho
    • Journal Title

      Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics

      Volume: in press

    • Related Report
      2016 Annual Research Report
    • Peer Reviewed / Open Access / Int'l Joint Research / Acknowledgement Compliant
  • [Journal Article] Tree-to-Sequence Attentional Neural Machine Translation2016

    • Author(s)
      Akiko Eriguchi, Kazuma Hashimoto, and Yoshimasa Tsuruoka
    • Journal Title

      Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics

      Volume: 1

    • Related Report
      2016 Annual Research Report
    • Peer Reviewed / Open Access / Acknowledgement Compliant
  • [Journal Article] Character-based Decoding in Tree-to-Sequence Attention-based Neural Machine Translation2016

    • Author(s)
      Akiko Eriguchi, Kazuma Hashimoto, and Yoshimasa Tsuruoka
    • Journal Title

      Proceedings of the 3rd Workshop on Asian Translation (WAT2016)

      Volume: -

    • Related Report
      2016 Annual Research Report
    • Open Access / Acknowledgement Compliant
  • [Journal Article] Domain Adaptation and Attention-Based Unknown Word Replacement in Chinese-to-Japanese Neural Machine Translation2016

    • Author(s)
      Kazuma Hashimoto, Akiko Eriguchi, and Yoshimasa Tsuruoka
    • Journal Title

      Proceedings of the 3rd Workshop on Asian Translation (WAT2016)

      Volume: -

    • Related Report
      2016 Annual Research Report
    • Open Access
  • [Presentation] Adversarial Neural Machine Translation with Gumbel Sampling2018

    • Author(s)
      Keisuke Shirai, Akiko Eriguchi, Kazuma Hashimoto, Shinsuke Mori, and Takashi Ninomiya
    • Organizer
      The 24th Annual Meeting of the Association for Natural Language Processing (NLP 2018)
    • Related Report
      2017 Annual Research Report
  • [Presentation] Learning to Parse and Translate Improves Neural Machine Translation2017

    • Author(s)
      Akiko Eriguchi, Yoshimasa Tsuruoka, and Kyunghyun Cho
    • Organizer
      The 55th Annual Meeting of the Association for Computational Linguistics
    • Place of Presentation
      Vancouver, Canada
    • Year and Date
      2017-07-30
    • Related Report
      2016 Annual Research Report
    • Int'l Joint Research
  • [Presentation] Neural Machine Translation Incorporating Dependency Structure on the Target-Language Side2017

    • Author(s)
      Akiko Eriguchi
    • Organizer
      The 23rd Annual Meeting of the Association for Natural Language Processing
    • Place of Presentation
      University of Tsukuba, Tsukuba, Ibaraki, Japan
    • Year and Date
      2017-03-13
    • Related Report
      2016 Annual Research Report
  • [Presentation] Learning to Parse and Translate Improves Neural Machine Translation2017

    • Author(s)
      Akiko Eriguchi, Yoshimasa Tsuruoka, and Kyunghyun Cho
    • Organizer
      The 55th Annual Meeting of the Association for Computational Linguistics (ACL 2017)
    • Related Report
      2017 Annual Research Report
    • Int'l Joint Research
  • [Presentation] Cache Friendly Parallelization of Neural Encoder-Decoder Models without Padding on Multi-core Architecture2017

    • Author(s)
      Yuchen Qiao, Kazuma Hashimoto, Akiko Eriguchi, Haixia Wang, Dongsheng Wang, Yoshimasa Tsuruoka, and Kenjiro Taura
    • Organizer
      The 6th International Workshop on Parallel and Distributed Computing for Large Scale Machine Learning and Big Data Analytics (ParLearning 2017)
    • Related Report
      2017 Annual Research Report
    • Int'l Joint Research
  • [Presentation] Cache Friendly Parallelization of Neural Encoder-Decoder Models without Padding on Multi-core Architecture2017

    • Author(s)
      Yuchen Qiao
    • Organizer
      The 6th International Workshop on Parallel and Distributed Computing for Large Scale Machine Learning and Big Data Analytics
    • Place of Presentation
      Orlando, Florida, USA
    • Related Report
      2016 Annual Research Report
    • Int'l Joint Research
  • [Presentation] Neural Machine Translation Using Syntactic Structure in the Source Language2017

    • Author(s)
      Akiko Eriguchi
    • Organizer
      The 7th Meeting of the AAMT/Japio Special Interest Group on Patent Translation
    • Place of Presentation
      Campus Innovation Center Tokyo, Tokyo, Japan
    • Related Report
      2016 Annual Research Report
    • Invited
  • [Presentation] Tree-to-Sequence Attentional Neural Machine Translation2016

    • Author(s)
      Akiko Eriguchi
    • Organizer
      The 54th Annual Meeting of the Association for Computational Linguistics
    • Place of Presentation
      Berlin, Germany
    • Year and Date
      2016-08-07
    • Related Report
      2016 Annual Research Report
    • Int'l Joint Research
  • [Presentation] Character-based Decoding in Tree-to-Sequence Attention-based Neural Machine Translation2016

    • Author(s)
      Akiko Eriguchi
    • Organizer
      The 3rd Workshop on Asian Translation
    • Place of Presentation
      Osaka International Convention Center, Osaka, Japan
    • Related Report
      2016 Annual Research Report
    • Int'l Joint Research
  • [Presentation] Domain Adaptation and Attention-Based Unknown Word Replacement in Chinese-to-Japanese Neural Machine Translation2016

    • Author(s)
      Kazuma Hashimoto
    • Organizer
      The 3rd Workshop on Asian Translation
    • Place of Presentation
      Osaka International Convention Center, Osaka, Japan
    • Related Report
      2016 Annual Research Report
    • Int'l Joint Research
  • [Presentation] Natural Language Processing with Neural Networks Using Syntactic Information2016

    • Author(s)
      Kazuma Hashimoto and Akiko Eriguchi
    • Organizer
      The 38th Nagoya Area NLP Seminar
    • Place of Presentation
      Nagoya University, Nagoya, Aichi, Japan
    • Related Report
      2016 Annual Research Report
    • Invited
  • [Presentation] A Neural Machine Translation Model Based on Attention over Phrase Structures2016

    • Author(s)
      Akiko Eriguchi, Kazuma Hashimoto, and Yoshimasa Tsuruoka
    • Organizer
      The 22nd Annual Meeting of the Association for Natural Language Processing
    • Place of Presentation
      Tohoku University, Sendai, Miyagi, Japan
    • Related Report
      2015 Annual Research Report
  • [Remarks] C++ implementations

    • URL

      https://github.com/tempra28/nmtrnng

    • Related Report
      2017 Annual Research Report
  • [Remarks] Demo: Tree-to-Sequence Attentional NMT

    • URL

      http://www.logos.t.u-tokyo.ac.jp/~eriguchi/demo/tree2seq/index.php

    • Related Report
      2016 Annual Research Report
  • [Remarks] Code: Tree-to-Sequence Attentional NMT

    • URL

      https://github.com/tempra28/tree2seq

    • Related Report
      2016 Annual Research Report
  • [Remarks] Code: NMTRNNG

    • URL

      https://github.com/tempra28/nmtrnng

    • Related Report
      2016 Annual Research Report


Published: 2015-11-26   Modified: 2024-03-26  
