
Efficient training for neural encoder-decoders on a large amount of training data

Research Project

Project/Area Number 18K18119
Research Category

Grant-in-Aid for Early-Career Scientists

Allocation Type Multi-year Fund
Review Section Basic Section 61030: Intelligent informatics-related
Research Institution Tokyo Institute of Technology

Principal Investigator

Takase Sho  Tokyo Institute of Technology, School of Computing, Assistant Professor (40817483)

Project Period (FY) 2018-04-01 – 2021-03-31
Project Status Completed (Fiscal Year 2020)
Budget Amount
¥2,860,000 (Direct Cost: ¥2,200,000, Indirect Cost: ¥660,000)
Fiscal Year 2020: ¥780,000 (Direct Cost: ¥600,000, Indirect Cost: ¥180,000)
Fiscal Year 2019: ¥780,000 (Direct Cost: ¥600,000, Indirect Cost: ¥180,000)
Fiscal Year 2018: ¥1,300,000 (Direct Cost: ¥1,000,000, Indirect Cost: ¥300,000)
Keywords: Natural Language Processing / Neural Networks / Machine Translation / Language Models
Outline of Final Research Achievements

The purpose of this research is to explore a method for updating the parameters of neural encoder-decoders every time additional training data becomes available. For this purpose, we have to construct a sophisticated neural encoder-decoder. In this research, I therefore focused on constructing such sophisticated neural encoder-decoders and proposed several methods for their construction. Research papers on these methods were accepted at EMNLP, NAACL, AAAI, and NeurIPS, which are top-tier conferences in Natural Language Processing, Artificial Intelligence, and Machine Learning.
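
As a rough illustration of the project goal, the following is a minimal sketch (in PyTorch) of continuing to update an already-trained encoder-decoder whenever additional training pairs arrive, instead of retraining from scratch. The toy model, hyperparameters, and dummy data below are assumptions for illustration only and do not represent the project's actual methods.

# Minimal, illustrative sketch: fine-tune an existing encoder-decoder on newly
# obtained training pairs only. All sizes and data here are hypothetical.
import torch
import torch.nn as nn

VOCAB, EMB, HID, PAD = 1000, 64, 128, 0  # assumed toy sizes

class TinyEncoderDecoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(VOCAB, EMB, padding_idx=PAD)
        self.encoder = nn.GRU(EMB, HID, batch_first=True)
        self.decoder = nn.GRU(EMB, HID, batch_first=True)
        self.out = nn.Linear(HID, VOCAB)

    def forward(self, src, tgt_in):
        _, h = self.encoder(self.emb(src))          # encode source sequence
        dec, _ = self.decoder(self.emb(tgt_in), h)  # teacher-forced decoding
        return self.out(dec)                        # logits over the vocabulary

def incremental_update(model, optimizer, new_pairs, epochs=1):
    """Update the existing model on newly obtained (src, tgt) pairs only."""
    criterion = nn.CrossEntropyLoss(ignore_index=PAD)
    model.train()
    for _ in range(epochs):
        for src, tgt in new_pairs:
            logits = model(src, tgt[:, :-1])        # predict the next target token
            loss = criterion(logits.reshape(-1, VOCAB), tgt[:, 1:].reshape(-1))
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
    return model

# Usage: start from a previously trained checkpoint and keep updating it
# as new data arrives (dummy random batch shown here).
model = TinyEncoderDecoder()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
src = torch.randint(1, VOCAB, (4, 10))
tgt = torch.randint(1, VOCAB, (4, 12))
incremental_update(model, optimizer, [(src, tgt)])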

Academic Significance and Societal Importance of the Research Achievements

The introduction of neural networks has enabled computers to produce fluent output in machine translation and summarization. However, the agreement between machine-generated translations or summaries and those written by humans is still low, which suggests there is room for improvement. The results of this research project raise the performance of conventional machine translation and summarization systems, and adopting these methods can be expected to yield better output. In particular, for the summarization task, this research proposes a method that produces output conforming to a manually specified compression ratio, which should make the computer's output more flexible.
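
To illustrate the length-control idea mentioned above, in the spirit of the "Positional Encoding to Control Output Sequence Length" presentation listed below, here is a minimal sketch in which the decoder's positional encoding counts down the remaining output length rather than the absolute position. The exact formulation shown is an assumption for illustration, not the published method.

# Minimal sketch: length-aware positional encoding, where the decoder position
# index is replaced by (desired length - current step) so the positional signal
# counts down toward zero at the target length. Formulation is illustrative only.
import torch

def sinusoidal_encoding(positions, d_model):
    """Standard sine/cosine encoding evaluated at arbitrary integer positions."""
    pos = positions.float().unsqueeze(-1)                  # (..., 1)
    i = torch.arange(0, d_model, 2, dtype=torch.float32)   # even dimensions
    angle = pos / torch.pow(10000.0, i / d_model)          # (..., d_model/2)
    enc = torch.zeros(*positions.shape, d_model)
    enc[..., 0::2] = torch.sin(angle)
    enc[..., 1::2] = torch.cos(angle)
    return enc

def length_difference_encoding(desired_len, max_steps, d_model):
    """Encode the remaining length (desired_len - t) at each decoding step t."""
    steps = torch.arange(max_steps)
    remaining = desired_len - steps                        # counts down each step
    return sinusoidal_encoding(remaining, d_model)

# Usage: add the countdown encoding to the decoder input embeddings instead of
# the usual absolute-position encoding (embeddings here are a dummy tensor).
d_model, desired_len, max_steps = 8, 5, 7
dec_emb = torch.zeros(max_steps, d_model)                  # placeholder embeddings
dec_in = dec_emb + length_difference_encoding(desired_len, max_steps, d_model)
print(dec_in.shape)  # torch.Size([7, 8])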

Report

(4 results)
  • 2020 Annual Research Report / Final Research Report (PDF)
  • 2019 Research-status Report
  • 2018 Research-status Report
  • Research Products

    (5 results)


Presentation (5 results) (of which Int'l Joint Research: 4 results)

  • [Presentation] All Word Embeddings from One Embedding (2020)

    • Author(s)
      Sho Takase, Sosuke Kobayashi
    • Organizer
      Neural Information Processing Systems
    • Related Report
      2020 Annual Research Report
    • Int'l Joint Research
  • [Presentation] Positional Encoding to Control Output Sequence Length (2019)

    • Author(s)
      Sho Takase, Naoaki Okazaki
    • Organizer
      North American Chapter of the Association for Computational Linguistics
    • Related Report
      2019 Research-status Report
    • Int'l Joint Research
  • [Presentation] Character n-gram Embeddings to Improve RNN Language Models (2019)

    • Author(s)
      Sho Takase, Jun Suzuki, Masaaki Nagata
    • Organizer
      Thirty-Third AAAI Conference on Artificial Intelligence
    • Related Report
      2018 Research-status Report
    • Int'l Joint Research
  • [Presentation] Output Length Control Using Positional Encoding (2019)

    • Author(s)
      Sho Takase, Naoaki Okazaki
    • Organizer
      Annual Meeting of the Association for Natural Language Processing
    • Related Report
      2018 Research-status Report
  • [Presentation] Direct Output Connection for a High-Rank Language Model (2018)

    • Author(s)
      Sho Takase, Jun Suzuki, Masaaki Nagata
    • Organizer
      Empirical Methods in Natural Language Processing
    • Related Report
      2018 Research-status Report
    • Int'l Joint Research


Published: 2018-04-23   Modified: 2022-01-27  
