2020 Fiscal Year Annual Research Report
Unsupervised Neural Machine Translation in Universal Scenarios
Project/Area Number | 19K20354 |
Research Institution | National Institute of Information and Communications Technology |
Principal Investigator | Wang Rui, National Institute of Information and Communications Technology, Advanced Speech Translation Research and Development Promotion Center, Advanced Translation Technology Laboratory, Researcher (00837635) |
Project Period (FY) | 2019-04-01 – 2021-03-31 |
Keywords | Machine Translation / Unsupervised Learning / NLP |
Outline of Annual Research Achievements |
I have proposed a universal unsupervised approach that trains the translation model without using any parallel data. Whereas existing unsupervised neural machine translation (UNMT) methods have only been applied to similar or rich-resource language pairs, my methods can be adapted to universal scenarios (an illustrative sketch of the standard UNMT training recipe is given below). I have published more than 20 peer-reviewed research papers, for most of which I am the corresponding author. Most of these papers appeared in top-tier conferences and journals, including 7 papers at ACL, 1 at EMNLP, 2 at AAAI, 2 at ICLR, and 3 in IEEE/ACM Transactions. I also won several first places in top-tier MT/NLP shared tasks, such as WMT-2019, WMT-2020, and CoNLL-2019.
|
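The report itself does not detail the training procedure, so the following is a minimal, self-contained sketch of the generic UNMT recipe (denoising autoencoding plus iterative back-translation from monolingual data only) that such approaches build on. This is an illustrative toy under stated assumptions, not the project's proposed method: the corpora, the TinySeq2Seq model, and the toy_translate decoding shortcut are hypothetical simplifications; a real system would use a Transformer, language tags, and proper greedy or beam decoding.

```python
import random
import torch
import torch.nn as nn

# Toy monolingual corpora; no parallel sentence pairs are used anywhere.
mono_src = ["the cat sat", "a dog ran", "the dog sat"]
mono_tgt = ["le chat assis", "un chien courait", "le chien assis"]

# Joint vocabulary shared by both languages.
words = sorted({w for s in mono_src + mono_tgt for w in s.split()})
PAD = 0
vocab = {w: i + 1 for i, w in enumerate(words)}
V = len(vocab) + 1

def encode(sent, max_len=4):
    ids = [vocab[w] for w in sent.split()][:max_len]
    return ids + [PAD] * (max_len - len(ids))

def add_noise(ids):
    # Word dropout: randomly delete tokens so the model must reconstruct them.
    kept = [i for i in ids if i == PAD or random.random() > 0.1]
    return kept + [PAD] * (len(ids) - len(kept))

class TinySeq2Seq(nn.Module):
    """A deliberately tiny encoder-decoder shared by both languages."""
    def __init__(self, d=32):
        super().__init__()
        self.emb = nn.Embedding(V, d, padding_idx=PAD)
        self.enc = nn.GRU(d, d, batch_first=True)
        self.dec = nn.GRU(d, d, batch_first=True)
        self.out = nn.Linear(d, V)

    def forward(self, src_ids, tgt_ids):
        _, h = self.enc(self.emb(src_ids))
        dec_out, _ = self.dec(self.emb(tgt_ids), h)
        return self.out(dec_out)

model = TinySeq2Seq()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss(ignore_index=PAD)

def train_step(src_ids, tgt_ids):
    logits = model(src_ids, tgt_ids)
    loss = loss_fn(logits.reshape(-1, V), tgt_ids.reshape(-1))
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

def toy_translate(ids):
    # Stand-in for greedy decoding: feeds the input as the decoder prefix
    # and takes the argmax, only to produce synthetic "translations".
    with torch.no_grad():
        return model(ids, ids).argmax(-1)

for epoch in range(3):
    for s, t in zip(mono_src, mono_tgt):
        src = torch.tensor([encode(s)])
        tgt = torch.tensor([encode(t)])
        # 1) Denoising autoencoding on each monolingual corpus.
        train_step(torch.tensor([add_noise(encode(s))]), src)
        train_step(torch.tensor([add_noise(encode(t))]), tgt)
        # 2) Iterative back-translation: translate with the current model,
        #    then learn to map the synthetic sentence back to the original.
        train_step(toy_translate(src), src)
        train_step(toy_translate(tgt), tgt)

print("toy UNMT-style training finished")
```

The two objectives in the loop are the common backbone of UNMT systems: denoising autoencoding teaches the shared model to produce fluent output in each language, while back-translation turns the model's own outputs into synthetic parallel data that improves with each iteration.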