Summary of Research Achievements
I have proposed a universal unsupervised approach that trains a translation model without using any parallel data. Whereas existing unsupervised neural machine translation (UNMT) methods have only been applied to similar or rich-resource language pairs, my methods can be adapted to universal scenarios. I have published more than 20 peer-reviewed research papers (I am the corresponding author of most of them). Most of these papers appeared in top-tier conferences and journals, including 7 at ACL, 1 at EMNLP, 2 at AAAI, 2 at ICLR, and 3 in IEEE/ACM transactions. I also won first place several times in top-tier MT/NLP shared tasks, such as WMT-2019, WMT-2020, and CoNLL-2019.