2020 Fiscal Year Annual Research Report
Exploring Deep Neural Networks for Temporal Information Extraction
Publicly Offered Research
Project Area | Chronogenesis: how the mind generates time |
Project/Area Number | 19H05318 |
Research Institution | Kyoto University |
Principal Investigator | 程 飛, Kyoto University, Graduate School of Informatics, Program-Specific Assistant Professor (70801570) |
Project Period (FY) | 2019-04-01 – 2021-03-31 |
Keywords | Information extraction / Neural networks / Language processing |
Outline of Annual Research Achievements |
Following our research plan, we developed neural network models to improve state-of-the-art (SOTA) temporal relation classification (TRC) performance. We explored three research directions.

1) Neural classification models can be improved in several respects: deeper network layers, attention mechanisms, and the integration of richer features have proved effective in many tasks. We first introduced the latest deep transfer learning models (i.e., BERT) into the TRC task. The empirical results showed significant improvements over the existing shallow neural models (a minimal sketch of this setup is given below).

2) Existing TRC research treats E-E (event-event), E-T (event-time), and E-D (event-document creation time) links as three separate TLINK classification targets. We experimented with training all three TLINK types in a multi-task learning scenario, which effectively exploits all of the available data and significantly improves over the baseline (see the second sketch below).

3) Multilinguality is an important topic in NLP. Several high-quality Japanese TRC corpora have been constructed in recent years, such as BCCWJ-TimeBank. We performed additional experiments to evaluate the feasibility of our models on the Japanese BCCWJ-TimeBank, where our model outperforms the existing SOTA models by a large margin.

In summary, we developed TRC models with SOTA performance. The paper was accepted by EMNLP (Findings volume), a top international conference for natural language processing. We also explored the end-to-end information extraction and temporal question-answering tasks, which help us understand the TRC task from different perspectives; these two papers were accepted by EMNLP Findings and the ACL RepL4NLP workshop.
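As a rough illustration of the first direction, the following is a minimal sketch of a BERT-based TRC classifier, assuming the Hugging Face transformers library. The [E1]/[E2] marker scheme and the TLINK label set shown here are illustrative assumptions, not the exact configuration reported in the paper.

```python
# Minimal sketch of a BERT-based temporal relation classifier (direction 1).
# Assumes the Hugging Face `transformers` library. The [E1]/[E2] markers and
# the TLINK label set below are illustrative assumptions.
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

TLINK_LABELS = ["BEFORE", "AFTER", "OVERLAP", "VAGUE"]  # hypothetical label set

class BertTRC(nn.Module):
    def __init__(self, model_name="bert-base-uncased"):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_name)
        self.classifier = nn.Linear(self.encoder.config.hidden_size,
                                    len(TLINK_LABELS))

    def forward(self, input_ids, attention_mask):
        # Encode the sentence holding the event/time pair and classify the
        # pair from the [CLS] representation.
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        return self.classifier(out.last_hidden_state[:, 0])

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
batch = tokenizer("She [E1] arrived [/E1] after the meeting [E2] ended [/E2].",
                  return_tensors="pt")
logits = BertTRC()(batch["input_ids"], batch["attention_mask"])
print(TLINK_LABELS[logits.argmax(dim=-1).item()])  # untrained model: arbitrary label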
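For the second direction, the sketch below shows one plausible form of the multi-task setup: a single shared encoder with a separate classification head per TLINK type, trained on interleaved batches. The head names, label counts, and batch-mixing strategy are illustrative assumptions rather than the paper's confirmed details.

```python
# Sketch of multi-task TRC over the three TLINK types (direction 2):
# one shared BERT encoder, one classification head per type, and losses
# computed over interleaved batches. Names and label counts are illustrative.
import torch.nn as nn
from transformers import AutoModel

class MultiTaskTRC(nn.Module):
    def __init__(self, model_name="bert-base-uncased", num_labels=4):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_name)  # shared by all tasks
        hidden = self.encoder.config.hidden_size
        # Separate output head per TLINK type: event-event, event-time, event-DCT.
        self.heads = nn.ModuleDict({
            "E-E": nn.Linear(hidden, num_labels),
            "E-T": nn.Linear(hidden, num_labels),
            "E-D": nn.Linear(hidden, num_labels),
        })
        self.loss = nn.CrossEntropyLoss()

    def forward(self, input_ids, attention_mask, task, labels=None):
        cls = self.encoder(input_ids=input_ids,
                           attention_mask=attention_mask).last_hidden_state[:, 0]
        logits = self.heads[task](cls)
        if labels is not None:
            # During training, batches of the three types are interleaved so
            # the shared encoder learns from all available TLINK data.
            return self.loss(logits, labels), logits
        return logits
```

For the third direction, the same architecture can in principle be applied to Japanese data by swapping the encoder checkpoint for a Japanese pretrained BERT (for example, cl-tohoku/bert-base-japanese on the Hugging Face hub); this is an assumption about how such a transfer could be set up, not a confirmed detail of the reported BCCWJ-TimeBank experiments.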
|
Research Progress Status | As FY2020 is the final year of the project, this section is not filled in. |
Strategy for Future Research Activity | As FY2020 is the final year of the project, this section is not filled in. |
Remarks | The code for our two EMNLP Findings papers is publicly released via GitHub. |