
2019 Fiscal Year Annual Research Report

Exploring Deep Neural Networks for Temporal Information Extraction

Publicly Offered Research

Project Area: Chronogenesis: how the mind generates time
Project/Area Number: 19H05318
Research Institution: Kyoto University

Principal Investigator

程 飛 (Fei Cheng), Kyoto University, Graduate School of Informatics, Program-Specific Assistant Professor (70801570)

Project Period (FY) 2019-04-01 – 2021-03-31
Keywords: Information Extraction / Temporal Information / Relation Extraction
Outline of Annual Research Achievements

We proposed two approaches to the temporal relation classification task: 1) we combined pre-trained BERT with syntactic information, and the results show that BERT significantly outperforms the existing recurrent neural networks while the syntactic information benefits the model further (see the sketch below); 2) we proposed a model adopting global entity information and multi-task learning, which achieves state-of-the-art performance compared to the existing work.
Our work brings state-of-the-art temporal relation classification performance close to human annotation agreement. It can potentially benefit a range of downstream applications, e.g. extracting event timelines from news articles.
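
To make the modeling concrete, here is a minimal sketch of a BERT-based event-pair relation classifier in PyTorch with Hugging Face transformers. It is not the project's released implementation: the marker tokens, the BEFORE/AFTER/OVERLAP/VAGUE label set, and the pair-pooling strategy are illustrative assumptions, and the syntactic features are omitted.

```python
# Minimal sketch of a BERT-based temporal relation classifier.
# NOT the project's released code: marker tokens, label set, and
# pooling strategy are illustrative assumptions.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

LABELS = ["BEFORE", "AFTER", "OVERLAP", "VAGUE"]  # assumed label set

class TemporalRelationClassifier(nn.Module):
    def __init__(self, model_name="bert-base-uncased", num_labels=len(LABELS)):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_name)
        hidden = self.encoder.config.hidden_size
        # Classify from the concatenated states of the two event markers.
        self.classifier = nn.Linear(2 * hidden, num_labels)

    def forward(self, input_ids, attention_mask, e1_idx, e2_idx):
        h = self.encoder(input_ids=input_ids,
                         attention_mask=attention_mask).last_hidden_state
        b = torch.arange(h.size(0))
        pair = torch.cat([h[b, e1_idx], h[b, e2_idx]], dim=-1)
        return self.classifier(pair)  # (batch, num_labels)

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
# Wrap each event mention in marker tokens so BERT can locate it.
tokenizer.add_tokens(["<e1>", "</e1>", "<e2>", "</e2>"], special_tokens=True)

model = TemporalRelationClassifier()
model.encoder.resize_token_embeddings(len(tokenizer))

text = "She <e1> visited </e1> Kyoto before she <e2> moved </e2> to Tokyo ."
enc = tokenizer(text, return_tensors="pt")
ids = enc["input_ids"][0].tolist()
e1 = ids.index(tokenizer.convert_tokens_to_ids("<e1>"))
e2 = ids.index(tokenizer.convert_tokens_to_ids("<e2>"))

logits = model(enc["input_ids"], enc["attention_mask"],
               torch.tensor([e1]), torch.tensor([e2]))
print(LABELS[logits.argmax(-1).item()])  # untrained: prediction is arbitrary
```

In such a setup only the encoder, the markers, and a linear head are needed; a syntactic enhancement would typically add dependency-path features or a graph encoder on top of the same token representations.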

Current Status of Research Progress

2: Research has progressed on the whole more than it was originally planned.

Reason

According to the research plan, at the beginning of FY2019 we completed the preliminary step of surveying temporal information extraction, its related work, and the available datasets. We built a GPU server with two high-end NVIDIA Titan RTX 24 GB graphics cards to accelerate the experiments.
We have conducted a series of experiments with our proposed models. The results suggest that we obtained state-of-the-art performance against the existing work. We published our work at two domestic conferences (言語処理学会2020, 人工知能学会2020) and have written an English paper for submission to an upcoming international conference (EMNLP 2020).

Strategy for Future Research Activity

1) The current work suggests that pre-trained BERT outperforms the existing recurrent neural networks by a large margin. We plan to explore more task-specific solutions by incorporating temporal knowledge into the temporal relation classification task.
2) Temporal information extraction is composed of a pipeline of temporal mention recognition, modality classification, and relation extraction. A known drawback of pipeline models is that errors propagate from the early stages. We plan to introduce joint models to overcome this issue; see the multi-task sketch after this list.
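
As a hedged illustration of the joint direction, the sketch below shares one BERT encoder between a token-level mention tagger and the pair-level relation head and trains both with a single weighted loss. The head layout, the label sizes, and the loss weight `alpha` are assumptions, not the project's design.

```python
# Minimal sketch of a joint (multi-task) model for temporal IE.
# NOT the project's design: head layout, label sizes, and the loss
# weight `alpha` are illustrative assumptions.
import torch
import torch.nn as nn
from transformers import AutoModel

class JointTemporalModel(nn.Module):
    def __init__(self, model_name="bert-base-uncased",
                 num_tags=5, num_relations=4):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_name)  # shared encoder
        hidden = self.encoder.config.hidden_size
        self.tagger = nn.Linear(hidden, num_tags)        # BIO mention tags
        self.rel_head = nn.Linear(2 * hidden, num_relations)

    def forward(self, input_ids, attention_mask, e1_idx, e2_idx):
        h = self.encoder(input_ids=input_ids,
                         attention_mask=attention_mask).last_hidden_state
        tag_logits = self.tagger(h)                      # (batch, seq, tags)
        b = torch.arange(h.size(0))
        rel_logits = self.rel_head(
            torch.cat([h[b, e1_idx], h[b, e2_idx]], dim=-1))
        return tag_logits, rel_logits

def joint_loss(tag_logits, rel_logits, tag_gold, rel_gold, alpha=0.5):
    # A single backward pass updates the shared encoder from both tasks,
    # so the tagger and the relation head co-adapt rather than freezing
    # early-stage errors into the pipeline.
    ce = nn.CrossEntropyLoss(ignore_index=-100)  # -100 masks padding tokens
    tag_loss = ce(tag_logits.view(-1, tag_logits.size(-1)), tag_gold.view(-1))
    rel_loss = ce(rel_logits, rel_gold)
    return alpha * tag_loss + (1 - alpha) * rel_loss
```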

  • Research Products (2 results)


  • [Presentation] Dependency Enhanced Contextual Representations for Japanese Temporal Relation Classification (2019)

    • Author(s)
      Chenjing Geng, Lis Kanashiro Pereira, Fei Cheng, Masayuki Asahara, Ichiro Kobayashi
    • Organizer
      言語処理学会2020
  • [Presentation] Japanese Temporal Relation Classification Using Dependency Relations and Contextual Representations (依存関係と文脈表現を用いた日本語時間関係識別) (2019)

    • Author(s)
      Chenjing Geng, Fei Cheng, Lis Kanashiro Pereira, Masayuki Asahara, Ichiro Kobayashi
    • Organizer
      人工知能学会2020


Published: 2021-01-27  
