2022 Fiscal Year Annual Research Report
Temporal knowledge supervision for pre-training transfer learning models
Publicly Offered Research
Project Area | Chronogenesis: how the mind generates time |
Project/Area Number | 21H00308 |
Research Institution | Kyoto University |
Principal Investigator | CHENG Fei (程 飛), Kyoto University, Graduate School of Informatics, Program-Specific Assistant Professor (70801570) |
Project Period (FY) | 2021-04-01 – 2023-03-31 |
Keywords | Temporal Reasoning / Commonsense Reasoning / Large Language Model / Weak Supervision |
Outline of Annual Research Achievements |
Large language models (LLMs) often lack the ability to reason about numerical and temporal commonsense knowledge, such as how long an event lasts or how frequently it occurs. Enhancing the reasoning capability of off-the-shelf LLMs with large-scale weak supervision has therefore become a crucial topic. To reduce the reliance on human annotation, we proposed a bimodal voting strategy for obtaining high-quality weakly supervised temporal knowledge. We re-trained off-the-shelf LLMs on this weak supervision and observed significant improvements in temporal commonsense reasoning. We also explored a novel approach to identifying semantic relations (including temporal relations) between two events by revealing the labels of the most similar training examples. Several papers were accepted at top AI conferences (EMNLP, EACL) and at domestic conferences.
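The voting idea above can be illustrated with a minimal sketch: two independent weak labelers annotate the same instances, and only instances on which both agree are kept, on the assumption that agreement correlates with label quality. The labeler functions, label set, and example sentences below are purely illustrative stand-ins, not the report's actual pipeline.

```python
# Hypothetical agreement-based voting filter for weak supervision.
# Two toy labelers assign an event-duration label; only instances
# where both agree are retained as (pseudo-)training data.

def labeler_pattern(text):
    # Toy rule-based labeler: surface cues for short durations.
    return "SHORT" if "minutes" in text or "seconds" in text else "LONG"

def labeler_lm(text):
    # Stand-in for a second labeler (e.g. a language model); here
    # just another heuristic so the sketch stays self-contained.
    return "SHORT" if "coffee" in text else "LONG"

def voting_filter(corpus):
    """Keep (text, label) pairs on which both labelers agree."""
    kept = []
    for text in corpus:
        a, b = labeler_pattern(text), labeler_lm(text)
        if a == b:
            kept.append((text, a))
    return kept

corpus = [
    "He drank coffee for five minutes.",        # both SHORT -> kept
    "The bridge took years to build.",          # both LONG  -> kept
    "The coffee plantation grew for decades.",  # disagree   -> dropped
]
print(voting_filter(corpus))
```

A real pipeline would replace the two heuristics with genuinely complementary sources (e.g. pattern extraction and a pre-trained model), since the filter only helps when the labelers' errors are not strongly correlated.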
|
Research Progress Status | Not entered, as FY2022 is the final fiscal year of the project. |
Strategy for Future Research Activity | Not entered, as FY2022 is the final fiscal year of the project. |