2022 Fiscal Year Research-status Report
Knowledge-Base-Grounded Language Models
Project/Area Number | 21K17814 |
Research Institution | Institute of Physical and Chemical Research |
Principal Investigator | HEINZERLING BENJAMIN RIKEN, Center for Advanced Intelligence Project, Research Scientist (50846491) |
Project Period (FY) | 2021-04-01 – 2024-03-31 |
Keywords | language model / knowledge base / world knowledge / grounding |
Outline of Annual Research Achievements |
In the second year of the grant period we devised, implemented, and evaluated a neural network model architecture for combining symbolic information from a knowledge base with non-symbolic representations of textual information.
The model architecture consists of two multi-layered encoder stacks, one for symbolic information and one for textual information. The two encoder stacks interact at arbitrary layers via cross-attention and gates that determine how much each encoder uses the information from the other encoder to update its internal representations.
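The gated cross-attention interaction between the two encoder stacks can be illustrated with a minimal numpy sketch. This is not the published implementation; the shapes, the sigmoid gate, and the function names are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def gated_cross_attention(text_h, kb_h, w_gate):
    """One illustrative interaction step: text encoder states attend
    over knowledge-base encoder states, and a gate decides how much
    of the retrieved KB context enters the updated representation.

    text_h: (n_text_tokens, d)  hidden states of the textual encoder
    kb_h:   (n_kb_entries, d)   hidden states of the symbolic encoder
    w_gate: (d, d)              gate projection (hypothetical parameter)
    """
    d = text_h.shape[-1]
    # scaled dot-product cross-attention: text queries, KB keys/values
    scores = text_h @ kb_h.T / np.sqrt(d)          # (n_text, n_kb)
    attn = softmax(scores, axis=-1)
    kb_context = attn @ kb_h                        # (n_text, d)
    # sigmoid gate per token and dimension controls information flow
    gate = 1.0 / (1.0 + np.exp(-(text_h @ w_gate)))
    return gate * kb_context + (1.0 - gate) * text_h
```

A symmetric step with the roles of `text_h` and `kb_h` swapped would let the symbolic encoder read from the textual encoder; in the described architecture such interactions can be placed at arbitrary layers of the two stacks.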
Evaluation on distantly-supervised relation extraction benchmarks demonstrated state-of-the-art performance. The work was published at EMNLP 2022, as well as domestically at NLP 2023 where it was recognized as an outstanding paper.
|
Current Status of Research Progress |
3: Progress in research has been slightly delayed.
Reason
Some of the planned experiments could not be conducted because the primary computing resource (ABCI) became unexpectedly unavailable due to budgeting constraints.
|
Strategy for Future Research Activity |
The rapid development of large language models and generative AI requires adjusting our planned model design, implementation, and evaluation.
|
Causes of Carryover |
The budget could not be used as planned for ABCI cluster fees ("ABCI points") because ABCI stopped selling points earlier than expected.
|