Research Project/Area Number |
21K17814
|
Research Institution | RIKEN |
Principal Investigator |
HEINZERLING BENJAMIN, RIKEN, Center for Advanced Intelligence Project, Research Scientist (50846491)
|
Project Period (FY) |
2021-04-01 – 2024-03-31
|
Keywords | language model / knowledge base / world knowledge / grounding |
Outline of Annual Research Achievements |
In the second year of the grant period we devised, implemented, and evaluated a neural network model architecture for combining symbolic information from a knowledge base with non-symbolic representations of textual information.
The model architecture consists of two multi-layered encoder stacks, one for symbolic information and one for textual information. The two encoder stacks interact at arbitrary layers via cross-attention, with gates that determine how much each encoder uses the other encoder's information to update its internal representations.
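The gated cross-attention interaction described above can be sketched as follows. This is a minimal illustrative sketch in plain Python, not the published implementation: the function names, the single-head attention, and the scalar sigmoid gate are simplifying assumptions (the actual model may use multi-head attention and learned, per-dimension gates).

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def cross_attention(queries, keys, values):
    # Single-head scaled dot-product attention: each query vector from one
    # encoder attends over the other encoder's key/value vectors.
    d = len(keys[0])
    out = []
    for q in queries:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in keys]
        w = softmax(scores)
        out.append([sum(wi * v[j] for wi, v in zip(w, values))
                    for j in range(len(values[0]))])
    return out

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def gated_update(hidden, other_hidden, gate_logit):
    # Blend one encoder's hidden states with states cross-attended from the
    # other encoder; the gate controls how much cross-encoder information
    # flows in (gate near 0 keeps the original representations unchanged).
    attended = cross_attention(hidden, other_hidden, other_hidden)
    g = sigmoid(gate_logit)
    return [[(1 - g) * h + g * a for h, a in zip(hrow, arow)]
            for hrow, arow in zip(hidden, attended)]
```

For example, `gated_update(text_states, kb_entity_states, gate_logit)` would let the textual encoder pull in symbolic knowledge-base information at a chosen layer, with the gate learned during training in the real model.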
Evaluation on distantly supervised relation extraction benchmarks demonstrated state-of-the-art performance. The work was published at EMNLP 2022, as well as domestically at NLP 2023, where it was recognized as an outstanding paper.
|
Current Status of Research Progress (Category) |
3: Progress is slightly delayed
Reason
Some of the planned experiments could not be conducted because the primary computing resource (ABCI) became unexpectedly unavailable due to budgeting constraints.
|
Strategy for Future Research Activity |
The rapid development of large language models and generative AI requires adjusting our planned model design, implementation, and evaluation.
|
Reason for Incurring Carryover to the Next Fiscal Year |
The budget could not be used as planned for ABCI cluster fees ("ABCI points"), since ABCI stopped selling points earlier than expected.
|