
2023 Fiscal Year Final Research Report

Knowledge-Base-Grounded Language Models

Research Project

Project/Area Number 21K17814
Research Category

Grant-in-Aid for Early-Career Scientists

Allocation Type Multi-year Fund
Review Section Basic Section 61030: Intelligent informatics-related
Research Institution Institute of Physical and Chemical Research (RIKEN)

Principal Investigator

Heinzerling Benjamin  RIKEN, Center for Advanced Intelligence Project, Deputy Team Leader (50846491)

Project Period (FY) 2021-04-01 – 2024-03-31
Keywords Language models / structured knowledge / interpretability / explainability / knowledge representation / knowledge base
Outline of Final Research Achievements

This research project attained two main achievements. The first is a language model (LM) architecture that enables better integration of structured knowledge. While LMs are commonly trained on large amounts of text, it is often desirable to integrate specific, structured knowledge, such as a proprietary in-house knowledge base or other knowledge that is not covered by the LM's training data. Here, we developed a bi-encoder architecture that enables such integration without requiring costly retraining.
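To illustrate the general idea behind a bi-encoder, here is a minimal, hypothetical sketch: one encoder embeds text and a separate encoder embeds knowledge-base entries, with dot-product scoring linking the two. The encoders below are toy stand-ins (character hashes), not the project's actual model; in the real architecture, one side would be a pretrained LM kept frozen, so updating the knowledge base only requires re-encoding its entries.

```python
import numpy as np

DIM = 8

def encode_text(text: str) -> np.ndarray:
    """Toy stand-in for an LM text encoder: hash bytes into a vector."""
    v = np.zeros(DIM)
    for i, ch in enumerate(text.encode()):
        v[i % DIM] += ch
    return v / np.linalg.norm(v)

def encode_entity(name: str) -> np.ndarray:
    """Toy stand-in for the separate knowledge-base entry encoder."""
    v = np.zeros(DIM)
    for i, ch in enumerate(name.encode()):
        v[(i + 3) % DIM] += ch * 1.7
    return v / np.linalg.norm(v)

# A tiny "knowledge base"; its embeddings can be recomputed at any
# time without touching (retraining) the text encoder.
entities = ["Tokyo", "Kyoto", "Osaka"]
entity_matrix = np.stack([encode_entity(e) for e in entities])

def retrieve(query: str) -> str:
    """Score the query against all entries by dot product."""
    scores = entity_matrix @ encode_text(query)
    return entities[int(np.argmax(scores))]

print(retrieve("city on Osaka Bay"))
```

The design point the sketch tries to convey is the decoupling: because the two encoders meet only at a vector dot product, structured knowledge can be swapped or extended by re-embedding KB entries alone.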
The second achievement is an interpretation method for analyzing how well LMs represent a specific kind of structured knowledge, namely numeric properties such as a person's year of birth or a city's population.
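A common way to analyze how a model represents a numeric property is a linear probe: fit a regression from hidden states to the property's value and check the fit quality. The sketch below uses simulated hidden states (the real method would use activations extracted from an LM); all names and the data-generating setup are illustrative assumptions, not the project's actual procedure.

```python
import numpy as np

rng = np.random.default_rng(42)
n, dim = 200, 16

# Simulated numeric property, e.g. year of birth.
years = rng.integers(1900, 2000, size=n).astype(float)

# Simulated hidden states: the property is encoded along one
# direction in activation space, plus small noise.
direction = rng.normal(size=dim)
hidden = np.outer(years, direction) + rng.normal(scale=0.1, size=(n, dim))

# Least-squares linear probe from hidden states to the property.
w, *_ = np.linalg.lstsq(hidden, years, rcond=None)
pred = hidden @ w

# R^2 close to 1 suggests the property is linearly decodable.
r2 = 1 - np.sum((years - pred) ** 2) / np.sum((years - years.mean()) ** 2)
print(f"probe R^2 = {r2:.3f}")
```

With real LM activations, the interesting question is how this fit quality varies across layers and properties, which is what makes probing useful as an interpretability tool.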

Free Research Field

Natural Language Processing

Academic Significance and Societal Importance of the Research Achievements

The first achievement provides an efficient method for integrating structured knowledge into existing language models, which allows users to adapt LMs to their specific needs without costly retraining.
The second achievement improves our understanding of how LMs represent knowledge, thereby increasing transparency.


Published: 2025-01-30  
