2021 Fiscal Year Final Research Report
Semi-supervised all-words WSD by co-training of forward LSTM and backward LSTM
Project/Area Number | 19K12093
Research Category | Grant-in-Aid for Scientific Research (C)
Allocation Type | Multi-year Fund
Section | General
Review Section | Basic Section 61030: Intelligent informatics-related
Research Institution | Ibaraki University
Principal Investigator | Shinnou Hiroyuki, Ibaraki University, Graduate School of Science and Engineering (Engineering Division), Professor (10250987)
Project Period (FY) | 2019-04-01 – 2022-03-31
Keywords | all-words WSD / semi-supervised learning / Co-training / BERT
Outline of Final Research Achievements | In general, a word has multiple senses (meanings). All-words WSD is the task of assigning to every word in an input sentence the sense it carries in that sentence. Since this task can be solved with BERT, a successor model to the LSTM, we investigated the use of BERT and showed how to apply it to various tasks, including this one.
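As an illustration of the kind of approach the outline describes, the sketch below applies BERT contextual embeddings to word sense disambiguation by nearest-neighbor matching against sense-annotated example sentences. This is a minimal sketch of one common BERT-based WSD technique, not necessarily the method developed in this project; the model checkpoint, the helper names, and the toy sense inventory are all illustrative assumptions.

# A minimal sketch, assuming the Hugging Face transformers and torch packages.
# Model name and helpers are illustrative, not taken from the report.
import torch
from transformers import AutoModel, AutoTokenizer

MODEL = "bert-base-uncased"  # assumed checkpoint; any BERT model works the same way
tokenizer = AutoTokenizer.from_pretrained(MODEL)
model = AutoModel.from_pretrained(MODEL)
model.eval()

def word_embedding(sentence: str, word: str) -> torch.Tensor:
    """Mean BERT embedding of the subword tokens of the first occurrence of `word`."""
    words = sentence.split()
    enc = tokenizer(words, is_split_into_words=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state[0]  # (num_tokens, hidden_dim)
    target = words.index(word)
    token_idx = [i for i, w in enumerate(enc.word_ids()) if w == target]
    return hidden[token_idx].mean(dim=0)

def disambiguate(sentence: str, word: str, sense_examples: dict) -> str:
    """Pick the sense whose annotated example is closest in embedding space."""
    v = word_embedding(sentence, word)
    sims = {
        sense: torch.cosine_similarity(v, word_embedding(s, w), dim=0).item()
        for sense, (s, w) in sense_examples.items()
    }
    return max(sims, key=sims.get)

# Toy sense inventory (illustrative); in all-words WSD this lookup would be
# repeated for every content word of the input sentence.
senses = {
    "financial_institution": ("She deposited money at the bank .", "bank"),
    "river_edge": ("They sat on the bank of the river .", "bank"),
}
print(disambiguate("He opened an account at the bank .", "bank", senses))

Nearest-neighbor matching over contextual embeddings suits the low-resource setting the report emphasizes, since it needs only a handful of sense-annotated examples per word rather than a separately trained classifier.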
Free Research Field | Natural Language Processing
Academic Significance and Societal Importance of the Research Achievements | Various natural language processing tasks can be solved with machine learning, but the cost of constructing the training data that machine learning requires is a serious problem, and it is especially pronounced for all-words WSD, the task of this study. BERT is a pretrained model, and by using it a high-accuracy model can be learned from a small amount of training data. We were able to show how to use BERT for various tasks, including the one addressed in this research project.