2022 Fiscal Year Final Research Report
Model Switching Criteria in Dynamic Model Learning of Neural Networks
Project/Area Number | 19K12151 |
Research Category | Grant-in-Aid for Scientific Research (C) |
Allocation Type | Multi-year Fund |
Section | General |
Review Section | Basic Section 61040: Soft computing-related |
Research Institution | University of Tsukuba |
Principal Investigator | |
Project Period (FY) | 2019-04-01 – 2023-03-31 |
Keywords | Neural networks / Dynamic model learning / Transfer learning / Distillation |
Outline of Final Research Achievements |
This project concerns the “dynamic model learning” of neural networks, in which the training process switches among multiple models and maps within a single learning run. It aimed to clarify the methods and conditions for efficiently obtaining a superior map in the desired target model. For model compression of convolutional neural networks (CNNs) for image classification using dynamic model learning, methods based on multi-stage model switching and on map selection by distillation with selective transfer of meritorious features were proposed. It was shown that a gradual change in model size and merit-based distillation both contribute to improving the performance of the compressed networks. In addition, a method for modality selection in image classification using multimodal features was proposed, and its effectiveness was demonstrated.
|
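As a concrete illustration of the kind of distillation step that the compression methods above build on, the following is a minimal sketch in PyTorch. It only shows generic teacher–student distillation; the function and variable names, the temperature T, and the weighting alpha are illustrative assumptions and do not reflect the project's actual models, switching criteria, or merit-based transfer.

import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    # Soft-target term: KL divergence between temperature-softened
    # teacher and student distributions, scaled by T^2 as in standard
    # knowledge distillation.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard-label term: ordinary cross-entropy on the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

def train_step(teacher, student, optimizer, images, labels):
    # Hypothetical single update: the large teacher is frozen, and the
    # smaller (compressed) student is trained against its outputs.
    teacher.eval()
    with torch.no_grad():
        t_logits = teacher(images)
    s_logits = student(images)
    loss = distillation_loss(s_logits, t_logits, labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

On this reading, multi-stage switching would repeat such a step over a sequence of progressively smaller student models, each distilled from the previous one, which corresponds to the reported gradual change in model size.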
Free Research Field | Pattern recognition |
Academic Significance and Societal Importance of the Research Achievements |
This research examined the learning processes that pass through multiple neural network models, as seen in distillation and transfer learning for building pattern recognition systems, focusing on the methods common to them for selecting models and maps at the time of model switching, from the viewpoints of learning efficiency and recognition accuracy. It proposed new learning schemes and showed that they can make learning more efficient and improve recognition accuracy, thereby contributing to the practical use of learning schemes that are not limited to a single model. This makes it possible to realize pattern recognition systems of a given capability on devices with more limited computational resources than before, and thus contributes to the deployment of advanced pattern recognition systems on tablets, IoT devices, and similar platforms.
|