2022 Fiscal Year Annual Research Report
Scaling up CNN computations for data-intensive scientific applications
Project/Area Number | 20K19823 |
Research Institution | Kobe University |
Principal Investigator | |
Project Period (FY) | 2020-04-01 – 2023-03-31 |
Keywords | Deep Learning for science / Computational Efficiency / Computer Vision / ConvNets |
Outline of Annual Research Achievements |
In recent years, deep learning has become a vital technology across many fields, but its computationally intensive nature limits its accessibility. This research aimed to develop novel computational tools that improve the efficiency and performance of deep learning models, reducing their computational burden. The effectiveness of these algorithms was tested on applications in neuroscience, biodiversity monitoring, and materials science. The research produced tangible improvements in memory consumption and in computational and algorithmic efficiency on the technical side, and fostered interdisciplinary collaboration on the organizational side. Beyond our original application targets, the methods we developed have been applied to prototyping low-level vision models in both industry and academia and to improving hydrological models for water resource management. Meanwhile, the advent of the "foundation models" era during the course of this project has brought ever larger proposed models. State-of-the-art model sizes have grown much faster than computational capability, which further underscores the need for efficient computation in order to democratize access to these technologies.
|