Project/Area Number | 20H04250 |
Research Category | Grant-in-Aid for Scientific Research (B) |
Allocation Type | Single-year Grants |
Section | General |
Review Section | Basic Section 61030: Intelligent informatics-related |
Research Institution | Institute of Physical and Chemical Research |
Principal Investigator | Ha Quang Minh, RIKEN Center for Advanced Intelligence Project, Unit Leader (90868928) |
Project Period (FY) | 2020-04-01 – 2023-03-31 |
Project Status | Completed (Fiscal Year 2022) |
Budget Amount | ¥6,240,000 (Direct Cost: ¥4,800,000; Indirect Cost: ¥1,440,000) |
Fiscal Year 2022: ¥1,690,000 (Direct Cost: ¥1,300,000; Indirect Cost: ¥390,000)
Fiscal Year 2021: ¥1,690,000 (Direct Cost: ¥1,300,000; Indirect Cost: ¥390,000)
Fiscal Year 2020: ¥2,860,000 (Direct Cost: ¥2,200,000; Indirect Cost: ¥660,000)
Keywords | Gaussian measures / Gaussian processes / Optimal Transport / Information Geometry / Riemannian geometry / Divergences / Wasserstein distance / Entropic regularization / Functional data analysis / Covariance operators / Fisher-Rao metric / Hilbert manifold / RKHS / Riemannian manifolds / Hilbert space / Kernel methods |
Outline of Research at the Start |
Geometrical methods have become increasingly important in machine learning and statistics. This project aims to develop novel machine learning and statistical methods on infinite-dimensional manifolds, with practical applications in computer vision, signal processing, and brain-computer interfaces.
|
Outline of Final Research Achievements |
We have obtained many results on the geometry of infinite-dimensional Gaussian measures, Gaussian processes, and infinite-dimensional positive definite operators in the framework of Optimal Transport and Information Geometry. These include: (1) explicit mathematical formulas for many of the quantities of interest, including the entropic regularized Wasserstein distance, regularized Kullback-Leibler and Rényi divergences, and the regularized Fisher-Rao distance; these can readily be employed in machine learning and statistical algorithms. (2) Extensive theoretical analysis showing, in particular, dimension-independent sample complexities for the above regularized distances and divergences; these guarantee the consistency of the finite-dimensional methods used to approximate them in practice. Moreover, we show explicitly that the regularized distances and divergences possess many favorable theoretical properties over their exact counterparts, with implications for practical algorithms.
|
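As context for the closed-form expressions mentioned in (1), a minimal finite-dimensional sketch may help: the classical (unregularized) squared 2-Wasserstein distance between Gaussians N(m0, Σ0) and N(m1, Σ1) has the well-known closed form ||m0 − m1||² + tr(Σ0) + tr(Σ1) − 2 tr((Σ0^{1/2} Σ1 Σ0^{1/2})^{1/2}); the project's formulas extend such expressions to the entropic-regularized and infinite-dimensional settings. The function names below are illustrative, not taken from the project's publications.

```python
import numpy as np

def _psd_sqrt(A):
    """Symmetric square root of a symmetric positive semi-definite matrix."""
    w, V = np.linalg.eigh(A)
    return (V * np.sqrt(np.clip(w, 0.0, None))) @ V.T

def gaussian_w2_squared(m0, S0, m1, S1):
    """Squared 2-Wasserstein distance between N(m0, S0) and N(m1, S1):
    ||m0 - m1||^2 + tr(S0) + tr(S1) - 2 tr((S0^{1/2} S1 S0^{1/2})^{1/2})."""
    S0_half = _psd_sqrt(S0)
    cross = _psd_sqrt(S0_half @ S1 @ S0_half)
    return float(np.sum((m0 - m1) ** 2)
                 + np.trace(S0) + np.trace(S1)
                 - 2.0 * np.trace(cross))
```

For example, for zero-mean Gaussians in R² with Σ0 = I and Σ1 = 4I, the squared distance is 2 + 8 − 2·tr(2I) = 2.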
Academic Significance and Societal Importance of the Research Achievements |
Our results are the first in the setting of infinite-dimensional Gaussian measures and Gaussian processes. They (1) elucidate many theoretical properties of Optimal Transport; (2) have important consequences for the mathematical foundations of Gaussian process methods in machine learning.
|