Machine learning and statistical methods on infinite-dimensional manifolds

Research Project

Project/Area Number 20H04250
Research Category

Grant-in-Aid for Scientific Research (B)

Allocation Type Single-year Grants
Section General
Review Section Basic Section 61030: Intelligent informatics-related
Research Institution Institute of Physical and Chemical Research

Principal Investigator

Ha Quang Minh  RIKEN (Institute of Physical and Chemical Research), Center for Advanced Intelligence Project, Unit Leader (90868928)

Project Period (FY) 2020-04-01 – 2023-03-31
Project Status Completed (Fiscal Year 2022)
Budget Amount
¥6,240,000 (Direct Cost: ¥4,800,000, Indirect Cost: ¥1,440,000)
Fiscal Year 2022: ¥1,690,000 (Direct Cost: ¥1,300,000, Indirect Cost: ¥390,000)
Fiscal Year 2021: ¥1,690,000 (Direct Cost: ¥1,300,000, Indirect Cost: ¥390,000)
Fiscal Year 2020: ¥2,860,000 (Direct Cost: ¥2,200,000, Indirect Cost: ¥660,000)
Keywords Gaussian measures / Gaussian processes / Optimal transport / Information geometry / Riemannian geometry / Divergences / Wasserstein distance / Entropic regularization / Functional data analysis / Covariance operators / Fisher-Rao metric / Hilbert manifold / RKHS / Riemannian manifolds / Hilbert space / Kernel methods
Outline of Research at the Start

Geometrical methods have become increasingly important in machine learning and statistics. This project aims to develop novel machine learning and statistical methods on infinite-dimensional manifolds, with practical applications in computer vision, signal processing, and brain-computer interfaces.

Outline of Final Research Achievements

We have obtained many results on the geometry of infinite-dimensional Gaussian measures, Gaussian processes, and infinite-dimensional positive definite operators in the frameworks of Optimal Transport and Information Geometry. These include:
(1) Explicit mathematical formulas for many of the key quantities involved, including the entropic-regularized Wasserstein distance, regularized Kullback-Leibler and Rényi divergences, and the regularized Fisher-Rao distance. These can readily be employed in machine learning and statistical algorithms.
(2) Extensive theoretical analysis establishing, in particular, dimension-independent sample complexities for the above regularized distances and divergences. These guarantee the consistency of the finite-dimensional approximations used to compute them in practice (an illustrative sketch follows below). Moreover, we show explicitly that the regularized distances and divergences possess many favorable theoretical properties compared with their exact counterparts, with implications for practical algorithms.
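
For reference (not a result of this project), the exact 2-Wasserstein distance between two Gaussian measures N(m_1, C_1) and N(m_2, C_2) has the classical closed form

    W_2^2(N(m_1, C_1), N(m_2, C_2)) = \|m_1 - m_2\|^2 + \mathrm{tr}(C_1) + \mathrm{tr}(C_2) - 2\,\mathrm{tr}\big[(C_1^{1/2} C_2 C_1^{1/2})^{1/2}\big],

which also holds for Gaussian measures on an infinite-dimensional Hilbert space with trace-class covariance operators; the project's explicit formulas are entropic-regularized and information-geometric counterparts of expressions of this kind. The short Python sketch below is illustrative only (it is not the project's code, and all names in it are hypothetical): it evaluates the closed form from plug-in estimates of the means and covariances, the kind of finite-sample approximation whose consistency the results above concern.

    import numpy as np
    from scipy.linalg import sqrtm

    def gaussian_w2(m1, C1, m2, C2):
        # Exact 2-Wasserstein distance between N(m1, C1) and N(m2, C2),
        # using the closed form quoted above (illustrative sketch only).
        root = sqrtm(C1)
        cross = np.real(sqrtm(root @ C2 @ root))  # discard tiny imaginary parts from sqrtm
        val = np.sum((m1 - m2) ** 2) + np.trace(C1 + C2) - 2.0 * np.trace(cross)
        return float(np.sqrt(max(val, 0.0)))

    # Hypothetical test: compare the plug-in (empirical) estimate with the population value.
    rng = np.random.default_rng(0)
    d, n = 5, 2000
    A, B = rng.normal(size=(d, d)), rng.normal(size=(d, d))
    C1, C2 = A @ A.T + np.eye(d), B @ B.T + np.eye(d)
    X = rng.multivariate_normal(np.zeros(d), C1, size=n)
    Y = rng.multivariate_normal(np.ones(d), C2, size=n)

    print(gaussian_w2(X.mean(axis=0), np.cov(X.T), Y.mean(axis=0), np.cov(Y.T)))  # empirical
    print(gaussian_w2(np.zeros(d), C1, np.ones(d), C2))                           # population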

Academic Significance and Societal Importance of the Research Achievements

Our results are the first of their kind in the setting of infinite-dimensional Gaussian measures and Gaussian processes. They (1) elucidate many theoretical properties of Optimal Transport in this setting and (2) have important consequences for the mathematical foundations of Gaussian process methods in machine learning.

Report

(4 results)
  • 2022 Annual Research Report / Final Research Report (PDF)
  • 2021 Annual Research Report
  • 2020 Annual Research Report
  • Research Products

    (7 results)

Journal Article (3 results) (of which Int'l Joint Research: 3 results, Peer Reviewed: 3 results) / Presentation (4 results) (of which Int'l Joint Research: 3 results, Invited: 3 results)

  • [Journal Article] Convergence and finite sample approximations of entropic regularized Wasserstein distances in Gaussian and RKHS settings (2023)

    • Author(s)
      Ha Quang Minh
    • Journal Title

      Analysis and Applications

      Volume: 21 Issue: 03 Pages: 719-775

    • DOI

      10.1142/s0219530522500142

    • Related Report
      2022 Annual Research Report
    • Peer Reviewed / Int'l Joint Research
  • [Journal Article] Finite Sample Approximations of Exact and Entropic Wasserstein Distances Between Covariance Operators and Gaussian Processes (2022)

    • Author(s)
      Ha Quang Minh
    • Journal Title

      SIAM/ASA Journal on Uncertainty Quantification

      Volume: 10 Issue: 1 Pages: 96-124

    • DOI

      10.1137/21m1410488

    • Related Report
      2021 Annual Research Report
    • Peer Reviewed / Int'l Joint Research
  • [Journal Article] Entropic Regularization of Wasserstein Distance Between Infinite-Dimensional Gaussian Measures and Gaussian Processes (2022)

    • Author(s)
      Ha Quang Minh
    • Journal Title

      Journal of Theoretical Probability

      Volume: - Issue: 1 Pages: 201-296

    • DOI

      10.1007/s10959-022-01165-1

    • Related Report
      2021 Annual Research Report
    • Peer Reviewed / Int'l Joint Research
  • [Presentation] Fisher-Rao Riemannian geometry of equivalent Gaussian measures on Hilbert space (2023)

    • Author(s)
      Ha Quang Minh
    • Organizer
      6th International Conference on Geometric Science of Information
    • Related Report
      2020 Annual Research Report
    • Int'l Joint Research
  • [Presentation] Renyi divergences in RKHS and Gaussian process settings (2022)

    • Author(s)
      Ha Quang Minh
    • Organizer
      International Conference on Information Geometry for Data Science
    • Related Report
      2022 Annual Research Report
    • Invited
  • [Presentation] Riemannian distances between infinite-dimensional covariance operators and Gaussian processes (2021)

    • Author(s)
      Ha Quang Minh
    • Organizer
      4th International Conference on Econometrics and Statistics
    • Related Report
      2021 Annual Research Report
    • Int'l Joint Research / Invited
  • [Presentation] Regularized information geometric and optimal transport distances between covariance operators and Gaussian processes (2021)

    • Author(s)
      Ha Quang Minh
    • Organizer
      Conference on Mathematics of Machine Learning
    • Related Report
      2021 Annual Research Report
    • Int'l Joint Research / Invited

Published: 2020-04-28   Modified: 2024-01-30  
