
Theoretical Analysis of Transfer Learning and Its Applications

Research Project

Project/Area Number 17K12653
Research Category

Grant-in-Aid for Young Scientists (B)

Allocation Type Multi-year Fund
Research Field Statistical science
Research Institution The University of Tokyo (2020-2022)
Institute of Physical and Chemical Research (2017-2019)

Principal Investigator

Kumagai Wataru  The University of Tokyo, Graduate School of Engineering (Faculty of Engineering), Project Assistant Professor (20747167)

Project Period (FY) 2017-04-01 – 2023-03-31
Project Status Completed (Fiscal Year 2022)
Budget Amount
¥4,030,000 (Direct Cost: ¥3,100,000, Indirect Cost: ¥930,000)
Fiscal Year 2020: ¥780,000 (Direct Cost: ¥600,000, Indirect Cost: ¥180,000)
Fiscal Year 2019: ¥780,000 (Direct Cost: ¥600,000, Indirect Cost: ¥180,000)
Fiscal Year 2018: ¥780,000 (Direct Cost: ¥600,000, Indirect Cost: ¥180,000)
Fiscal Year 2017: ¥1,690,000 (Direct Cost: ¥1,300,000, Indirect Cost: ¥390,000)
Keywords Transfer learning / Meta-learning / Parameter transfer / Universal approximation theorem / Group symmetry / Neural processes / Convolution / Machine learning / Generalization error / Feature extraction
Outline of Final Research Achievements

As the first main result, we derived theoretical bounds for transferring parametric models across domains. Notably, this framework allows complex parametric models to serve as feature extractors, enabling a theoretical treatment of deep neural networks and sparse coding. As the second main result, we demonstrated the universality of meta-learners under an algebraic property called equivariance. Equivariance arises naturally in data processing and in natural processes, and it offers the advantage of enhancing learning efficiency through processing with equivariant neural architectures. As the third main result, we derived a decomposition theorem for the difference in expected risks with respect to joint distributions.
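The equivariance property referred to above can be made concrete with a small self-contained sketch (illustrative only, not code from this project): a map f is equivariant to a group action when f(g·x) = g·f(x). Circular convolution, for example, is equivariant to cyclic shifts, which is the symmetry that convolutional architectures exploit for more efficient learning.

```python
def circular_conv(x, w):
    """Periodic 1-D convolution: y[i] = sum_j w[j] * x[(i - j) mod n]."""
    n = len(x)
    return [sum(w[j] * x[(i - j) % n] for j in range(len(w))) for i in range(n)]

def shift(x, s):
    """Cyclic shift by s positions (the group action of Z/nZ on signals)."""
    n = len(x)
    return [x[(i - s) % n] for i in range(n)]

x = [0.5, -1.2, 2.0, 0.3, -0.7, 1.1]  # toy signal
w = [1.0, -0.5, 0.25]                 # toy filter

# Equivariance check: convolving the shifted signal gives the same result
# as shifting the convolved signal, i.e. conv(shift(x)) == shift(conv(x)).
lhs = circular_conv(shift(x, 2), w)
rhs = shift(circular_conv(x, w), 2)
assert all(abs(a - b) < 1e-12 for a, b in zip(lhs, rhs))
```

Here the shift plays the role of the group element g; an equivariant architecture commutes with every such g, so it never needs to re-learn the same pattern at each position.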

Academic Significance and Societal Importance of the Research Achievements

We first describe the academic significance. Transfer learning is an indispensable technique in current machine learning and artificial intelligence research. This project clarified part of the theoretical side of transfer learning and is expected to contribute to the construction of efficient models and of transfer learning methods. In particular, we proposed a novel equivariance-based model for meta-learning, an important result because it links symmetry in data, an algebraic property, to more efficient learning.
We next describe the societal significance. Transfer learning supports learning across many domains and has been successful in a wide range of applications. The results of this project contribute to building the theoretical foundations for future applications of transfer learning.

Report

(7 results)
  • 2022 Annual Research Report   Final Research Report (PDF)
  • 2021 Research-status Report
  • 2020 Research-status Report
  • 2019 Research-status Report
  • 2018 Research-status Report
  • 2017 Research-status Report
  • Research Products

    (12 results)


Journal Article (4 results) (of which Int'l Joint Research: 2, Peer Reviewed: 4, Open Access: 1)  Presentation (8 results) (of which Int'l Joint Research: 1, Invited: 2)

  • [Journal Article] Robust Label Prediction via Label Propagation and Geodesic k-Nearest Neighbor in Online Semi-Supervised Learning (2019)

    • Author(s)
      WADA Yuichiro、SU Siqiang、KUMAGAI Wataru、KANAMORI Takafumi
    • Journal Title

      IEICE Transactions on Information and Systems

      Volume: E102.D Issue: 8 Pages: 1537-1545

    • DOI

      10.1587/transinf.2018EDP7424

    • NAID

      130007686445

    • ISSN
      0916-8532, 1745-1361
    • Year and Date
      2019-08-01
    • Related Report
      2019 Research-status Report
    • Peer Reviewed / Int'l Joint Research
  • [Journal Article] Spectral Embedded Deep Clustering (2019)

    • Author(s)
      Wada Yuichiro、Miyamoto Shugo、Nakagama Takumi、Andeol Leo、Kumagai Wataru、Kanamori Takafumi
    • Journal Title

      Entropy

      Volume: 21 Issue: 8 Pages: 795-795

    • DOI

      10.3390/e21080795

    • Related Report
      2019 Research-status Report
    • Peer Reviewed / Open Access / Int'l Joint Research
  • [Journal Article] Variable Selection for Nonparametric Learning with Power Series Kernels (2019)

    • Author(s)
      Matsui Kota、Kumagai Wataru、Kanamori Kenta、Nishikimi Mitsuaki、Kanamori Takafumi
    • Journal Title

      Neural Computation

      Volume: 31 Issue: 8 Pages: 1718-1750

    • DOI

      10.1162/neco_a_01212

    • Related Report
      2019 Research-status Report
    • Peer Reviewed
  • [Journal Article] Risk bound of transfer learning using parametric feature mapping and its application to sparse coding (2019)

    • Author(s)
      Kumagai Wataru、Kanamori Takafumi
    • Journal Title

      Machine Learning

      Volume: 108 Issue: 11 Pages: 1975-2008

    • DOI

      10.1007/s10994-019-05805-2

    • Related Report
      2019 Research-status Report
    • Peer Reviewed
  • [Presentation] Langevin Autoencoders for Learning Deep Latent Variable Models (2022)

    • Author(s)
      Shohei Taniguchi
    • Organizer
      Advances in Neural Information Processing Systems 35 (NeurIPS 2022)
    • Related Report
      2022 Annual Research Report
    • Int'l Joint Research
  • [Presentation] From Transfer Learning to Continual Learning (2019)

    • Author(s)
      Wataru Kumagai
    • Organizer
      2019 workshop of the Grant-in-Aid for Scientific Research (A) project "Research on Next-Generation Optimization Techniques toward the Social Implementation of Machine Learning Systems"
    • Related Report
      2019 Research-status Report
    • Invited
  • [Presentation] From Transfer Learning to Continual Learning (2019)

    • Author(s)
      Wataru Kumagai
    • Organizer
      Informatics WINTER FESTA Episode 5
    • Related Report
      2019 Research-status Report
  • [Presentation] Theoretical Analysis of Transfer Learning Using the Wasserstein Distance (2018)

    • Author(s)
      Wataru Kumagai
    • Organizer
      The 21st Workshop on Information-Based Induction Sciences (IBIS 2018)
    • Related Report
      2018 Research-status Report
  • [Presentation] Mathematics of Transfer Learning (2018)

    • Author(s)
      Wataru Kumagai
    • Organizer
      The 2nd RIKEN AIP Joint Retreat of Mathematical Teams
    • Related Report
      2018 Research-status Report
    • Invited
  • [Presentation] Evaluation of Generalization Error in Parameter Transfer Learning (2017)

    • Author(s)
      Wataru Kumagai
    • Organizer
      KAKENHI Symposium "Mathematics of Statistics and Machine Learning and Their Applications"
    • Related Report
      2017 Research-status Report
  • [Presentation] Risk Upper Bounds in Parameter Transfer Learning (2017)

    • Author(s)
      Wataru Kumagai
    • Organizer
      Young Researchers Symposium on Statistics and Machine Learning "Statistical and Machine Learning Approaches to Large-Scale Complex Data"
    • Related Report
      2017 Research-status Report
  • [Presentation] Risk Upper Bounds in Parameter Transfer Learning (2017)

    • Author(s)
      Wataru Kumagai
    • Organizer
      2017 Japanese Joint Statistical Meeting
    • Related Report
      2017 Research-status Report


Published: 2017-04-28   Modified: 2024-01-30  
