
On model selection criteria under shrinkage estimation in greedy learning

Research Project

Project/Area Number 18K11433
Research Category

Grant-in-Aid for Scientific Research (C)

Allocation Type: Multi-year Fund
Section: General
Review Section: Basic Section 61030: Intelligent informatics-related
Research Institution: Mie University

Principal Investigator

Hagiwara Katsuyuki  Mie University, Faculty of Education, Professor (60273348)

Project Period (FY) 2018-04-01 – 2021-03-31
Project Status Completed (Fiscal Year 2020)
Budget Amount
¥2,990,000 (Direct Cost: ¥2,300,000, Indirect Cost: ¥690,000)
Fiscal Year 2020: ¥780,000 (Direct Cost: ¥600,000, Indirect Cost: ¥180,000)
Fiscal Year 2019: ¥1,300,000 (Direct Cost: ¥1,000,000, Indirect Cost: ¥300,000)
Fiscal Year 2018: ¥910,000 (Direct Cost: ¥700,000, Indirect Cost: ¥210,000)
Keywords: sparse learning / layered neural networks / model selection / regularization / shrinkage estimation / SURE / thresholding / LASSO / thresholding methods / neural networks / sparse regularization / greedy methods
Outline of Final Research Achievements

In this project, we considered a model selection problem common to both layered neural networks and sparse modeling, focusing on model selection under regularization and shrinkage methods. In sparse modeling, we derived a scaling method for LASSO that relaxes its bias problem, and we derived a risk-based model selection criterion for the estimate obtained under the proposed method; its effectiveness was confirmed through numerical experiments. Additionally, by introducing a scaling method, we derived a unified modeling method for a non-parametric orthogonal regression problem and analyzed the generalization properties of the proposed method. In layered neural networks, on the other hand, we found that a deep structure affects over-fitting to noise.
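The scaling idea above can be illustrated with the classical thresholding estimators it connects. The following is a minimal sketch, not the project's exact formulation: the function names and the single scaling constant `c` are illustrative assumptions. Soft thresholding (the coordinate-wise form of the LASSO estimate in an orthogonal design) shrinks every surviving coefficient by the threshold, which causes the bias problem; hard thresholding leaves surviving coefficients unchanged; multiplying the soft-threshold output by a factor `c >= 1` relaxes the bias and moves the estimator from the soft toward the hard end.

```python
import numpy as np

def soft_threshold(x, lam):
    # Soft thresholding: every surviving coefficient is shrunk
    # toward zero by lam, which introduces a constant bias.
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def hard_threshold(x, lam):
    # Hard thresholding: coefficients above the threshold are
    # kept unchanged (no shrinkage bias), others are set to zero.
    return np.where(np.abs(x) > lam, x, 0.0)

def scaled_soft_threshold(x, lam, c):
    # Illustrative scaling: multiplying the soft-threshold output
    # by c >= 1 relaxes the bias on surviving coefficients.
    # c = 1 recovers soft thresholding; larger c moves the
    # estimate toward the hard-thresholding end.
    return c * soft_threshold(x, lam)

x = np.array([-3.0, -0.5, 0.2, 1.5, 4.0])
print(soft_threshold(x, 1.0))          # surviving entries shrunk by 1.0
print(hard_threshold(x, 1.0))          # surviving entries kept as-is
print(scaled_soft_threshold(x, 1.0, 1.3))
```

In practice the threshold `lam` and the scaling would themselves be chosen by a model selection criterion such as SURE, which is the role of the risk-based criterion derived in the project.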

Academic Significance and Societal Importance of the Research Achievements

Recently, in the machine learning field, the two keywords deep learning and sparse learning have attracted attention and had a societal impact, yet research on the model selection problem is still ongoing. Although these fields have developed independently, layered neural networks, which underlie deep learning, and sparse learning share a common feature: both consider models constructed as linear combinations of functions that can be selected through learning. Previous research has shown that when such models are trained by greedy methods, the estimate of the prediction error does not take a form applicable as a model selection criterion. To solve this problem, this research considers model selection under shrinkage estimation.

Report

(4 results)
  • 2020 Annual Research Report / Final Research Report (PDF)
  • 2019 Research-status Report
  • 2018 Research-status Report
  • Research Products

    (7 results)


Journal Article (4 results) (of which Int'l Joint Research: 1 result, Peer Reviewed: 1 result, Open Access: 1 result); Presentation (3 results) (of which Int'l Joint Research: 2 results)

  • [Journal Article] Bridging between soft and hard thresholding by scaling2021

    • Author(s)
      Katsuyuki Hagiwara
    • Journal Title

      arXiv:2104.09703

      Volume: arXiv:2104.09703 Pages: 1-10

    • Related Report
      2020 Annual Research Report
  • [Journal Article] A Model Selection Criterion for LASSO Estimate with Scaling2019

    • Author(s)
      Katsuyuki Hagiwara
    • Journal Title

      Proceedings of Neural Information Processing 26th International Conference

      Volume: II Pages: 248-259

    • Related Report
      2019 Research-status Report
    • Peer Reviewed / Int'l Joint Research
  • [Journal Article] On an improvement of LASSO by scaling2018

    • Author(s)
      Katsuyuki Hagiwara
    • Journal Title

      arXiv:1808.07260

    • Related Report
      2018 Research-status Report
    • Open Access
  • [Journal Article] On a Fitting of a Heaviside Function by Deep ReLU Neural Networks2018

    • Author(s)
      Hagiwara Katsuyuki
    • Journal Title

      L. Cheng et al. (Eds.): ICONIP 2018, LNCS 11301

      Volume: I Pages: 59-69

    • DOI

      10.1007/978-3-030-04167-0_6

    • ISBN
      9783030041663, 9783030041670
    • Related Report
      2018 Research-status Report
  • [Presentation] A model selection criterion for LASSO estimate with scaling2019

    • Author(s)
      Katsuyuki Hagiwara
    • Organizer
IEICE Technical Committee on Information-Based Induction Sciences and Machine Learning (IBISML)
    • Related Report
      2019 Research-status Report
  • [Presentation] A Model Selection Criterion for LASSO Estimate with Scaling2019

    • Author(s)
      Katsuyuki Hagiwara
    • Organizer
      ICONIP2019
    • Related Report
      2019 Research-status Report
    • Int'l Joint Research
  • [Presentation] On a Fitting of a Heaviside Function by Deep ReLU Neural Networks2018

    • Author(s)
      Katsuyuki Hagiwara
    • Organizer
      ICONIP2018
    • Related Report
      2018 Research-status Report
    • Int'l Joint Research


Published: 2018-04-23   Modified: 2022-01-27  
