
A study of statistical properties of singular models such as the multi-layer perceptron

Research Project

Project/Area Number 18500171
Research Category

Grant-in-Aid for Scientific Research (C)

Allocation Type Single-year Grants
Section General
Research Field Sensitivity informatics/Soft computing
Research Institution Mie University

Principal Investigator

HAGIWARA Katsuyuki  Mie University, Faculty of Education, Associate Prof. (60273348)

Project Period (FY) 2006 – 2007
Project Status Completed (Fiscal Year 2007)
Budget Amount
¥2,500,000 (Direct Cost: ¥2,200,000, Indirect Cost: ¥300,000)
Fiscal Year 2007: ¥1,300,000 (Direct Cost: ¥1,000,000, Indirect Cost: ¥300,000)
Fiscal Year 2006: ¥1,200,000 (Direct Cost: ¥1,200,000)
Keywords singular model / multi-layer perceptron / model selection / training error / generalization error / shrinkage estimator / over-fitting / extreme value theory / variable basis / thresholding / nonparametric regression / plateau
Research Abstract

The multi-layer perceptron is known to be a singular model, in which the Fisher information matrix can be singular. The singularity arises from parametric basis functions whose shapes vary with the parameters. To reveal the statistical properties of such a singular model, this research focused on variable basis functions. We derived an upper bound on the degree of over-fitting under a restriction on the basis function outputs; in a multi-layer perceptron, for example, this restriction can be achieved by restricting the input weight space. Applying this result, we showed that, in regression with a single Gaussian unit, training frequently yields a very small value for the width parameter. We also derived the expectations of the training error and the generalization error of a learning machine whose basis functions are chosen from a finite set of orthogonal functions. This clarifies that AIC-type model selection criteria for machines with variable basis functions require information about the target function. We solved this problem by applying a shrinkage method. Furthermore, from the viewpoint of variable basis functions, we proposed a shrinkage method for nonparametric regression that produces a machine with low generalization error in less computational time. The results obtained in this research help to clarify the statistical properties of machines with variable basis functions, and thus of singular models such as multi-layer perceptrons.
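To illustrate the kind of orthogonal shrinkage method the abstract refers to, the following is a minimal sketch, not the project's actual algorithm: nonparametric regression by an orthonormal (DCT-II cosine) series under Gaussian noise, where empirical coefficients are hard-thresholded so that only components that rise above the noise level are kept. The basis choice, threshold rule, and known noise level are all assumptions for the sake of the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Noisy samples of a smooth target function on [0, 1].
n = 256
x = np.arange(n) / n
f = np.sin(2 * np.pi * x) + 0.5 * np.cos(6 * np.pi * x)  # target function
y = f + 0.3 * rng.standard_normal(n)                     # observations

# Orthonormal DCT-II cosine basis evaluated at the sample points:
# rows of Phi are orthonormal, so Phi @ Phi.T = I.
k = np.arange(n)
Phi = np.cos(np.pi * np.outer(k, x + 0.5 / n))
Phi[0] *= 1 / np.sqrt(2)
Phi *= np.sqrt(2 / n)

# Empirical coefficients, shrunk by hard thresholding at a
# universal-style level sigma * sqrt(2 log n) (sigma assumed known).
coef = Phi @ y
sigma = 0.3
thr = sigma * np.sqrt(2 * np.log(n))
coef_shrunk = np.where(np.abs(coef) > thr, coef, 0.0)

fit = Phi.T @ coef_shrunk

print("components kept:", int(np.count_nonzero(coef_shrunk)), "of", n)
print("MSE, no shrinkage:", float(np.mean((Phi.T @ coef - f) ** 2)))
print("MSE, shrunk fit  :", float(np.mean((fit - f) ** 2)))
```

With no shrinkage the square orthogonal basis interpolates the noise exactly (the fit equals `y`), while thresholding discards most coefficients and cuts the error against the target, which is the over-fitting-control effect of a shrinkage estimator with variable (selected) basis components.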

Report

(3 results)
  • 2007 Annual Research Report
  • 2007 Final Research Report Summary
  • 2006 Annual Research Report
  • Research Products

    (13 results)


All Journal Article (8 results) (of which Peer Reviewed: 4 results) Presentation (5 results)

  • [Journal Article] Relation between weight size and degree of over-fitting in neural network regression2008

    • Author(s)
      Katsuyuki Hagiwara, Kenji Fukumizu
    • Journal Title

      Neural Networks 21

      Pages: 48-58

    • Description
From the Final Research Report Summary (Japanese version)
    • Related Report
      2007 Final Research Report Summary
    • Peer Reviewed
  • [Journal Article] Relation between weight size and degree of over-fitting in neural network regression2008

    • Author(s)
Katsuyuki Hagiwara, Kenji Fukumizu
    • Journal Title

      Neural Networks 21

      Pages: 48-58

    • Description
From the Final Research Report Summary (English version)
    • Related Report
      2007 Final Research Report Summary
  • [Journal Article] Relation between weight size and degree of over-fitting in neural network regression2008

    • Author(s)
      Katsuyuki Hagiwara and Kenji Fukumizu
    • Journal Title

      Neural Networks 21

      Pages: 48-58

    • Related Report
      2007 Annual Research Report
    • Peer Reviewed
  • [Journal Article] On the expected prediction error of orthogonal regression with variable components2006

    • Author(s)
      Katsuyuki Hagiwara, Hiroshi Ishitani
    • Journal Title

      IEICE Trans. Fundamentals 89-A

      Pages: 3699-3709

    • NAID

      110007537877

    • Description
From the Final Research Report Summary (Japanese version)
    • Related Report
      2007 Final Research Report Summary 2006 Annual Research Report
    • Peer Reviewed
  • [Journal Article] On the expected prediction error of orthogonal regression with variable components2006

    • Author(s)
Katsuyuki Hagiwara, Hiroshi Ishitani
    • Journal Title

      IEICE Trans. Fundamentals 89-A

      Pages: 3699-3709

    • NAID

      110007537877

    • Description
From the Final Research Report Summary (English version)
    • Related Report
      2007 Final Research Report Summary
  • [Journal Article] Fundamentals of neural networks and theoretically important issues2006

    • Author(s)
Katsuyuki Hagiwara
    • Journal Title

Journal of Plasma and Fusion Research 82

      Pages: 282-286

    • NAID

      110006282063

    • Related Report
      2006 Annual Research Report
  • [Journal Article] Orthogonal shrinkage methods for nonparametric regression under Gaussian noise

    • Author(s)
      Katsuyuki Hagiwara
    • Journal Title

Proceedings of ICONIP 2007, Lecture Notes in Computer Science, Springer (in press)

    • Description
From the Final Research Report Summary (Japanese version)
    • Related Report
      2007 Final Research Report Summary
    • Peer Reviewed
  • [Journal Article] Orthogonal shrinkage methods for nonparametric regression under Gaussian noise

    • Author(s)
Katsuyuki Hagiwara
    • Journal Title

Proceedings of the International Conference on Neural Information Processing 2007, Lecture Notes in Computer Science, Springer (to appear)

    • Description
From the Final Research Report Summary (English version)
    • Related Report
      2007 Final Research Report Summary
  • [Presentation] Orthogonal shrinkage methods for nonparametric regression under Gaussian noise2007

    • Author(s)
      Katsuyuki Hagiwara
    • Organizer
14th International Conference on Neural Information Processing
    • Place of Presentation
      Kitakyushu, Japan
    • Year and Date
      2007-11-16
    • Related Report
      2007 Annual Research Report
  • [Presentation] Estimation of the expected prediction error of orthogonal regression with variable components2007

    • Author(s)
      Katsuyuki Hagiwara, Hiroshi Ishitani
    • Organizer
      2007 Hawaii International Conference on Statistics, Mathematics and Related Fields
    • Place of Presentation
      Hawaii, USA
    • Description
From the Final Research Report Summary (Japanese version)
    • Related Report
      2007 Final Research Report Summary
  • [Presentation] Orthogonal shrinkage methods for nonparametric regression under Gaussian noise2007

    • Author(s)
      Katsuyuki Hagiwara
    • Organizer
      International Conference on Neural Information Processing 2007
    • Place of Presentation
      Kitakyushu, Japan
    • Description
From the Final Research Report Summary (Japanese version)
    • Related Report
      2007 Final Research Report Summary
  • [Presentation] Estimation of the expected prediction error of orthogonal regression with variable components2007

    • Author(s)
Katsuyuki Hagiwara, Hiroshi Ishitani
    • Organizer
      Hawaii International Conference on Statistics, Mathematics and Related Fields
    • Place of Presentation
      Hawaii, USA
    • Description
From the Final Research Report Summary (English version)
    • Related Report
      2007 Final Research Report Summary
  • [Presentation] Orthogonal shrinkage methods for nonparametric regression under Gaussian noise2007

    • Author(s)
Katsuyuki Hagiwara
    • Organizer
      International Conference on Neural Information Processing
    • Place of Presentation
      Kitakyushu, Japan
    • Description
From the Final Research Report Summary (English version)
    • Related Report
      2007 Final Research Report Summary

Published: 2006-04-01   Modified: 2016-04-21  

Powered by NII kakenhi