
2007 Fiscal Year Final Research Report Summary

A study of the statistical properties of singular models such as the multi-layer perceptron

Research Project

Project/Area Number 18500171
Research Category

Grant-in-Aid for Scientific Research (C)

Allocation Type Single-year Grants
Section General
Research Field Sensitivity informatics/Soft computing
Research Institution Mie University

Principal Investigator

HAGIWARA Katsuyuki, Mie University, Faculty of Education, Associate Professor (60273348)

Project Period (FY) 2006 – 2007
Keywords singular model / multi-layer perceptron / model selection / training error / generalization error / shrinkage estimator / over-fitting / extreme value theory
Research Abstract

The multi-layer perceptron is known to be a singular model, one whose Fisher information matrix can be singular. The singularity arises from parametric basis functions whose shapes vary with the parameters. To reveal the statistical properties of singular models, this research focused on such variable basis functions. We derived an upper bound on the degree of over-fitting under a restriction on the basis function outputs; in a multi-layer perceptron, this restriction can be realized by restricting the input weight space. Applying this result, we showed that, in regression with a single Gaussian unit, training frequently yields a very small value for the width parameter. We also derived the expected training error and expected generalization error of a learning machine whose basis functions are chosen from a finite set of orthogonal functions. This clarified that AIC-type model selection criteria for machines with variable basis functions require information about the target function; we resolved this problem by applying a shrinkage method. From the viewpoint of variable basis functions, we furthermore proposed a shrinkage method for nonparametric regression that produces a machine with low generalization error in less computational time. The results obtained in this research help to clarify the statistical properties of machines with variable basis functions, and hence of singular models such as the multi-layer perceptron.
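The finding that training frequently drives the Gaussian width parameter toward a very small value can be illustrated with a minimal sketch (a toy constructed for this summary, not the experiment in the report; the sample size, noise level, and grid are assumptions): with a single Gaussian unit, centring the unit on one noisy sample and shrinking its width always lowers the training error, so least-squares training is pulled toward degenerate, over-fitted widths.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setting: the true regression function is 0, so the observations
# y are pure Gaussian noise on an equispaced grid (all values assumed).
n = 64
x = np.arange(n) / n
y = rng.normal(scale=0.5, size=n)

def train_error(a, c, w):
    """Mean squared training error of one Gaussian unit a*exp(-(x-c)^2/(2w^2))."""
    pred = a * np.exp(-((x - c) ** 2) / (2.0 * w ** 2))
    return np.mean((y - pred) ** 2)

# A sensible predictor here is the true function, identically 0.
err_flat = np.mean(y ** 2)

# Degenerate fit: centre the unit on the sample with the largest
# magnitude and make the width tiny. The near-delta basis function
# interpolates that one point and is essentially 0 at every other grid
# point, so the training error strictly decreases -- the unit has
# memorised one noise sample rather than learned structure.
i = int(np.argmax(np.abs(y)))
err_spike = train_error(a=y[i], c=x[i], w=1e-4)

print(f"flat predictor: {err_flat:.4f}  tiny-width Gaussian: {err_spike:.4f}")
```

Because the gain from interpolating a point never vanishes while the cost at other samples does, the training error alone cannot penalise such widths; this is the kind of over-fitting that the report's upper bound (under restricted basis function outputs) and shrinkage methods are designed to control.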

  • Research Products

    (10 results)


All Journal Article (6 results) (of which Peer Reviewed: 3 results) Presentation (4 results)

  • [Journal Article] Relation between weight size and degree of over-fitting in neural network regression2008

    • Author(s)
      Katsuyuki Hagiwara, Kenji Fukumizu
    • Journal Title

      Neural Networks 21

      Pages: 48-58

    • Description
From the Final Research Report Summary (Japanese)
    • Peer Reviewed
  • [Journal Article] Relation between weight size and degree of over-fitting in neural network regression2008

    • Author(s)
Katsuyuki Hagiwara, Kenji Fukumizu
    • Journal Title

      Neural Networks 21

      Pages: 48-58

    • Description
From the Final Research Report Summary (English)
  • [Journal Article] On the expected prediction error of orthogonal regression with variable components2006

    • Author(s)
      Katsuyuki Hagiwara, Hiroshi Ishitani
    • Journal Title

      IEICE Trans. Fundamentals 89-A

      Pages: 3699-3709

    • Description
From the Final Research Report Summary (Japanese)
    • Peer Reviewed
  • [Journal Article] On the expected prediction error of orthogonal regression with variable components2006

    • Author(s)
Katsuyuki Hagiwara, Hiroshi Ishitani
    • Journal Title

      IEICE Trans. Fundamentals 89-A

      Pages: 3699-3709

    • Description
From the Final Research Report Summary (English)
  • [Journal Article] Orthogonal shrinkage methods for nonparametric regression under Gaussian noise

    • Author(s)
      Katsuyuki Hagiwara
    • Journal Title

Proceedings of ICONIP2007, Lecture Notes in Computer Science, Springer (in press)

    • Description
From the Final Research Report Summary (Japanese)
    • Peer Reviewed
  • [Journal Article] Orthogonal shrinkage methods for nonparametric regression under Gaussian noise

    • Author(s)
Katsuyuki Hagiwara
    • Journal Title

Proceedings of International Conference on Neural Information Processing 2007, Lecture Notes in Computer Science, Springer (to appear)

    • Description
From the Final Research Report Summary (English)
  • [Presentation] Orthogonal shrinkage methods for nonparametric regression under Gaussian noise2007

    • Author(s)
      Katsuyuki Hagiwara
    • Organizer
      International Conference on Neural Information Processing 2007
    • Place of Presentation
      Kitakyushu, Japan
    • Year and Date
      November 2007
    • Description
      From the Final Research Report Summary (Japanese)
  • [Presentation] Orthogonal shrinkage methods for nonparametric regression under Gaussian noise2007

    • Author(s)
Katsuyuki Hagiwara
    • Organizer
      International Conference on Neural Information Processing
    • Place of Presentation
      Kitakyushu, Japan
    • Year and Date
      November 2007
    • Description
      From the Final Research Report Summary (English)
  • [Presentation] Estimation of the expected prediction error of orthogonal regression with variable components2007

    • Author(s)
      Katsuyuki Hagiwara, Hiroshi Ishitani
    • Organizer
      2007 Hawaii International Conference on Statistics, Mathematics and Related Fields
    • Place of Presentation
      Hawaii, USA
    • Year and Date
      January 2007
    • Description
      From the Final Research Report Summary (Japanese)
  • [Presentation] Estimation of the expected prediction error of orthogonal regression with variable components2007

    • Author(s)
Katsuyuki Hagiwara, Hiroshi Ishitani
    • Organizer
      Hawaii International Conference on Statistics, Mathematics and Related Fields
    • Place of Presentation
      Hawaii, USA
    • Year and Date
      January 2007
    • Description
      From the Final Research Report Summary (English)


Published: 2010-02-04  

Powered by NII kakenhi