Budget Amount |
¥2,500,000 (Direct Cost: ¥2,200,000, Indirect Cost: ¥300,000)
Fiscal Year 2007: ¥1,300,000 (Direct Cost: ¥1,000,000, Indirect Cost: ¥300,000)
Fiscal Year 2006: ¥1,200,000 (Direct Cost: ¥1,200,000)
|
Research Abstract |
The multi-layer perceptron is known to be a singular model, one in which the Fisher information matrix can be singular. This singularity arises from the parametric basis functions, whose shapes vary with the parameters. To reveal the statistical properties of singular models, this research focused on such variable basis functions. We derived an upper bound on the degree of over-fitting under a restriction on the basis function outputs; in a multi-layer perceptron, for example, the restriction can be realized by restricting the input weight space. Applying this result, we showed that, in regression with a single Gaussian unit, training frequently yields a very small value of the width parameter. We also derived the expectations of the training error and the generalization error of a learning machine whose basis functions are chosen from a finite set of orthogonal functions. This clarifies that AIC-type model selection criteria for machines with variable basis functions require information about the target function. We solved this problem by applying a shrinkage method. Furthermore, from the viewpoint of variable basis functions, we proposed a shrinkage method for nonparametric regression; it produces a machine with low generalization error in less computation time. The results obtained in this research help clarify the statistical properties of machines with variable basis functions, and hence of singular models such as multi-layer perceptrons.
|
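The singularity mentioned in the abstract can be illustrated with a minimal sketch that is not part of the original study: for a one-hidden-unit network f(x) = w·tanh(ax + b) with Gaussian output noise, the Fisher information matrix is proportional to E[∇f ∇f^T] over the parameters (a, b, w), and it loses rank when the output weight w is zero, because the shape parameters a and b then no longer affect the output. The network, inputs, and parameter values below are assumptions chosen purely for illustration.

```python
# Minimal illustration (assumed example, not the authors' method) of why the
# Fisher information matrix of a multi-layer perceptron can be singular.
# Model: f(x; a, b, w) = w * tanh(a*x + b) with Gaussian output noise, so the
# Fisher information is proportional to E[ grad f(x) grad f(x)^T ].
import numpy as np

def grad_f(x, a, b, w):
    """Gradient of f(x) = w * tanh(a*x + b) with respect to (a, b, w)."""
    t = np.tanh(a * x + b)
    sech2 = 1.0 - t ** 2              # derivative of tanh
    return np.array([w * sech2 * x,   # df/da
                     w * sech2,       # df/db
                     t])              # df/dw

def empirical_fisher(xs, a, b, w):
    """Empirical estimate of E[grad f grad f^T] over the inputs xs
    (the Fisher information up to the noise-variance factor)."""
    G = np.zeros((3, 3))
    for x in xs:
        g = grad_f(x, a, b, w)
        G += np.outer(g, g)
    return G / len(xs)

rng = np.random.default_rng(0)
xs = rng.normal(size=1000)

F_regular  = empirical_fisher(xs, a=1.0, b=0.5, w=2.0)
F_singular = empirical_fisher(xs, a=1.0, b=0.5, w=0.0)  # output weight = 0

print("rank at w = 2.0:", np.linalg.matrix_rank(F_regular))   # 3 (full rank)
print("rank at w = 0.0:", np.linalg.matrix_rank(F_singular))  # 1 (singular)
```

At w = 0 the gradients with respect to a and b vanish for every input, so the empirical Fisher matrix collapses to rank one; this is the kind of parameter configuration that makes the multi-layer perceptron a singular model.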