Budget Amount
¥3,900,000 (Direct Cost: ¥3,900,000)
Fiscal Year 2002: ¥600,000 (Direct Cost: ¥600,000)
Fiscal Year 2001: ¥500,000 (Direct Cost: ¥500,000)
Fiscal Year 2000: ¥2,800,000 (Direct Cost: ¥2,800,000)
Research Abstract
Many learning machines are non-identifiable, for example artificial neural networks, normal mixtures, Bayesian networks, and reduced-rank approximations. In such learning machines the mapping from parameters to probability distributions is not one-to-one, so the Fisher information matrix is singular. It should be emphasized that no previously existing learning theory could be applied to such singular learning machines. In this research, we developed a new learning theory that enables us to clarify the generalization errors of these machines. The main result is as follows. The generalization error, defined as the average Kullback information from the true distribution to the estimated distribution, admits the asymptotic expansion "a log n - (b-1) log log n + c", where a, b, and c are constants. The important constants a and b are determined by the largest pole of the zeta function of the Kullback information and the prior, and by its order, respectively. They can be calculated by the blow-up technique, which is well known from the resolution of singularities in algebraic geometry.
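As a concrete illustration of the zeta-function recipe described above, the following Python/SymPy snippet (a minimal sketch of my own, not code from this project) computes the constants a and b for a toy singular model. The Kullback information K(w1, w2) = w1^2 * w2^2 and the uniform prior on the unit square are illustrative assumptions chosen so that the integral has a closed form.

```python
import sympy as sp

z = sp.symbols('z')
w1, w2 = sp.symbols('w1 w2', positive=True)

# Toy Kullback information with a normal-crossing singularity at the
# origin: K vanishes on the lines w1 = 0 and w2 = 0, so the Fisher
# information matrix degenerates there and the model is singular.
K = w1**2 * w2**2

# Since w1, w2 > 0 we may write K(w)^z = w1^(2z) * w2^(2z) directly.
integrand = w1**(2*z) * w2**(2*z)

# zeta(z) = integral of K(w)^z * prior(w) over the parameter space,
# here with a uniform prior on [0, 1]^2.  conds='none' drops the
# convergence conditions and keeps the closed form, which continues
# analytically to the whole complex plane.
zeta = sp.integrate(integrand, (w1, 0, 1), (w2, 0, 1), conds='none')
zeta = sp.together(zeta)                        # -> 1/(2*z + 1)**2

den = sp.denom(zeta)                            # (2*z + 1)**2
largest_pole = max(sp.solve(sp.Eq(den, 0), z))  # z = -1/2
order = sp.Poly(den, z).degree()                # multiplicity m = 2 here

# In the expansion a*log n - (b - 1)*log(log n) + c this toy model
# therefore gives a = 1/2 and b = 2.
a, b = -largest_pole, order
print('zeta(z) =', zeta)
print('a =', a, 'b =', b)
```

For this normal-crossing example the largest pole sits at z = -1/2 with order 2, so a = 1/2 and b = 2. For realistic models such as neural networks or normal mixtures, K(w) does not factor so conveniently, and the pole is located by first resolving the singularity with blow-ups rather than by direct integration; that is the role the algebraic-geometric technique plays in the theory.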