Budget Amount |
¥2,900,000 (Direct Cost: ¥2,900,000)
Fiscal Year 2005: ¥1,100,000 (Direct Cost: ¥1,100,000)
Fiscal Year 2004: ¥1,000,000 (Direct Cost: ¥1,000,000)
Fiscal Year 2003: ¥800,000 (Direct Cost: ¥800,000)
|
Research Abstract |
1. Ensemble learning of K nonlinear perceptrons, whose outputs are determined by sign functions, is analyzed within the framework of online learning and statistical mechanics. Hebbian learning, perceptron learning and AdaTron learning show different affinities for ensemble learning, that is, for "maintaining variety among students." The results show that AdaTron learning is superior to the other two rules with respect to this affinity.
2. Ensemble learning in which the teacher is a committee machine and the students are simple perceptrons is analyzed based on online learning theory and statistical mechanics.
3. Ensemble learning in which the teacher is a non-monotonic perceptron and the students are simple perceptrons is analyzed based on online learning theory and statistical mechanics.
4. The generalization performance of a new student supervised by a moving teacher has been analyzed. A model composed of a fixed true teacher, a moving teacher and a student, all of which are linear perceptrons with noise, has been treated analytically using statistical mechanics. It has been proven that the generalization error of the student can be smaller than that of the moving teacher, even though the student only uses examples from the moving teacher.
5. The generalization performance of a student in a model composed of linear perceptrons, namely a true teacher, ensemble teachers, and the student, has been analyzed. Calculating the generalization error of the student analytically using statistical mechanics in the framework of online learning, it is proven that when the learning rate η < 1, the larger the number K of ensemble teachers and the greater their variety, the smaller the generalization error becomes. When η > 1, these properties are completely reversed.
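
The three update rules compared in item 1 can be sketched in Python as follows. The teacher-student setup, all parameter values, and the particular normalizations are illustrative assumptions, not taken from the study; the update forms are the standard online versions of the Hebbian, perceptron, and AdaTron rules.

```python
import numpy as np

rng = np.random.default_rng(0)
N, K, steps, eta = 1000, 3, 10000, 1.0  # dimension, ensemble size, examples, learning rate (assumed)

B = rng.standard_normal(N)  # fixed teacher weight vector (toy setup, not the paper's model)

def update(J, x, rule):
    # u: student local field, v: teacher local field, both normalized by sqrt(N)
    u = J @ x / np.sqrt(N)
    v = B @ x / np.sqrt(N)
    if rule == "hebbian":          # always move toward the teacher's label
        J += eta * np.sign(v) * x / np.sqrt(N)
    elif rule == "perceptron":     # update only on misclassified examples
        if u * v <= 0:
            J += eta * np.sign(v) * x / np.sqrt(N)
    elif rule == "adatron":        # on errors, step size proportional to |u|
        if u * v <= 0:
            J -= eta * u * x / np.sqrt(N)
    return J

# Independent random initializations give the ensemble its initial variety
students = [rng.standard_normal(N) for _ in range(K)]
for _ in range(steps):
    x = rng.standard_normal(N)
    students = [update(J, x, "adatron") for J in students]

def gen_error(predict, trials=2000):
    # Fraction of fresh random inputs on which predict disagrees with the teacher
    wrong = 0
    for _ in range(trials):
        x = rng.standard_normal(N)
        wrong += predict(x) != np.sign(B @ x)
    return wrong / trials

single = gen_error(lambda x: np.sign(students[0] @ x))                       # one student alone
vote = gen_error(lambda x: np.sign(sum(np.sign(J @ x) for J in students)))   # majority vote
```

Swapping the rule string lets one compare the rules; the study's point is that how much variety a rule preserves among the students governs how much the majority vote gains over a single student.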
|
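
The η-dependence in item 5 can be illustrated numerically. In the sketch below, the model sizes, the noise level of the ensemble teachers, and the cycling scheme over teachers are my assumptions rather than the study's setup: a linear student learns online from K noisy copies of a true teacher, once with η = 0.3 and once with η = 1.5.

```python
import numpy as np

rng = np.random.default_rng(1)
N, K, steps = 500, 5, 20000  # dimension, number of ensemble teachers, examples (assumed)

A = rng.standard_normal(N)                                        # true teacher
teachers = [A + 0.5 * rng.standard_normal(N) for _ in range(K)]   # noisy ensemble teachers

def train(eta):
    """Online linear-perceptron learning from the ensemble teachers, cycled per example."""
    J = np.zeros(N)
    for t in range(steps):
        x = rng.standard_normal(N)
        Bk = teachers[t % K]
        u = J @ x / np.sqrt(N)    # student output
        v = Bk @ x / np.sqrt(N)   # current ensemble teacher's output
        J += eta * (v - u) * x / np.sqrt(N)
    # Mean squared weight distance to the TRUE teacher, a proxy for generalization error
    return np.sum((J - A) ** 2) / N

err_small = train(0.3)  # eta < 1
err_large = train(1.5)  # eta > 1
```

This toy run only exhibits the stationary error growing with η; the reversal proven in the study concerns how the error depends on K and on the variety among the teachers, which would require sweeping those quantities as well.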