Learning of Bayesian Neural Networks and Their Applications to Hidden Markov Chain
Project/Area Number | 17500153 |
Research Category | Grant-in-Aid for Scientific Research (C) |
Allocation Type | Single-year Grants |
Section | General |
Research Field | Sensitivity informatics/Soft computing |
Research Institution | Aichi Gakuin University |
Principal Investigator | ITO Yoshifusa, Aichi Gakuin University, Department of Policy Studies, Lecturer (10022774) |
Co-Investigator (Kenkyū-buntansha) | ITO Yoshifusa, Aichi Gakuin University, Department of Policy Studies, Part-time Lecturer (17500153) |
Project Period (FY) | 2005 – 2007 |
Project Status | Completed (Fiscal Year 2007) |
Budget Amount | ¥3,410,000 (Direct Cost: ¥3,200,000, Indirect Cost: ¥210,000)
Fiscal Year 2007: ¥910,000 (Direct Cost: ¥700,000, Indirect Cost: ¥210,000)
Fiscal Year 2006: ¥700,000 (Direct Cost: ¥700,000)
Fiscal Year 2005: ¥1,800,000 (Direct Cost: ¥1,800,000) |
Keywords | Neural network / Hidden Markov chain / Bayesian decision / Bayesian discriminant function / three-layer neural network / learning / Bayesian neural network / discriminant function / function approximation / polynomial |
Research Abstract |
The goal of this research was to develop a neural network that can learn the Bayesian discriminant function and to use it to estimate the hidden Markov chain. The results obtained during the period supported by the Grant-in-Aid for Scientific Research (C) can be summarized in three points.

1. A three-layer neural network that can learn a Bayesian discriminant function had been proposed before the present work started, but it had difficulty in learning. Since a network with fewer units is generally believed to learn better, we first tried to reduce the number of hidden units, and we proved theoretically that the small number of hidden units in our network is in fact the minimum.

2. When the probability distributions are simple, this network with the minimum number of hidden units can be used to estimate the hidden Markov chain. Equipped with parameter units, it can simultaneously learn several Bayesian discriminant functions, each corresponding to one of the states of the hidden Markov chain.

3. In general cases, however, the network cannot learn the discriminant functions, because learning with dichotomous teacher signals is difficult. We therefore constructed a new type of neural network in which the degree of freedom of the hidden units is limited. Although this inevitably increases the number of hidden units, the network performs better. The theory is stated in a paper now in press, and the simulation results have been presented at one domestic and several international conferences.

Thus, when the probability distributions are simple, the network can estimate the hidden Markov chain; even in general cases, the recent results are promising.
|
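The mechanism underlying the abstract is that a three-layer network trained on dichotomous (0/1) teacher signals with a squared-error criterion approximates the posterior probability of a class given the observation, which is the Bayesian discriminant function. The following is a minimal sketch of that mechanism, not the project's network: the toy Gaussian class-conditional distributions, the choice of two hidden units, and all training settings are illustrative assumptions.

```python
# Minimal sketch (illustrative assumptions, not the project's network):
# a three-layer network trained by least squares on dichotomous (0/1)
# teacher signals, whose output then approximates the posterior
# P(class = 1 | x), i.e. the Bayesian discriminant function.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two equiprobable classes, class 0 ~ N(-1, 1), class 1 ~ N(+1, 1).
n = 4000
t = rng.integers(0, 2, n).astype(float)          # dichotomous teacher signals
x = rng.normal(loc=2.0 * t - 1.0, scale=1.0)     # one-dimensional observations


def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))


# Three-layer network: 1 input -> H sigmoid hidden units -> 1 sigmoid output.
H = 2                                            # illustrative choice, not the proven minimum
W1 = rng.normal(scale=0.5, size=(H, 1)); b1 = np.zeros(H)
W2 = rng.normal(scale=0.5, size=(1, H)); b2 = np.zeros(1)

lr = 0.5
for _ in range(5000):                            # full-batch gradient descent on MSE
    h = sigmoid(x[:, None] @ W1.T + b1)          # hidden activations, shape (n, H)
    y = sigmoid(h @ W2.T + b2).ravel()           # network output, shape (n,)
    dy = (y - t) * y * (1.0 - y)                 # output-layer error term
    gW2 = dy[None, :] @ h / n
    gb2 = np.array([dy.mean()])
    dh = dy[:, None] * W2 * h * (1.0 - h)        # hidden-layer error term, shape (n, H)
    gW1 = dh.T @ x[:, None] / n
    gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

# Compare the learned output with the analytic posterior P(class = 1 | x),
# which for this toy model equals sigmoid(2x).
xs = np.linspace(-3.0, 3.0, 7)
net = sigmoid(sigmoid(xs[:, None] @ W1.T + b1) @ W2.T + b2).ravel()
for xv, yv, pv in zip(xs, net, sigmoid(2.0 * xs)):
    print(f"x = {xv:+.1f}   network = {yv:.3f}   true posterior = {pv:.3f}")
```

In the hidden-Markov-chain setting described in point 2, one such discriminant output would be learned per state; the sketch above shows only the single-output case.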