Budget Amount
¥2,600,000 (Direct Cost: ¥2,600,000)
Fiscal Year 2004: ¥1,000,000 (Direct Cost: ¥1,000,000)
Fiscal Year 2003: ¥1,600,000 (Direct Cost: ¥1,600,000)
Research Abstract
For artificial neural networks, pruning is important for reasons of economy, simplification, generalization ability, and so on, and a number of algorithms are available today. Pruning occurs in real brains as well, and is thought to be one form of the Principle of Redundancy Reduction that presumably underlies many higher-order brain functions. From this standpoint, we have considered such inter-related subjects as selective binding, integration, abstraction, and internal model induction. In particular, we have dealt with the following sub-projects using a pruning algorithm called CSDF, devised earlier by the principal investigator.

(1) Analogical Learning/Inference: Analogy has been studied in various areas, including psychology, epistemology, pedagogy, the history of science, and cognitive science. AI approaches also exist, but such rule-based attempts tend to be brute-force and suffer from combinatorial explosion and other problems. Based on CSDF, we have developed a new method called the Abstraction-Based Connectionist Analogy Processor (AB-CAP). As a result of learning with pruning, AB-CAP automatically creates an internal abstraction model and induces proper bindings between concrete and abstract entities. For instance, the internal model acts as an attractor for new relevant data sets, allowing AB-CAP to deal with multiple analogies. These features have been demonstrated by a number of examples.
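CSDF itself is not described in this abstract; as a minimal, generic illustration of the kind of connection pruning referred to above, the following Python/NumPy sketch zeroes out low-magnitude weights (the threshold and weight-matrix shape are hypothetical, and this stands in for, rather than reproduces, CSDF).

import numpy as np

def prune_weights(W, threshold=0.05):
    """Zero out connections whose magnitude falls below the threshold."""
    mask = np.abs(W) >= threshold          # connections that survive pruning
    return W * mask, mask

rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(8, 4))     # toy weight matrix
W_pruned, mask = prune_weights(W)
print(f"surviving connections: {mask.sum()} of {mask.size}")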
(2) Blind Source Separation (BSS): BSS is a new IT innovation by which otherwise unknown signals are extracted from their mixtures observed by sensors. The method developed in this study is fundamentally different from existing ones, which are all based on information/probability theories. It uses an auto-encoder neural network that minimizes the error of the input-output identity mapping, with the sensor signals as the input vector. CSDF is applied to the decoder part, and the hidden nonlinear units that survive the CSDF pruning become the blind signal extractors. Furthermore, the decoder matrix reconstructs the external mixing matrix, so the decoder part can be interpreted as an internal model of the whole external situation. The method has high adaptability and robustness, as has been shown by many simulation examples including real-world audio and visual data.
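A minimal sketch of this auto-encoder arrangement, under stated assumptions (a toy 2-source/3-sensor mixture, tanh hidden units, plain gradient descent, and generic column-norm pruning of the decoder standing in for CSDF; all names and values are hypothetical):

import numpy as np

rng = np.random.default_rng(1)
T = 2000
sources = np.vstack([np.sin(0.05 * np.arange(T)),             # source 1
                     np.sign(np.sin(0.017 * np.arange(T)))])  # source 2
A = rng.normal(size=(3, 2))                                   # unknown mixing matrix
X = A @ sources                                               # sensor signals (3 x T)

n_in, n_hid = 3, 4                                            # more hidden units than sources
W_enc = 0.1 * rng.normal(size=(n_hid, n_in))
W_dec = 0.1 * rng.normal(size=(n_in, n_hid))
lr = 1e-3

for epoch in range(200):
    H = np.tanh(W_enc @ X)                 # hidden (candidate extractor) activities
    X_hat = W_dec @ H                      # reconstruction of the sensor signals
    err = X_hat - X                        # identity-mapping error
    grad_dec = err @ H.T / T
    grad_hid = (W_dec.T @ err) * (1 - H ** 2)
    grad_enc = grad_hid @ X.T / T
    W_dec -= lr * grad_dec
    W_enc -= lr * grad_enc

# Prune decoder columns with small norm; the surviving hidden units
# play the role of the blind signal extractors described in the abstract.
col_norms = np.linalg.norm(W_dec, axis=0)
survivors = col_norms > 0.5 * col_norms.max()   # hypothetical pruning criterion
print("surviving hidden units:", np.where(survivors)[0])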
(3) Application to SOM: Recently, this research group has generalized the paradigm of the self-organizing map (SOM) from vector space to function space; the result is called mnSOM. We have been considering how to incorporate CSDF pruning into the competitive learning associated with mnSOM. Successful results have yet to be obtained for this part.
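For reference, the baseline competitive-learning step of an ordinary vector-space SOM is sketched below (mnSOM itself, which replaces reference vectors with function modules, is not specified in this abstract, and the learning rate, neighbourhood width, and map size are hypothetical):

import numpy as np

def som_step(weights, x, lr=0.1, sigma=1.0):
    """One update: find the best-matching unit, then pull its
    1-D map neighbourhood toward the input x."""
    dists = np.linalg.norm(weights - x, axis=1)          # competition
    bmu = np.argmin(dists)                               # best-matching unit
    idx = np.arange(len(weights))
    h = np.exp(-((idx - bmu) ** 2) / (2 * sigma ** 2))   # neighbourhood function
    return weights + lr * h[:, None] * (x - weights)

rng = np.random.default_rng(2)
W = rng.normal(size=(10, 3))           # 10 map units, 3-dimensional inputs
for x in rng.normal(size=(500, 3)):    # toy training data
    W = som_step(W, x)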