Budget Amount
¥3,100,000 (Direct Cost: ¥3,100,000)
Fiscal Year 1998: ¥500,000 (Direct Cost: ¥500,000)
Fiscal Year 1997: ¥2,600,000 (Direct Cost: ¥2,600,000)
Research Abstract
The author proposed artificial neural network models integrating two important paradigms, "mapping" and "relaxation": in these models, multiple Hopfield networks are coupled through multi-layered internetworks whose non-linear hidden neural cells realize a non-linear mapping. The models were applied to tasks such as associative memory, and it was shown that the heterogeneous architecture, built by coupling module sub-networks, performs well in both storing and recall. In this class of tasks, the positions of the memorized data (attractors) are given in advance, so introducing a mapping function into the framework of relaxation dynamics can be expected to be effective to some extent. However, in a different class of tasks such as optimization problems, the positions of the fixed-point attractors are not given; finding them is itself the task. This makes it difficult to employ effectively the mapping function realized by the multi-layered internetworks with non-linear hidden neural cells. Therefore, the CCHN models cannot be applied to optimization tasks without modification, and another approach is needed to address local-minimum problems such as the dependence on initial cell-states.

In this research, we focused on "redundancy" in the neural architecture and examined the relationship between redundancy and neural activity, using a simple module-based neural network as an example. Through simulation and analytical studies, we obtained the following result: as the number of modules increases (i.e., as the redundancy of the neural architecture is enhanced), the energy surface improves in the sense that the basin of the global minimum is enlarged while those of the local minima shrink. As a result, the recalled solution becomes less dependent on the initial cell-states.
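To make the relaxation dynamics and basin-size notions above concrete, the following is a minimal sketch of a plain discrete Hopfield network, not the author's CCHN or module-based model: it estimates a pattern's basin of attraction as the fraction of random initial cell-states whose relaxation ends at that pattern. All names (energy, relax, basin_fraction) and parameter values are illustrative assumptions.

```python
import numpy as np

def energy(W, s):
    """Standard Hopfield energy E(s) = -(1/2) s^T W s, states s_i in {-1, +1}."""
    return -0.5 * s @ W @ s

def relax(W, s, max_sweeps=100, rng=None):
    """Asynchronous relaxation under the sign rule; with zero self-connections
    each flip never raises the energy, so a fixed-point attractor is reached."""
    rng = np.random.default_rng(rng)
    s = s.copy()
    for _ in range(max_sweeps):
        changed = False
        for i in rng.permutation(len(s)):
            new_si = 1 if W[i] @ s >= 0 else -1
            if new_si != s[i]:
                s[i], changed = new_si, True
        if not changed:  # no unit wants to flip: a (local or global) minimum
            break
    return s

def basin_fraction(W, target, n_trials=500, seed=0):
    """Estimate the basin size of `target`: the fraction of random initial
    cell-states whose relaxation ends at `target` or its sign-flipped twin."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(n_trials):
        s0 = rng.choice([-1, 1], size=len(target))
        sf = relax(W, s0, rng=rng)
        hits += np.array_equal(sf, target) or np.array_equal(sf, -target)
    return hits / n_trials

# Hebbian weights storing two random patterns: both become attractors, while
# mixture states form spurious local minima with much smaller basins.
rng = np.random.default_rng(1)
p1, p2 = rng.choice([-1, 1], size=(2, 32))
W = np.outer(p1, p1) + np.outer(p2, p2)
np.fill_diagonal(W, 0)
print("basin fraction of p1:", basin_fraction(W, p1))
print("basin fraction of p2:", basin_fraction(W, p2))
```

Read in these terms, the reported result says that adding redundant modules reshapes the energy surface so that the basin fraction of the global minimum grows while those of the local minima shrink, weakening the dependence on the initial cell-states.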