2015 Fiscal Year Annual Research Report
Improving High-Performance Neural Network-based Sequence Classifiers
Project/Area Number | 26730093
Research Institution | Kyushu University
Principal Investigator | Frinken Volkmar, Kyushu University, Graduate School of Information Science and Electrical Engineering, Academic Researcher (70724417)
Project Period (FY) | 2014-04-01 – 2016-03-31
Keywords | Recurrent neural networks / Deep learning / Time-series data analysis / Network analysis
Outline of Annual Research Achievements |
In this project, Improving High-Performance Neural Network-based Sequence Recognizers, I focused my efforts on understanding the internal dynamics of LSTM recurrent neural networks. Over the course of the year, I pursued two major research directions. The first considered the node activations of the neural network during the recognition of a sequence. For simple tasks, such as counting peaks in a sequence, the activation levels of the LSTM nodes can be interpreted fairly easily. Unfortunately, the same cannot be said for more complex networks and tasks, such as handwriting recognition. We observed that only a subset of the nodes in the hidden layer takes on meaningful values, whereas the remaining nodes constantly increase their stored value regardless of the input. Our hope was to detect the minimal number of meaningful hidden nodes for different tasks and thereby establish a form of intrinsic complexity dimensionality for different databases. Unfortunately, the results were inconclusive.
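The kind of per-node inspection described above can be sketched with a minimal NumPy LSTM cell. This is an illustrative toy, not the project's actual recognizer: the weights are random, the input is a hand-made "peaks" sequence, and all names (`lstm_step`, `record_activations`) are hypothetical.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    """One step of a standard LSTM cell.
    Gate order in the stacked weights: input, forget, cell, output."""
    n = h.shape[0]
    z = W @ x + U @ h + b          # shape (4n,)
    i = sigmoid(z[0:n])
    f = sigmoid(z[n:2 * n])
    g = np.tanh(z[2 * n:3 * n])
    o = sigmoid(z[3 * n:4 * n])
    c_new = f * c + i * g
    h_new = o * np.tanh(c_new)
    return h_new, c_new

def record_activations(seq, W, U, b, n_hidden):
    """Run the cell over a 1-D input sequence and record the hidden
    state and cell state of every node at every time step."""
    h = np.zeros(n_hidden)
    c = np.zeros(n_hidden)
    hs, cs = [], []
    for x_t in seq:
        h, c = lstm_step(np.array([x_t]), h, c, W, U, b)
        hs.append(h.copy())
        cs.append(c.copy())
    return np.array(hs), np.array(cs)   # each of shape (T, n_hidden)

rng = np.random.default_rng(0)
n_hidden, n_in = 8, 1
W = rng.normal(scale=0.5, size=(4 * n_hidden, n_in))
U = rng.normal(scale=0.5, size=(4 * n_hidden, n_hidden))
b = np.zeros(4 * n_hidden)

# A toy "counting peaks" input: plotting columns of hs over time shows
# which hidden nodes respond to the peaks and which drift regardless
# of the input.
seq = [0, 0, 1, 0, 0, 1, 0, 1, 0, 0]
hs, cs = record_activations(seq, W, U, b, n_hidden)
print(hs.shape)  # (10, 8)
```

Inspecting `cs` rather than `hs` is what reveals the "constantly increasing stored value" behaviour mentioned above, since the cell state is not squashed by the output gate.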
The second research direction was to increase the number of layers in LSTM-based handwriting recognition systems in a systematic way. We carefully analyzed various topologies with different numbers of hidden layers and different node activation functions. The results indicate that these meta-parameters have a significant influence on recognition performance. Our findings have been published and can help improve future systems.
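A topology sweep of the kind described here, varying only the number of stacked hidden layers, can be sketched as follows. This is a minimal NumPy forward pass with random weights, assumed function names (`make_layer`, `run_layer`, `run_stack`), and arbitrary sizes; the actual published experiments used trained handwriting recognizers, not this toy.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def make_layer(n_in, n_hidden, rng):
    """Randomly initialised parameters of one LSTM layer
    (gate order: input, forget, cell, output)."""
    return {
        "W": rng.normal(scale=0.5, size=(4 * n_hidden, n_in)),
        "U": rng.normal(scale=0.5, size=(4 * n_hidden, n_hidden)),
        "b": np.zeros(4 * n_hidden),
        "n": n_hidden,
    }

def run_layer(xs, p):
    """Run one LSTM layer over a (T, n_in) input; return (T, n_hidden)."""
    n = p["n"]
    h = np.zeros(n)
    c = np.zeros(n)
    out = []
    for x in xs:
        z = p["W"] @ x + p["U"] @ h + p["b"]
        i, f = sigmoid(z[:n]), sigmoid(z[n:2 * n])
        g, o = np.tanh(z[2 * n:3 * n]), sigmoid(z[3 * n:])
        c = f * c + i * g
        h = o * np.tanh(c)
        out.append(h.copy())
    return np.array(out)

def run_stack(xs, layers):
    """Feed each layer's hidden-state sequence into the next layer."""
    for p in layers:
        xs = run_layer(xs, p)
    return xs

rng = np.random.default_rng(1)
T, n_in, n_hidden = 20, 3, 16
xs = rng.normal(size=(T, n_in))

# Sweep over the number of stacked hidden layers; in a real study each
# depth would be trained and evaluated on the recognition task.
for depth in (1, 2, 3):
    layers = [make_layer(n_in if d == 0 else n_hidden, n_hidden, rng)
              for d in range(depth)]
    ys = run_stack(xs, layers)
    print(depth, ys.shape)  # hidden-state sequence of the top layer
```

Swapping the `np.tanh` calls in `run_layer` for other squashing functions is the analogue of the activation-function comparison mentioned above.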