
Improving High-Performance Neural Network-based Sequence Classifiers

Research Project

Project/Area Number 26730093
Research Category

Grant-in-Aid for Young Scientists (B)

Allocation Type Multi-year Fund
Research Field Perceptual information processing
Research Institution Kyushu University

Principal Investigator

Frinken Volkmar (FRINKEN Volkmar)  Kyushu University, Graduate School of Information Science and Electrical Engineering, Academic Researcher (70724417)

Project Period (FY) 2014-04-01 – 2016-03-31
Project Status Discontinued (Fiscal Year 2015)
Budget Amount
¥3,770,000 (Direct Cost: ¥2,900,000, Indirect Cost: ¥870,000)
Fiscal Year 2015: ¥1,170,000 (Direct Cost: ¥900,000, Indirect Cost: ¥270,000)
Fiscal Year 2014: ¥2,600,000 (Direct Cost: ¥2,000,000, Indirect Cost: ¥600,000)
Keywords Recurrent Neural Networks / Deep Learning / Time-Series Data Analysis / Network Analysis / Machine Learning / Sequence Classification / Pattern Recognition / LSTM Neural Networks
Outline of Annual Research Achievements

In the project, Improving High-Performance Neural Network-based Sequence Recognizers, I focused my efforts on understanding the internal dynamics of LSTM recurrent neural networks. During the course of the year, I pursued two major research directions. The first considered the node activations of the neural networks during the recognition of a sequence. For simple tasks, such as counting peaks in a sequence, the activation levels of the LSTM nodes can be interpreted fairly easily. Unfortunately, the same cannot be said for more complex networks and tasks, such as handwriting recognition. We observed that only a subset of the nodes in the hidden layer takes on meaningful values, whereas the remaining nodes constantly increase their stored values, regardless of the input. Our hope was to detect the minimal number of meaningful hidden nodes for different tasks and thereby establish a form of intrinsic complexity dimensionality for different databases. Unfortunately, the results were inconclusive.
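The peak-counting behavior described above can be sketched with a single LSTM cell whose cell state accumulates once per peak. This is a minimal, self-contained illustration with hypothetical hand-set weights, not the project's actual trained networks: the input gate opens only for large inputs, while the forget gate stays open so the stored value is preserved between peaks.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class LSTMCell:
    """Minimal single-unit LSTM cell (hypothetical weights, for illustration only)."""
    def __init__(self, wi, wf, wo, wc, bi, bf, bo, bc):
        # input-gate, forget-gate, output-gate, and candidate weights for a scalar input
        self.wi, self.wf, self.wo, self.wc = wi, wf, wo, wc
        self.bi, self.bf, self.bo, self.bc = bi, bf, bo, bc
        self.c = 0.0  # cell state -- the "stored value" inspected in the project

    def step(self, x):
        i = sigmoid(self.wi * x + self.bi)    # input gate: opens only for large x
        f = sigmoid(self.wf * x + self.bf)    # forget gate: kept near 1 (remember)
        o = sigmoid(self.wo * x + self.bo)    # output gate
        g = math.tanh(self.wc * x + self.bc)  # candidate value
        self.c = f * self.c + i * g           # cell-state update
        return o * math.tanh(self.c)          # hidden activation

# Hand-tuned so the cell state increments by roughly 1 per peak.
cell = LSTMCell(wi=10.0, wf=0.0, wo=0.0, wc=5.0,
                bi=-5.0, bf=10.0, bo=10.0, bc=0.0)

sequence = [0.0, 0.0, 1.0, 0.0, 0.0, 1.0, 0.0, 1.0, 0.0]  # three peaks
states = []
for x in sequence:
    cell.step(x)
    states.append(cell.c)

print([round(c, 2) for c in states])  # cell state rises by ~1 at each peak
```

Plotting `states` over time makes the interpretation immediate for this toy task; as the paragraph notes, the same inspection of per-node cell states becomes much harder for large networks trained on handwriting.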

The second research direction was to increase the number of layers in LSTM-based handwriting recognition systems in a systematic way. We carefully analyzed various topologies with different numbers of hidden layers and different node activation functions. The results indicate that these meta-parameters have a significant influence on recognition performance. Our findings have been published and can help to improve future systems.
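One concrete aspect of comparing such topologies is how quickly the parameter count grows with depth. The sketch below uses the standard LSTM parameterization (four gates, each with input weights, recurrent weights, and a bias); the feature dimension 9 and layer size 100 are hypothetical values chosen for illustration, not the configurations reported in the publication.

```python
def lstm_layer_params(input_size, hidden_size, bidirectional=True):
    """Parameter count of one (B)LSTM layer: four gates, each with
    input weights, recurrent weights, and one bias per hidden unit."""
    per_direction = 4 * hidden_size * (input_size + hidden_size + 1)
    return 2 * per_direction if bidirectional else per_direction

def stacked_blstm_params(input_size, hidden_size, num_layers):
    """Total parameters of a stack of BLSTM layers of equal width."""
    total = lstm_layer_params(input_size, hidden_size)
    # deeper layers receive the concatenated forward/backward outputs
    for _ in range(num_layers - 1):
        total += lstm_layer_params(2 * hidden_size, hidden_size)
    return total

for depth in (1, 2, 3):
    print(depth, stacked_blstm_params(input_size=9, hidden_size=100, num_layers=depth))
```

Because every added layer takes the doubled (forward plus backward) output of the layer below as its input, depth increases the parameter count faster than widening a single layer would, which is one reason a systematic topology comparison is worthwhile.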

Report

(2 results)
  • 2015 Annual Research Report
  • 2014 Research-status Report
  • Research Products

    (2 results)

All 2015

All Journal Article (1 result) (of which Peer Reviewed: 1 result, Acknowledgement Compliant: 1 result) Presentation (1 result)

  • [Journal Article] Deep BLSTM Neural Networks for Unconstrained Continuous Handwritten Text Recognition (2015)

    • Author(s)
      Volkmar Frinken and Seiichi Uchida
    • Journal Title

      Proceedings of 13th Int. Conf. Document Analysis and Recognition

      Volume: 1 Pages: 1-5

    • Related Report
      2015 Annual Research Report
    • Peer Reviewed / Acknowledgement Compliant
  • [Presentation] Deep BLSTM Neural Networks for Unconstrained Continuous Handwritten Text Recognition (2015)

    • Author(s)
      Volkmar Frinken
    • Organizer
      Int'l Conf. on Document Analysis and Recognition (ICDAR)
    • Place of Presentation
      Gammarth, Tunisia
    • Year and Date
      2015-08-23 – 2015-08-26
    • Related Report
      2014 Research-status Report

Published: 2014-04-04   Modified: 2017-01-06  
