
1998 Fiscal Year Final Research Report Summary

A Neural Network Model with Interaction between Extraneous Information and Internal Memory

Research Project

Project/Area Number 09650465
Research Category

Grant-in-Aid for Scientific Research (C)

Allocation Type Single-year Grants
Section General
Research Field Measurement and Control Engineering
Research Institution Nagoya Institute of Technology

Principal Investigator

IWATA Akira  Nagoya Inst. of Tech., Faculty of Engineering, Professor (10093098)

Co-Investigator (Kenkyū-buntansha) KUROYANAGI Susumu  Nagoya Inst. of Tech., Faculty of Engineering, Research Associate (10283475)
MATSUO Hiroshi  Nagoya Inst. of Tech., Faculty of Engineering, Associate Professor (00219396)
Project Period (FY) 1997 – 1998
Keywords Neural Network / Sensor Fusion / Pattern Recognition / Hierarchical Structure / Incremental Learning / Symbolic Pattern
Research Abstract

In this study, we focused on the human memory system and constructed an incremental learning neural network model for symbolic patterns. The model has a hierarchical structure in which each layer consists of neuron blocks that recognize local features of the lower layer's output pattern. The model performs bottom-up processing and finally recognizes the input pattern as the integration of these local features. Each neuron block is controlled by a learning trigger signal and can learn the lower layer's patterns independently. Therefore, if an input pattern differs only partially from the patterns already learned, the network learns just those differences incrementally. Likewise, when handling multi-sensory information, the model learns only the partial information it has not yet acquired. In this study, we constructed the incremental learning network model and demonstrated its learning ability through simulations. First, we tested the model with 26 distributed patterns imitating the alphabet characters; the simulation showed that the model could incrementally learn all of the patterns and recognize them after the learning process. We then used 200 patterns imitating kanji characters composed of "hen" (left-radical) and "tsukuri" (right-radical) parts. Each input pattern combined one of about 50 "hen" patterns with one of about 50 "tsukuri" patterns. The simulations showed that the model could learn all 200 patterns while learning only the partial differences among the "hen" and "tsukuri" parts. We presented these results at the IEICE technical group NC meetings (NC98, Mar. 1998).
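
The abstract describes the mechanism only in prose, so the following Python sketch illustrates the idea under stated assumptions; it is not the report's implementation. The class names (NeuronBlock, HierarchicalModel), the cosine-similarity matching with a fixed threshold, and the use of a single block layer whose memory indices form the integrated code are all hypothetical simplifications introduced here to show how a trigger-controlled block can store only the local features it has not seen before.

    # Hypothetical sketch of the trigger-controlled incremental learning idea;
    # names and matching rule are illustrative assumptions, not from the report.
    import numpy as np

    class NeuronBlock:
        """Memorizes local feature vectors; adds a new unit only when no
        stored pattern matches the input closely enough."""
        def __init__(self, threshold=0.9):
            self.memory = []          # stored local feature vectors
            self.threshold = threshold

        def recognize(self, x):
            """Return the index of the best-matching stored pattern, or None."""
            best, best_sim = None, -1.0
            for i, m in enumerate(self.memory):
                sim = np.dot(x, m) / (np.linalg.norm(x) * np.linalg.norm(m) + 1e-12)
                if sim > best_sim:
                    best, best_sim = i, sim
            return best if best_sim >= self.threshold else None

        def learn(self, x, trigger):
            """Incrementally store x only if the trigger fires and x is unknown."""
            idx = self.recognize(x)
            if trigger and idx is None:
                self.memory.append(np.asarray(x, dtype=float))
                idx = len(self.memory) - 1
            return idx

    class HierarchicalModel:
        """Splits the input into local regions handled by independent blocks;
        the tuple of block outputs is the integrated (bottom-up) code."""
        def __init__(self, n_blocks, threshold=0.9):
            self.blocks = [NeuronBlock(threshold) for _ in range(n_blocks)]

        def present(self, pattern, trigger=True):
            # One equal-sized local region per block.
            regions = np.array_split(np.asarray(pattern, dtype=float), len(self.blocks))
            # Each block learns/recognizes its region independently, so only
            # regions that differ from known patterns create new memory units.
            return tuple(b.learn(r, trigger) for b, r in zip(self.blocks, regions))

    # Example: two patterns share a "hen"-like left half and differ only in the
    # right half, so only the right-half block stores a second memory unit.
    model = HierarchicalModel(n_blocks=2)
    left = np.array([1, 0, 1, 0])
    code1 = model.present(np.concatenate([left, [0, 1, 0, 1]]))
    code2 = model.present(np.concatenate([left, [1, 1, 0, 0]]))
    print(code1, code2)                 # (0, 0) and (0, 1)
    print(len(model.blocks[0].memory))  # 1 -> the shared left half is learned once

In this toy run the shared left half is stored once while only the differing right half creates a new memory unit, which mirrors the incremental learning of partial differences described in the abstract.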

  • Research Products

    (4 results)

Publications (4 results)

  • [Publications] Yoshifumi Kitade, Susumu Kuroyanagi, Akira Iwata: "Plastic Memory Model based on the Column Structure in the Cerebrum Cortex" (in Japanese). IEICE Technical Report, Neurocomputing. NC97-136. 269-276 (1998)

    • Description
      From the Japanese-language summary of the Final Research Report
  • [Publications] Nami Ozawa, Susumu Kuroyanagi, Akira Iwata: "Hierarchical Block Structure Increment Learning Neural Network Model for Symbolic Patterns" (in Japanese). IEICE Technical Report, Neurocomputing. NC98-160. 55-62 (1999)

    • Description
      From the Japanese-language summary of the Final Research Report
  • [Publications] Yoshifumi Kitade, Susumu Kuroyanagi, Akira Iwata: "Plastic Memory Model based on the Column Structure in the Cerebrum Cortex". Technical Report of IEICE. NC97-136. 269-276 (1998)

    • Description
      From the English-language summary of the Final Research Report
  • [Publications] Nami Ozawa, Susumu Kuroyanagi, Akira Iwata: "Hierarchical Block Structure Increment Learning Neural Network Model". Technical Report of IEICE. NC98-160. 55-62 (1998)

    • Description
      From the English-language summary of the Final Research Report

Published: 2001-10-23  
