
Theoretical models for learning and memory in the microbrain system

Research Project

Project/Area Number 11168224
Research Category

Grant-in-Aid for Scientific Research on Priority Areas (A)

Allocation Type Single-year Grants
Review Section Biological Sciences
Research Institution Kyushu Institute of Technology

Principal Investigator

MATSUOKA Kiyotoshi  Graduate School of Life Science and Systems Engineering, Professor (90110840)

Co-Investigator (Kenkyū-buntansha) 徳成 剛  Kyushu Institute of Technology, Faculty of Engineering, Research Associate (00237075)
黒木 秀一  Kyushu Institute of Technology, Faculty of Engineering, Associate Professor (40178124)
Project Period (FY) 1999 – 2001
Project Status Completed (Fiscal Year 2002)
Budget Amount
¥4,200,000 (Direct Cost: ¥4,200,000)
Fiscal Year 2001: ¥1,500,000 (Direct Cost: ¥1,500,000)
Fiscal Year 2000: ¥1,500,000 (Direct Cost: ¥1,500,000)
Fiscal Year 1999: ¥1,200,000 (Direct Cost: ¥1,200,000)
Keywords learning / memory / Hebbian learning / anti-Hebbian learning / neural networks / rhythm / mutual inhibition
Research Abstract

The purpose of this study was to investigate learning and memory mechanisms in lower animals by means of mathematical models. In particular, we focused on Hebb-type learning.
A typical model of Hebbian / anti-Hebbian learning is as follows: when the pre- and post-synaptic activations occur at the same time, the relevant synaptic weight increases / decreases. According to recent neurophysiological findings, however, the change in the synaptic weight depends on the relative timing of the pre- and post-synaptic activities. Namely, if the pre-synaptic activation precedes the post-synaptic activation, LTP is induced; if the order is reversed, LTD is induced. This asymmetric feature of synaptic plasticity must play a very important role in the learning of temporal patterns in animals.
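To make the timing-dependent rule concrete, the following is a minimal illustrative sketch of such an asymmetric (STDP-style) weight update with exponential learning windows. The amplitudes, time constants (A_plus, A_minus, tau_plus, tau_minus), and spike times are hypothetical values chosen for illustration, not parameters from this report.

```python
import numpy as np

# Hypothetical learning-window parameters (not taken from the report).
A_plus, A_minus = 0.01, 0.012     # potentiation / depression amplitudes
tau_plus, tau_minus = 20.0, 20.0  # window time constants (ms)

def stdp_dw(delta_t):
    """Weight change for one pre/post spike pair; delta_t = t_post - t_pre (ms).

    Pre before post (delta_t > 0) -> LTP (positive change);
    post before pre (delta_t < 0) -> LTD (negative change).
    """
    if delta_t >= 0:
        return A_plus * np.exp(-delta_t / tau_plus)
    return -A_minus * np.exp(delta_t / tau_minus)

# Toy example: accumulate the pairwise contributions of a few spikes.
pre_spikes = [10.0, 50.0, 90.0]    # ms (made-up data)
post_spikes = [15.0, 45.0, 100.0]  # ms (made-up data)
w = 0.5                            # initial synaptic weight
for t_pre in pre_spikes:
    for t_post in post_spikes:
        w += stdp_dw(t_post - t_pre)
print(f"updated weight: {w:.4f}")
```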
In this study we first built a basic mathematical model of temporally asymmetric Hebbian learning. Using this basic model, we then devised more elaborate models that explain several neuronal phenomena, for example, (1) the oscillatory behavior of a neural circuit, (2) neural integrators, and (3) neural memory for certain periodic stimuli, as seen in the visual system of the crayfish. Through this modeling, we were able to clarify some functional mechanisms of the temporally asymmetric Hebbian learning rule.
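As one illustration of item (1), the oscillatory behavior of a neural circuit, here is a minimal sketch of a generic two-unit mutual-inhibition oscillator with adaptation (cf. the keywords "rhythm" and "mutual inhibition"). The equations, parameter values, and variable names are illustrative assumptions and do not reproduce the specific models developed in the project.

```python
import numpy as np

# Assumed parameters for a generic half-center (mutual-inhibition) oscillator.
tau, tau_a = 0.1, 0.2               # membrane and adaptation time constants (s)
beta, w_inh, drive = 2.5, 2.0, 1.0  # adaptation gain, cross-inhibition, tonic input

def step(x, a, dt=0.001):
    """One Euler step; x = membrane states, a = adaptation states (2 units each)."""
    y = np.maximum(x, 0.0)                                 # rectified firing rates
    dx = (-x - w_inh * y[::-1] - beta * a + drive) / tau   # each unit inhibits the other
    da = (-a + y) / tau_a                                  # slow adaptation enables switching
    return x + dt * dx, a + dt * da

x, a = np.array([0.1, 0.0]), np.zeros(2)
rates = []
for _ in range(5000):                # simulate 5 s of alternating activity
    x, a = step(x, a)
    rates.append(np.maximum(x, 0.0).copy())
rates = np.array(rates)
print("mean firing rates of the two units:", rates.mean(axis=0))
```

With these assumed values, the two units alternate: the active unit suppresses its partner until adaptation weakens it, producing a sustained rhythm without any periodic input.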

Report

(4 results)
  • 2002 Final Research Report Summary
  • 2001 Annual Research Report
  • 2000 Annual Research Report
  • 1999 Annual Research Report

Research Products

(9 results)

All Publications (9 results)

  • [Publications] Matsuoka, K.: "A general theory of a class of linear neural nets for principal and minor component analysis"Artificial Life and Robotics. Vol.3. 246-254 (1999)

    • Description
      From the "Summary of Research Achievements Report" (Japanese version)
    • Related Report
      2002 Final Research Report Summary
  • [Publications] Matsuoka, K.: "A model analysis of temporally asymmetric Hebbian learning"Recent Advances in Simulation, Computational Methods and Soft Computing. 193-198 (2002)

    • Description
      From the "Summary of Research Achievements Report" (Japanese version)
    • Related Report
      2002 Final Research Report Summary
  • [Publications] Matsuoka, K.: "A general theory of a class of linear neural nets for principal and minor component analysis"Artificial Life and Robotics. Vol. 3. 246-254 (1999)

    • Description
      From the "Summary of Research Achievements Report" (English version)
    • Related Report
      2002 Final Research Report Summary
  • [Publications] Matsuoka, K.: "A model analysis of temporally asymmetric Hebbian learning"Recent Advances in Simulation Computational Methods and Soft Computing. 193-198 (2002)

    • Description
      From the "Summary of Research Achievements Report" (English version)
    • Related Report
      2002 Final Research Report Summary
  • [Publications] 松岡清利 (Kiyotoshi MATSUOKA): "An analysis of temporally asymmetric Hebbian learning" (in Japanese) IEICE Technical Report. Vol.101, No.432. 9-14 (2001)

    • Related Report
      2001 Annual Research Report
  • [Publications] Kiyotoshi MATSUOKA: "A model analysis of temporally asymmetric Hebbian learning" Recent Advances in Simulation, Computational Methods and Soft Computing. 193-198 (2002)

    • Related Report
      2001 Annual Research Report
  • [Publications] K. Matsuoka: "A general theory of a class of linear neural nets for principal and minor component analysis" Artificial Life and Robotics. Vol.3. 246-254 (1999)

    • Related Report
      2000 Annual Research Report
  • [Publications] K. Matsuoka: "A kurtosis-based blind separation of sources using the Cayley transform" Proc. IEEE 2000 Adaptive Systems for Signal Processing, Communications, and Control Symposium. 369-374 (2000)

    • Related Report
      2000 Annual Research Report
  • [Publications] Kiyotoshi Matsuoka: "A general theory of a class of linear neural nets for principal and minor component analysis" Artificial Life and Robotics. Vol.3. 101-109 (1999)

    • Related Report
      1999 Annual Research Report


Published: 1999-03-31   Modified: 2018-03-28  
