
Transcriptional model of declarative memory from hippocampus to neocortex

Research Project

Project/Area Number 16K00329
Research Category

Grant-in-Aid for Scientific Research (C)

Allocation Type Multi-year Fund
Section General
Research Field Soft computing
Research InstitutionUniversity of Yamanashi

Principal Investigator

HATTORI Motonobu  University of Yamanashi, Graduate Faculty of Interdisciplinary Research, Associate Professor (40293435)

Research Collaborator NAKANO Shunta  
Project Period (FY) 2016-04-01 – 2019-03-31
Project Status Completed (Fiscal Year 2018)
Budget Amount
¥3,380,000 (Direct Cost: ¥2,600,000, Indirect Cost: ¥780,000)
Fiscal Year 2018: ¥780,000 (Direct Cost: ¥600,000, Indirect Cost: ¥180,000)
Fiscal Year 2017: ¥1,170,000 (Direct Cost: ¥900,000, Indirect Cost: ¥270,000)
Fiscal Year 2016: ¥1,430,000 (Direct Cost: ¥1,100,000, Indirect Cost: ¥330,000)
Keywords hippocampus / cerebral cortex / memory / neural network / catastrophic forgetting / No-prop / CHL / neocortex model / pseudorehearsal / Hebbian rule / weight importance / pseudopattern / No-Prop / declarative memory / episodic memory / neurogenesis / spike-timing-dependent synaptic plasticity / soft computing
Outline of Final Research Achievements

In this study, we aimed to model the complementary learning system in the hippocampus and neocortex that is responsible for memories of events and facts. In particular, we focused on the neocortex, the locus of long-term memory. We considered what kind of mechanism would be required to prevent old memories from being destroyed by new ones, and constructed biologically plausible models. Examining their characteristics through computer simulation, we showed that catastrophic forgetting can be reduced by combining pseudorehearsal with a biologically plausible learning method that does not propagate output errors backward through the network. We also showed that taking the importance of weights into account reduces catastrophic forgetting even further.
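The pseudorehearsal idea described above can be sketched as follows: after learning an old task, the network is probed with random inputs, its own outputs on those probes are recorded as pseudo-items, and the new task is then learned interleaved with them. This is only a minimal illustration using plain backpropagation on a two-layer tanh network, not the CHL or No-Prop learning rules the project actually studied, and all data, sizes, and hyperparameters here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid, n_out = 4, 16, 2

# Two-layer tanh network with randomly initialized weights.
W = [rng.normal(0.0, 0.5, (n_in, n_hid)),
     rng.normal(0.0, 0.5, (n_hid, n_out))]

def forward(W, X):
    H = np.tanh(X @ W[0])
    return H, np.tanh(H @ W[1])

def mse(W, X, Y):
    return float(np.mean((forward(W, X)[1] - Y) ** 2))

def train(W, X, Y, epochs=300, lr=0.2):
    # Plain gradient descent on mean squared error.
    for _ in range(epochs):
        H, Yhat = forward(W, X)
        dY = (Yhat - Y) * (1 - Yhat ** 2) / len(X)
        dH = (dY @ W[1].T) * (1 - H ** 2)
        W[1] -= lr * H.T @ dY
        W[0] -= lr * X.T @ dH

# Task A: hypothetical old task (targets from a random tanh map).
XA = rng.uniform(-1, 1, (40, n_in))
YA = np.tanh(XA @ rng.normal(0, 1, (n_in, n_out)))
loss_A_before = mse(W, XA, YA)
train(W, XA, YA)
loss_A_trained = mse(W, XA, YA)

# Pseudorehearsal: random probe inputs, targets taken from the
# *current* network, so they summarize what it already knows.
Xp = rng.uniform(-1, 1, (40, n_in))
Yp = forward(W, Xp)[1]

# Task B is learned together with the pseudo-items, which pulls the
# weights toward solutions that also preserve the old input-output map.
XB = rng.uniform(-1, 1, (40, n_in))
YB = np.tanh(XB @ rng.normal(0, 1, (n_in, n_out)))
train(W, np.vstack([XB, Xp]), np.vstack([YB, Yp]))
```

The same interleaving scheme applies unchanged if `train` is replaced by an error-free learning rule such as No-Prop or contrastive Hebbian learning, which is the combination the project investigated.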

Academic Significance and Societal Importance of the Research Achievements

Memories of various events that can be consciously recalled (episodic memory) and memories of facts (semantic memory) are together called declarative memory, and they underpin extremely high-level information processing such as thinking and reasoning. Declarative memory is believed to be stored first in the hippocampus and then gradually transferred to the neocortex, but the mechanism of this transfer remains unknown. In this study, toward the realization of information processing systems as intelligent and flexible as humans, we engineered a model of the formation process of declarative memory, which underlies such systems. In particular, we modeled a mechanism that, like humans, memorizes new information additively, one item after another, without destroying old memories.

Report

(4 results)
  • 2018 Annual Research Report   Final Research Report ( PDF )
  • 2017 Research-status Report
  • 2016 Research-status Report
  • Research Products

    (8 results)

All 2019 2018 2017 2016

All Journal Article (1 result) (of which Peer Reviewed: 1) Presentation (7 results) (of which Int'l Joint Research: 5)

  • [Journal Article] Characteristics of contrastive Hebbian learning with pseudorehearsal for multilayer neural networks on reduction of catastrophic forgetting2018

    • Author(s)
      Hattori Motonobu、Nakano Shunta
    • Journal Title

      International Journal of Computational Intelligence Studies

      Volume: 7 Issue: 3/4 Pages: 289-289

    • DOI

      10.1504/ijcistudies.2018.096184

    • Related Report
      2018 Annual Research Report
    • Peer Reviewed
  • [Presentation] Suppression of Catastrophic Forgetting Using Pseudorehearsal and Weight Importance2019

    • Author(s)
Nakano Shunta, Hattori Motonobu
    • Organizer
IEICE Technical Committee on Neurocomputing
    • Related Report
      2018 Annual Research Report
  • [Presentation] Reduction of Catastrophic Forgetting for Multilayer Neural Networks Trained by No-Prop Algorithm2018

    • Author(s)
      Motonobu Hattori and Hideto Tsuboi
    • Organizer
IEEE 2018 International Conference on Information and Communication Technology (ICOIACT)
    • Related Report
      2017 Research-status Report
    • Int'l Joint Research
  • [Presentation] Introducing Pseudorehearsal Considering the Distribution of Environment States in Reinforcement Learning with Neural Networks2018

    • Author(s)
Henmi Kohei, Hattori Motonobu
    • Organizer
80th National Convention of the Information Processing Society of Japan (IPSJ)
    • Related Report
      2017 Research-status Report
  • [Presentation] Alpine Plants Recognition with Deep Convolutional Neural Network2017

    • Author(s)
      Tomoaki Negishi and Motonobu Hattori
    • Organizer
      14th International Symposium on Neural Networks
    • Related Report
      2017 Research-status Report
    • Int'l Joint Research
  • [Presentation] Reduction of Catastrophic Forgetting in Multilayer Neural Networks Trained by Contrastive Hebbian Learning with Pseudorehearsal2017

    • Author(s)
      Shunta Nakano and Motonobu Hattori
    • Organizer
      IEEE 10th International Workshop on Computational Intelligence and Applications (IWCIA)
    • Related Report
      2017 Research-status Report
    • Int'l Joint Research
  • [Presentation] Parallel Learning for Combined Knowledge Acquisition Model2016

    • Author(s)
      Kohei Henmi and Motonobu Hattori
    • Organizer
      International Conference on Neural Information Processing
    • Place of Presentation
Kyoto University (Kyoto City, Kyoto Prefecture)
    • Year and Date
      2016-10-16
    • Related Report
      2016 Research-status Report
    • Int'l Joint Research
  • [Presentation] A Hippocampal Model for Episodic Memory using Neurogenesis and Asymmetric STDP2016

    • Author(s)
      Motonobu Hattori and Yoshinori Kobayashi
    • Organizer
      IEEE and INNS International Joint Conference on Neural Networks
    • Place of Presentation
      Vancouver, Canada
    • Year and Date
      2016-07-24
    • Related Report
      2016 Research-status Report
    • Int'l Joint Research

Published: 2016-04-21   Modified: 2020-03-30  
