
IMPROVEMENT OF CONVERGENCE OF LEARNING OF MULTI-LAYER NEURAL NETWORKS AND APPLICATION FOR SEARCH ENGINE

Research Project

Project/Area Number 13680472
Research Category

Grant-in-Aid for Scientific Research (C)

Allocation Type Single-year Grants
Section General
Research Field Intelligent informatics
Research Institution TOKYO METROPOLITAN COLLEGE OF TECHNOLOGY

Principal Investigator

HARA Kazuyuki  TOKYO METROPOLITAN COLLEGE OF TECHNOLOGY, DEPARTMENT OF ELECTRICAL AND INFORMATION ENGINEERING, ASSOCIATE PROFESSOR (30311004)

Co-Investigator (Kenkyū-buntansha) NAKAYAMA Kenji  KANAZAWA UNIVERSITY, GRADUATE SCHOOL OF NATURAL SCIENCE AND TECHNOLOGY, PROFESSOR (00207945)
Project Period (FY) 2001 – 2002
Project Status Completed (Fiscal Year 2002)
Budget Amount
¥1,300,000 (Direct Cost: ¥1,300,000)
Fiscal Year 2002: ¥500,000 (Direct Cost: ¥500,000)
Fiscal Year 2001: ¥800,000 (Direct Cost: ¥800,000)
Keywords Multilayer neural networks / Error distribution / improving convergence of learning / margin / ensemble learning / on-line learning / simple perceptron / hierarchical neural networks / improvement of convergence / bias in the number of examples / symmetry breaking
Research Abstract

In this study, we investigated improving the convergence of learning in multilayer neural networks and its application to a search engine. The results are summarized as follows:
(1) Biased classification due to the number of data in each class. We investigated a learning method in which the probability of updating a connection weight is proportional to the mean squared error. This balances the contributions of data with large and small errors, so the minority class becomes learnable.
(2) Learning method for early symmetry breaking. We investigated a learning method for multilayer neural networks that updates only one connection weight at a time, to keep the learning from stalling.
(3) Perceptron learning with a margin. We introduced a margin à la Gardner to improve perceptron learning. Our algorithm is superior to Hebbian learning in the early stage of learning.
(4) Analysis of ensemble learning with linear learning machines. We analyzed the generalization error of ensemble learning as a function of the number of weak learners, K. As a result, we showed that in the limit as K goes to infinity, the generalization error is half that of a single perceptron.
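Result (1) can be sketched as a toy delta-rule learner in which each example triggers a weight update only with probability proportional to its squared error, so well-fit majority-class examples update rarely while large-error minority examples keep contributing. The linear model, parameter names, and data below are illustrative assumptions, not the authors' exact algorithm.

```python
import numpy as np

def stochastic_update_fit(X, y, lr=0.05, epochs=100, seed=0):
    """Toy error-proportional stochastic weight update (illustrative)."""
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for x, t in zip(X, y):          # t is the +/-1 target
            e = t - w @ x               # linear prediction error
            p = min(1.0, e * e)         # update probability grows with squared error
            if rng.random() < p:
                w += lr * e * x         # LMS (delta-rule) step
    return w

# Imbalanced toy data: nine majority examples, one minority example.
X = np.array([[2.0, 1.0]] * 9 + [[-2.0, -3.0]])
y = np.array([1.0] * 9 + [-1.0])
w = stochastic_update_fit(X, y)
print(np.all(np.sign(X @ w) == y))  # both classes classified correctly
```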
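Result (3), perceptron learning with a Gardner-style margin, might look like the following sketch: the weight vector is updated not only on misclassified examples but on any example whose normalized margin falls below a threshold kappa. The update rule and parameter names are assumptions for illustration.

```python
import numpy as np

def margin_perceptron(X, y, kappa=0.5, epochs=10):
    """Perceptron learning with a margin (illustrative sketch)."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for x, t in zip(X, y):          # t in {-1, +1}
            norm = np.linalg.norm(w)
            margin = t * (w @ x) / norm if norm > 0 else 0.0
            if margin < kappa:          # inside the margin: Hebbian-type step
                w += t * x
    return w

# Linearly separable toy data.
X = np.array([[2.0, 1.0], [1.0, 2.0], [-2.0, -1.0], [-1.0, -2.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
w = margin_perceptron(X, y)
print(np.all(np.sign(X @ w) == y))  # all four points correctly classified
```

With kappa = 0, this reduces to the ordinary perceptron rule; a positive kappa forces extra updates early on, which is where the abstract reports the advantage over plain Hebbian learning.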
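Result (4) can be illustrated with a toy stand-in: averaging K independently noisy linear "students" around a teacher vector reduces the squared weight error. This is only a crude variance-reduction proxy, not the paper's on-line learning analysis; all names and constants here are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N, K, sigma = 50, 20, 0.5           # dimension, ensemble size, noise level

teacher = rng.standard_normal(N)
teacher /= np.linalg.norm(teacher)  # unit-length teacher weight vector

# Each "student" is the teacher plus independent estimation noise.
students = teacher + sigma * rng.standard_normal((K, N))
ensemble = students.mean(axis=0)    # ensemble output: simple average

err_single = np.mean(np.sum((students - teacher) ** 2, axis=1))
err_ensemble = np.sum((ensemble - teacher) ** 2)
print(err_ensemble < err_single)    # averaging shrinks the error
```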

Report (3 results)

  • 2002 Annual Research Report / Final Research Report Summary
  • 2001 Annual Research Report
Research Products (20 results)

  • [Publications] Kazuyuki Hara, Kenji Nakayama: "A learning method by statistic connection weight update"Proceedings of International Conference on Neural Networks. 2036-2041 (2001)

    • Description
      From the Final Research Report Summary (Japanese)
    • Related Report
      2002 Final Research Report Summary
  • [Publications] Kazuyuki Hara: "A novel line search type algorithm avoidable of small local minima"Proceedings of International Conference on Neural Networks. 2048-2053 (2001)

    • Description
      From the Final Research Report Summary (Japanese)
    • Related Report
      2002 Final Research Report Summary
  • [Publications] Kazuyuki Hara, Masato Okada: "Theory of on-line learning of a simple perceptron with a margin" IEICE Transactions D-II. J85-D-II, No. 10. 1563-1570 (2002)

    • Description
      From the Final Research Report Summary (Japanese)
    • Related Report
      2002 Final Research Report Summary
  • [Publications] Kazuyuki Hara, Masato Okada: "On-line learning through simple perceptron with a margin"Proceedings of International Conference of Neural Information Processing. 1. 158-164 (2002)

    • Description
      From the Final Research Report Summary (Japanese)
    • Related Report
      2002 Final Research Report Summary
  • [Publications] Kazuyuki Hara, Masato Okada: "Analysis of ensemble learning of linear learning machines based on on-line learning theory" Meeting Abstracts of the Physical Society of Japan (Autumn Meeting). 221 (2002)

    • Description
      From the Final Research Report Summary (Japanese)
    • Related Report
      2002 Final Research Report Summary
  • [Publications] Kazuyuki Hara, Masato Okada: "Theory of on-line learning of parallel boosting" Meeting Abstracts of the Physical Society of Japan (Annual Meeting). 258 (2003)

    • Description
      From the Final Research Report Summary (Japanese)
    • Related Report
      2002 Final Research Report Summary
  • [Publications] KAZUYUKI HARA, KENJI NAKAYAMA: "A LEARNING METHOD BY STATISTIC CONNECTION WEIGHT UPDATE"PROCEEDINGS OF INTERNATIONAL CONFERENCE OF NEURAL NETWORKS. 2036-2041 (2001)

    • Description
      From the Final Research Report Summary (English)
    • Related Report
      2002 Final Research Report Summary
  • [Publications] KAZUYUKI HARA: "A NOVEL LINE SEARCH TYPE ALGORITHM AVOIDABLE OF SMALL LOCAL MINIMA"PROCEEDINGS OF INTERNATIONAL CONFERENCE OF NEURAL NETWORKS. 2048-2053 (2001)

    • Description
      From the Final Research Report Summary (English)
    • Related Report
      2002 Final Research Report Summary
  • [Publications] KAZUYUKI HARA, MASATO OKADA: "ON-LINE LEARNING OF A SIMPLE PERCEPTRON WITH MARGIN"TRANSACTIONS OF THE INSTITUTE OF ELECTRONICS, INFORMATION AND COMMUNICATION ENGINEERS. J85-DII, 10. 1563-1570 (2002)

    • Description
      From the Final Research Report Summary (English)
    • Related Report
      2002 Final Research Report Summary
  • [Publications] KAZUYUKI HARA, MASATO OKADA: "ON-LINE LEARNING THROUGH SIMPLE PERCEPTRON WITH A MARGIN"PROCEEDINGS OF INTERNATIONAL CONFERENCE OF NEURAL INFORMATION PROCESSING.

    • Description
      From the Final Research Report Summary (English)
    • Related Report
      2002 Final Research Report Summary
  • [Publications] KAZUYUKI HARA, MASATO OKADA: "ANALYSIS OF ENSEMBLE LEARNING THROUGH LINEAR LEARNING MACHINE"MEETING ABSTRACTS OF THE PHYSICAL SOCIETY OF JAPAN. VOL. 57, ISSUE 2, PART 2. 221

    • Description
      From the Final Research Report Summary (English)
    • Related Report
      2002 Final Research Report Summary
  • [Publications] KAZUYUKI HARA, MASATO OKADA: "ONLINE LEARNING OF ENSEMBLE LEARNING BY PARALLEL BOOSTING"MEETING ABSTRACTS OF THE PHYSICAL SOCIETY OF JAPAN. VOL. 58, ISSUE 1, PART 2. 258

    • Description
      From the Final Research Report Summary (English)
    • Related Report
      2002 Final Research Report Summary
  • [Publications] Kazuyuki Hara, Masato Okada: "Theory of on-line learning of the simple perceptron learning method with a margin" IEICE Transactions. J85-D-II, No. 10. 1563-1570 (2002)

    • Related Report
      2002 Annual Research Report
  • [Publications] Kazuyuki Hara, Masato Okada: "On-Line learning through simple perceptron learning with a margin"Proceeding of International Conference on Neural Information Processing 2002. 1. 158-164 (2002)

    • Related Report
      2002 Annual Research Report
  • [Publications] Kazuyuki Hara, Masato Okada: "Analysis of the generalization error of ensemble learning with linear weak learners" Proceedings of the Workshop on Information-Based Induction Sciences. 113-118 (2002)

    • Related Report
      2002 Annual Research Report
  • [Publications] Kazuyuki Hara, Masato Okada: "Analysis of the generalization error of ensemble learning with linear weak learners" Meeting Abstracts of the Physical Society of Japan (Autumn Meeting). 2. 221 (2002)

    • Related Report
      2002 Annual Research Report
  • [Publications] Kazuyuki Hara, Masato Okada: "Theory of on-line learning of parallel boosting" Meeting Abstracts of the 58th Annual Meeting of the Physical Society of Japan. 2. 258 (2003)

    • Related Report
      2002 Annual Research Report
  • [Publications] Kazuyuki Hara, Yoshihisa Amakata, Ryohei Nukaga, Kenji Nakayama: "A learning method by stochastic connection weight update"Proceedings of International Joint Conference on Neural Networks. 2036-2041 (2001)

    • Related Report
      2001 Annual Research Report
  • [Publications] Kazuyuki Hara, Hiroyuki Nose, Megumi Ohwada: "A Novel Line Search Type Algorithm Avoidable of Small Local Minima"Proceedings of International Joint Conference on Neural Networks. 2048-2053 (2001)

    • Related Report
      2001 Annual Research Report
  • [Publications] Kazuyuki Hara, Masato Okada: "A learning algorithm with a margin" Proceedings of the Annual Meeting of the Japanese Neural Network Society. 44-45 (2001)

    • Related Report
      2001 Annual Research Report

Published: 2001-04-01   Modified: 2016-04-21  
