
2001 Fiscal Year Final Research Report Summary

Generalization Capability of Memorization Learning

Research Project

Project/Area Number 11480072
Research Category

Grant-in-Aid for Scientific Research (B)

Allocation Type Single-year Grants
Section General
Research Field Intelligent informatics
Research Institution Tokyo Institute of Technology

Principal Investigator

OGAWA Hidemitsu  Tokyo Institute of Technology, Graduate School of Information Science and Engineering, Department of Computer Science, Professor (50016630)

Co-Investigator (Kenkyū-buntansha) SUGIYAMA Masashi  Tokyo Institute of Technology, Graduate School of Information Science and Engineering, Department of Computer Science, Research Associate (90334515)
HIRABAYASHI Akira  Yamaguchi University, Faculty of Engineering, Department of Computer Science and Systems Engineering, Lecturer (50272688)
KUMAZAWA Itsuo  Tokyo Institute of Technology, Graduate School of Information Science and Engineering, Department of Computer Science, Associate Professor (70186469)
Project Period (FY) 1999 – 2001
Keywords supervised learning / generalization capability / error back-propagation / memorization learning / admissibility / a family of projection learnings / subspace information criterion / SIC
Research Abstract

The purpose of supervised learning is to answer correctly queries that are not necessarily included in the training examples, i.e., to acquire a high level of generalization capability. However, most learning methods, such as error back-propagation, are so-called memorization learning, which aims only at reducing the error on the training examples. There is therefore no theoretical guarantee of optimal generalization.
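As a minimal illustration of this point, the sketch below fits noisy samples of an assumed sine target with a polynomial flexible enough to interpolate them: the training error becomes essentially zero while the error on queries outside the training examples stays large. The target function, sample size, noise level, and polynomial degree are arbitrary illustrative choices, not taken from the report.

    import numpy as np

    rng = np.random.default_rng(0)

    # Noisy training examples of an assumed target function f(x) = sin(2*pi*x).
    n_train = 15
    x_train = rng.uniform(0.0, 1.0, n_train)
    y_train = np.sin(2 * np.pi * x_train) + rng.normal(0.0, 0.2, n_train)

    # Memorization-style fit: choose coefficients only to reduce the training error.
    # A degree n_train-1 polynomial has enough parameters to interpolate the examples.
    coef = np.polyfit(x_train, y_train, n_train - 1)

    # Fresh queries that are not contained in the training examples.
    x_test = np.linspace(0.0, 1.0, 200)
    y_test = np.sin(2 * np.pi * x_test)

    train_err = np.mean((np.polyval(coef, x_train) - y_train) ** 2)
    test_err = np.mean((np.polyval(coef, x_test) - y_test) ** 2)

    print(f"training error:       {train_err:.2e}")  # essentially zero
    print(f"generalization error: {test_err:.2e}")   # typically much larger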
This gives rise to the following problems. The first is to clarify why memorization learning can nevertheless acquire a high level of generalization capability even though it does not aim at generalization. The second is to clarify the range of applicability within which memorization learning works effectively. The third is to develop methods that further expand this range of applicability.
For the first problem, we gave a lucid explanation by introducing the concept of admissibility. For the second problem, we introduced the concept of a family of projection learnings, which allows us to treat infinitely many kinds of learning methods simultaneously in a theoretical manner. Utilizing the concepts of admissibility and a family of projection learnings, we clarified the range of applicability of memorization learning in the narrow sense with respect to a family of projection learnings. For the third problem, we showed that there are a large number of solutions: we extended the concept of memorization learning from rote memorization learning to error-corrected memorization learning, which further enlarges the range of applicability. From the viewpoint of active learning, we gave design methods for training examples that maximally enhance the generalization capability. Furthermore, from the standpoint of model selection, we proposed the subspace information criterion (SIC), a model selection criterion whose effectiveness is theoretically guaranteed for a finite number of training examples. Based on SIC, we gave, for example, a design method for the optimal regularization parameter.
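Selecting a regularization parameter by minimizing a model selection criterion computed from a finite number of training examples follows the generic pattern sketched below. The sketch uses ridge regression on a trigonometric polynomial model (the model class appearing in the papers listed under Research Products) and plain leave-one-out cross-validation as a stand-in criterion; it does not reproduce SIC itself, whose definition is given in the Neural Computation paper listed below.

    import numpy as np

    def ridge_fit(Phi, y, lam):
        """Ridge-regularized least squares: argmin_w ||Phi w - y||^2 + lam ||w||^2."""
        d = Phi.shape[1]
        return np.linalg.solve(Phi.T @ Phi + lam * np.eye(d), Phi.T @ y)

    def loo_error(Phi, y, lam):
        """Leave-one-out squared error, used here as a stand-in selection criterion."""
        n = len(y)
        errs = []
        for i in range(n):
            keep = np.arange(n) != i
            w = ridge_fit(Phi[keep], y[keep], lam)
            errs.append(float((Phi[i] @ w - y[i]) ** 2))
        return float(np.mean(errs))

    rng = np.random.default_rng(1)
    x = rng.uniform(0.0, 1.0, 30)
    y = np.sin(2 * np.pi * x) + rng.normal(0.0, 0.2, len(x))

    # Trigonometric polynomial model: basis functions 1, sin(2*pi*k*x), cos(2*pi*k*x).
    Phi = np.column_stack(
        [np.ones_like(x)]
        + [f(2 * np.pi * k * x) for k in range(1, 8) for f in (np.sin, np.cos)]
    )

    # Choose the regularization parameter that minimizes the criterion.
    lambdas = np.logspace(-6, 1, 15)
    best_lam = min(lambdas, key=lambda lam: loo_error(Phi, y, lam))
    print("selected regularization parameter:", best_lam)

Replacing loo_error with another criterion such as SIC changes only the function being minimized; the selection loop over candidate regularization parameters stays the same.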

  • Research Products

    (14 results)

All Publications (14 results)

  • [Publications] Masashi Sugiyama, Hidemitsu Ogawa: "Active learning for optimal generalization in trigonometric polynomial models"IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences. E84-A. 2319-2329 (2001)

    • Description
      From the "Final Research Report Summary" (Japanese text)
  • [Publications] Akiko Nakashima, Akira Hirabayashi, Hidemitsu Ogawa: "Error correcting memorization learning for noisy training examples"Neural Networks. 14. 79-92 (2001)

    • Description
      From the "Final Research Report Summary" (Japanese text)
  • [Publications] Masashi Sugiyama, Hidemitsu Ogawa: "Incremental projection learning for optimal generalization"Neural Networks. 14. 53-66 (2001)

    • Description
      From the "Final Research Report Summary" (Japanese text)
  • [Publications] Masashi Sugiyama, Hidemitsu Ogawa: "Subspace information criterion for model selection"Neural Computation. 13. 1863-1889 (2001)

    • Description
      From the "Final Research Report Summary" (Japanese text)
  • [Publications] Masashi Sugiyama, Hidemitsu Ogawa: "Incremental active learning for optimal generalization"Neural Computation. 12. 2909-2940 (2000)

    • Description
      From the "Final Research Report Summary" (Japanese text)
  • [Publications] Akira Hirabayashi, Hidemitsu Ogawa: "A family of projection learnings"IEICE Transactions D-II. 12. 754-767 (2000)

    • Description
      From the "Final Research Report Summary" (Japanese text)
  • [Publications] Hidemitsu Ogawa: "Encyclopedia of Brain Science (Shun-ichi Amari and Keisuke Toyama, Eds.)"Asakura Shoten. 1006 (2000)

    • Description
      From the "Final Research Report Summary" (Japanese text)
  • [Publications] Masashi Sugiyama and Hidemitsu Ogawa: "Active learning for optimal generalization in trigonometric polynomial models"IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences. vol.E84-A, no.9. 2319-2329 (2001)

    • Description
      From the "Final Research Report Summary" (English text)
  • [Publications] Akiko Nakashima, Akira Hirabayashi, and Hidemitsu Ogawa: "Error correcting memorization learning for noisy training examples"Neural Networks. vol.14, no.1. 79-92 (2001)

    • Description
      From the "Final Research Report Summary" (English text)
  • [Publications] Masashi Sugiyama and Hidemitsu Ogawa: "Incremental projection learning for optimal generalization"Neural Networks. vol.14, no.1. 53-66 (2001)

    • Description
      From the "Final Research Report Summary" (English text)
  • [Publications] Masashi Sugiyama and Hidemitsu Ogawa: "Subspace information criterion for model selection"Neural Computation. vol.13, no.8. 1863-1889 (2001)

    • Description
      From the "Final Research Report Summary" (English text)
  • [Publications] Masashi Sugiyama and Hidemitsu Ogawa: "Incremental active learning for optimal generalization"Neural Computation. vol.12, no.12. 2909-2940 (2000)

    • Description
      From the "Final Research Report Summary" (English text)
  • [Publications] Akira Hirabayashi and Hidemitsu Ogawa: "A family of projection learnings"IEICE Transactions D-II. vol.J83-D-II, no.2. 754-767 (2000)

    • Description
      From the "Final Research Report Summary" (English text)
  • [Publications] Hidemitsu Ogawa: "Encyclopedia of Brain Science (Shin-ichi Amari and Toyama Keisuke, Eds.)"Asakura syoten, Tokyo, Japan. (2000)

    • Description
      From the "Final Research Report Summary" (English text)

Published: 2003-09-17  
