
Generalization Capability of Memorization Learning

Research Project

Project/Area Number 11480072
Research Category

Grant-in-Aid for Scientific Research (B)

Allocation Type Single-year Grants
Section General
Research Field Intelligent informatics
Research Institution Tokyo Institute of Technology

Principal Investigator

OGAWA Hidemitsu  Tokyo Institute of Technology, Graduate School of Information Science and Engineering, Department of Computer Science, Professor (50016630)

Co-Investigator (Kenkyū-buntansha) SUGIYAMA Masashi  Tokyo Institute of Technology, Graduate School of Information Science and Engineering, Department of Computer Science, Research Associate (90334515)
HIRABAYASHI Akira  Yamaguchi University, Faculty of Engineering, Department of Computer Science and Systems Engineering, Lecturer (50272688)
KUMAZAWA Itsuo  Tokyo Institute of Technology, Graduate School of Information Science and Engineering, Department of Computer Science, Associate Professor (70186469)
Project Period (FY) 1999 – 2001
Project Status Completed (Fiscal Year 2001)
Budget Amount
¥13,800,000 (Direct Cost: ¥13,800,000)
Fiscal Year 2001: ¥3,100,000 (Direct Cost: ¥3,100,000)
Fiscal Year 2000: ¥4,900,000 (Direct Cost: ¥4,900,000)
Fiscal Year 1999: ¥5,800,000 (Direct Cost: ¥5,800,000)
Keywords supervised learning / generalization capability / error back-propagation / memorization learning / admissibility / a family of projection learnings / subspace information criterion / SIC / range of applicability / neural network / over-learning / projection learning
Research Abstract

The purpose of supervised learning is to answer correctly queries that are not necessarily included in the training examples, that is, to acquire a high level of generalization capability. However, most learning methods, such as error back-propagation, are so-called memorization learning, which aims only at reducing the error on the training examples. There is therefore no theoretical guarantee of optimal generalization.
This gives rise to three problems. The first is to clarify why memorization learning can acquire a high level of generalization capability despite the fact that it does not explicitly pursue generalization. The second is to clarify the range of applicability within which memorization learning works effectively. The third is to develop methods that further expand this range of applicability.
For the first problem, we gave a lucid explanation by introducing the concept of admissibility. For the second problem, we introduced the concept of a family of projection learnings, which allows infinitely many kinds of learning methods to be discussed theoretically at the same time. Utilizing the concepts of admissibility and a family of projection learnings, we clarified the range of applicability of memorization learning in the narrow sense with respect to a family of projection learnings. For the third problem, we showed that there is a large number of solutions: we extended the concept of memorization learning from rote memorization learning to error-corrected memorization learning, which further enlarges the range of applicability. From the viewpoint of active learning, we gave design methods for training examples that maximally enhance the generalization capability. Furthermore, from the standpoint of model selection, we proposed the subspace information criterion (SIC), a model selection criterion whose effectiveness is theoretically guaranteed for a finite number of training examples. Based on SIC, we gave, for example, a design method for the optimal regularization parameter.
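The contrast the abstract draws between memorization learning and generalization can be illustrated with a short numerical sketch. This is an illustration only, not the project's projection-learning framework: it uses ordinary polynomial least squares, and the target function, sample points, and noise level are arbitrary choices for the example. A degree-9 polynomial has enough freedom to memorize all ten noisy training examples (training error near zero), yet that fact alone says nothing about its error on unseen queries.

```python
import numpy as np

rng = np.random.default_rng(0)

# Ten noisy training examples of a simple target function f(x) = sin(x).
x_train = np.linspace(0.0, 3.0, 10)
y_train = np.sin(x_train) + 0.1 * rng.standard_normal(x_train.size)

# A dense test grid stands in for "queries not included in the training examples".
x_test = np.linspace(0.0, 3.0, 200)
y_test = np.sin(x_test)

for degree in (3, 9):
    # Least-squares polynomial fit; degree 9 can interpolate all 10 points,
    # i.e., it "memorizes" the training set, noise included.
    coef = np.polyfit(x_train, y_train, degree)
    train_err = np.mean((np.polyval(coef, x_train) - y_train) ** 2)
    test_err = np.mean((np.polyval(coef, x_test) - y_test) ** 2)
    print(f"degree {degree}: train MSE {train_err:.3e}, test MSE {test_err:.3e}")
```

The degree-9 training error is essentially zero while the degree-3 training error is not; the test errors, by contrast, reflect how well each model approximates the underlying function rather than the noisy samples.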

Report

(4 results)
  • 2001 Annual Research Report
  • Final Research Report Summary
  • 2000 Annual Research Report
  • 1999 Annual Research Report
  • Research Products

    (64 results)


All Publications (64 results)

  • [Publications] Masashi Sugiyama, Hidemitsu Ogawa: "Active learning for optimal generalization in trigonometric polynomial models"IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences. E84-A. 2319-2329 (2001)

    • Description
      From the "Final Research Report Summary" (Japanese version)
    • Related Report
      2001 Final Research Report Summary
  • [Publications] Akiko Nakashima, Akira Hirabayashi, Hidemitsu Ogawa: "Error correcting memorization learning for noisy training examples"Neural Networks. 14. 79-92 (2001)

    • Description
      From the "Final Research Report Summary" (Japanese version)
    • Related Report
      2001 Final Research Report Summary
  • [Publications] Masashi Sugiyama, Hidemitsu Ogawa: "Incremental projection learning for optimal generalization"Neural Networks. 14. 53-66 (2001)

    • Description
      From the "Final Research Report Summary" (Japanese version)
    • Related Report
      2001 Final Research Report Summary
  • [Publications] Masashi Sugiyama, Hidemitsu Ogawa: "Subspace information criterion for model selection"Neural Computation. 13. 1863-1889 (2001)

    • Description
      From the "Final Research Report Summary" (Japanese version)
    • Related Report
      2001 Final Research Report Summary
  • [Publications] Masashi Sugiyama, Hidemitsu Ogawa: "Incremental active learning for optimal generalization"Neural Computation. 12. 2909-2940 (2000)

    • Description
      From the "Final Research Report Summary" (Japanese version)
    • Related Report
      2001 Final Research Report Summary
  • [Publications] Akira Hirabayashi, Hidemitsu Ogawa: "A family of projection learnings"IEICE Transactions D-II. 12. 754-767 (2000)

    • Description
      From the "Final Research Report Summary" (Japanese version)
    • Related Report
      2001 Final Research Report Summary
  • [Publications] Hidemitsu Ogawa: "Encyclopedia of Brain Science (Shun-ichi Amari and Keisuke Toyama, Eds.)"Asakura Shoten. 1006 (2000)

    • Description
      From the "Final Research Report Summary" (Japanese version)
    • Related Report
      2001 Final Research Report Summary
  • [Publications] Masashi Sugiyama and Hidemitsu Ogawa: "Active learning for optimal generalization in trigonometric polynomial models"IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences. vol.E84-A, no.9. 2319-2329 (2001)

    • Description
      From the "Final Research Report Summary" (English version)
    • Related Report
      2001 Final Research Report Summary
  • [Publications] Akiko Nakashima, Akira Hirabayashi, and Hidemitsu Ogawa: "Error correcting memorization learning for noisy training examples"Neural Networks. vol.14, no.1. 79-92 (2001)

    • Description
      From the "Final Research Report Summary" (English version)
    • Related Report
      2001 Final Research Report Summary
  • [Publications] Masashi Sugiyama and Hidemitsu Ogawa: "Incremental projection learning for optimal generalization"Neural Networks. vol.14, no.1. 53-66 (2001)

    • Description
      From the "Final Research Report Summary" (English version)
    • Related Report
      2001 Final Research Report Summary
  • [Publications] Masashi Sugiyama and Hidemitsu Ogawa: "Subspace information criterion for model selection"Neural Computation. vol.13, no.8. 1863-1889 (2001)

    • Description
      From the "Final Research Report Summary" (English version)
    • Related Report
      2001 Final Research Report Summary
  • [Publications] Masashi Sugiyama and Hidemitsu Ogawa: "Incremental active learning for optimal generalization"Neural Computation. vol.12, no.12. 2909-2940 (2000)

    • Description
      From the "Final Research Report Summary" (English version)
    • Related Report
      2001 Final Research Report Summary
  • [Publications] Akira Hirabayashi and Hidemitsu Ogawa: "A family of projection learnings"IEICE Transactions D-II. vol.J83-D-II, no.2. 754-767 (2000)

    • Description
      From the "Final Research Report Summary" (English version)
    • Related Report
      2001 Final Research Report Summary
  • [Publications] Hidemitsu Ogawa: "Encyclopedia of Brain Science (Shun-ichi Amari and Keisuke Toyama, Eds.)"Asakura Shoten, Tokyo, Japan. (2000)

    • Description
      From the "Final Research Report Summary" (English version)
    • Related Report
      2001 Final Research Report Summary
  • [Publications] Masashi Sugiyama, Hidemitsu Ogawa: "Active learning for optimal generalization in trigonometric polynomial models"IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences. E84-A・9. 2319-2329 (2001)

    • Related Report
      2001 Annual Research Report
  • [Publications] Akiko Nakashima, Akira Hirabayashi, Hidemitsu Ogawa: "Error correcting memorization learning for noisy training examples"Neural Networks. 14・1. 79-92 (2001)

    • Related Report
      2001 Annual Research Report
  • [Publications] Masashi Sugiyama, Hidemitsu Ogawa: "Incremental projection learning for optimal generalization"Neural Networks. 14・1. 53-66 (2001)

    • Related Report
      2001 Annual Research Report
  • [Publications] Masashi Sugiyama, Hidemitsu Ogawa: "Subspace information criterion for model selection"Neural Computation. 13・8. 1863-1889 (2001)

    • Related Report
      2001 Annual Research Report
  • [Publications] Masashi Sugiyama, Hidemitsu Ogawa: "Incremental active learning for optimal generalization"Neural Computation. 12・12. 2909-2940 (2000)

    • Related Report
      2001 Annual Research Report
  • [Publications] Akira Hirabayashi, Hidemitsu Ogawa: "A family of projection learnings"IEICE Transactions D-II. J83-D-II・2. 754-767 (2000)

    • Related Report
      2001 Annual Research Report
  • [Publications] Hidemitsu Ogawa: "Encyclopedia of Brain Science (Shun-ichi Amari and Keisuke Toyama, Eds.)"Asakura Shoten. 1006 (2000)

    • Related Report
      2001 Annual Research Report
  • [Publications] H.Ogawa and N.-E.Berrached: "EPBOBs (extended pseudo biorthogonal bases) for signal recovery"IEICE Trans.on Information and Systems. E83-D,no.2. 223-232 (2000)

    • Related Report
      2000 Annual Research Report
  • [Publications] A.Hirabayashi,H.Ogawa,and A.Nakashima: "Realization of admissibility for supervised learning"IEICE Trans.on Information and Systems. E83-D,no.5. 1170-1176 (2000)

    • Related Report
      2000 Annual Research Report
  • [Publications] M.Sugiyama and H.Ogawa: "Incremental active learning for optimal generalization"Neural Computation. 12,no.12. 2909-2940 (2000)

    • Related Report
      2000 Annual Research Report
  • [Publications] M.Sugiyama and H.Ogawa: "Incremental projection learning for optimal generalization"Neural Networks. 14,no.1. 53-66 (2001)

    • Related Report
      2000 Annual Research Report
  • [Publications] M.Sugiyama and H.Ogawa: "Properties of incremental projection learning"Neural Networks. 14,no.1. 67-78 (2001)

    • Related Report
      2000 Annual Research Report
  • [Publications] A.Nakashima,A.Hirabayashi,and H.Ogawa: "Error correcting memorization learning for noisy training examples"Neural Networks. 14,no.1. 79-92 (2001)

    • Related Report
      2000 Annual Research Report
  • [Publications] M.Sugiyama and H.Ogawa: "Subspace information criterion for model selection"Neural Computation. 13. 1863-1889 (2001)

    • Related Report
      2000 Annual Research Report
  • [Publications] M.Sugiyama and H.Ogawa: "Theoretical and experimental evaluation of subspace information criterion"Machine Learning, Special Issue on New Methods for Model Selection and Model Combination. 41. (2001)

    • Related Report
      2000 Annual Research Report
  • [Publications] A.Nakashima and H.Ogawa: "Noise suppression in training examples for improving generalization capability"Neural Networks. 14. (2001)

    • Related Report
      2000 Annual Research Report
  • [Publications] M.Sugiyama and H.Ogawa: "A new information criterion for the selection of subspace models"ESANN'2000,8th European Symposium on Artificial Neural Networks,April 26-28,2000,Bruges, Belgium. 69-74 (2000)

    • Related Report
      2000 Annual Research Report
  • [Publications] M.Sugiyama and H.Ogawa: "Incremental active learning with bias reduction"IJCNN'2000,IEEE-INNS-ENNS Int.Joint Conf.on Neural Networks,July 24-27,2000,Como,Italy,. 1. 15-20 (2000)

    • Related Report
      2000 Annual Research Report
  • [Publications] M.Sugiyama and H.Ogawa: "Subspace information criterion-Unbiased generalization error estimator for linear regression"NIPS'2000 Workshop,Cross-Validation, Bootstrap and Model Selection, December 1-2,2000,Breckenridge, Colorado. (2000)

    • Related Report
      2000 Annual Research Report
  • [Publications] M.Sugiyama and H.Ogawa: "Simultaneous optimization of sample points and models"IEICE Technical Report, Neurocomputing. NC2000-26. 17-24 (2000)

    • Related Report
      2000 Annual Research Report
  • [Publications] M.Sugiyama and H.Ogawa: "Active learning with model selection for optimal generalization"IBIS'2000,2000 Workshop on Information-Based Induction Sciences, July 17-18,2000. 87-92 (2000)

    • Related Report
      2000 Annual Research Report
  • [Publications] Masashi Sugiyama, Hidemitsu Ogawa: "Incremental active learning for optimal generalization"2000 IEICE General Conference. 6,no.D-2-2. 11 (2000)

    • Related Report
      2000 Annual Research Report
  • [Publications] Kohei Yamaguchi, Masashi Sugiyama, Hidemitsu Ogawa: "Handwritten digit recognition by projection learning"2000 IEICE General Conference. 7,no.D-12-10. 180 (2000)

    • Related Report
      2000 Annual Research Report
  • [Publications] Masashi Sugiyama, Hidemitsu Ogawa: "Model selection for optimal estimation of function values at points of interest"Joint Meeting of the 3rd Annual Meeting of the Japan Neuroscience Society and the 10th Annual Conference of the Japanese Neural Network Society. no.O-123. 197 (2000)

    • Related Report
      2000 Annual Research Report
  • [Publications] Shun-ichi Amari, Keisuke Toyama (Eds.): "Encyclopedia of Brain Science"Asakura Shoten. 1006 (2000)

    • Related Report
      2000 Annual Research Report
  • [Publications] A.Hirabayashi: "Admissibility of memorization learning with respect to projection learning in the presence of noise"IEICE Transactions on Information and Systems. E82-D. 488-496 (1999)

    • Related Report
      1999 Annual Research Report
  • [Publications] Akira Hirabayashi: "A family of projection learnings"IEICE Transactions D-II. J83-D-II. 754-767 (2000)

    • Related Report
      1999 Annual Research Report
  • [Publications] Akira Hirabayashi: "Applicability of memorization learning with respect to a family of projection learnings"IEICE Transactions D-II. J83-D-II. 768-775 (2000)

    • Related Report
      1999 Annual Research Report
  • [Publications] Hidekazu Iwaki: "Optimally generalizing neural networks capable of repairing a single disconnection fault"IEICE Transactions D-II. J83-D-II. 805-813 (2000)

    • Related Report
      1999 Annual Research Report
  • [Publications] M.Sugiyama: "Exact incremental projection learning in the presence of noise"Proc.of the 11th Scandinavian Conference on Image Analysis. 747-754 (1999)

    • Related Report
      1999 Annual Research Report
  • [Publications] A.Hirabayashi: "A class of learning for optimal generalization"Proc.of IJCNN'99,1999 International Joint Conference on Neural Networks. no.246 (CD-ROM). (1999)

    • Related Report
      1999 Annual Research Report
  • [Publications] A.Hirabayashi: "What can memorization learning do?"Proc.of IJCNN'99,1999 International Joint Conference on Neural Networks. no.247(CD-ROM). (1999)

    • Related Report
      1999 Annual Research Report
  • [Publications] M.Sugiyama: "Pseudo orthogonal bases give the optimal generalization capability in neural network learning"Proc.of the 7th Conference on Wavelet Applications in Signal and Image Processing. 3813. 526-537 (1999)

    • Related Report
      1999 Annual Research Report
  • [Publications] M.Sugiyama: "Functional analytic approach to model selection-subspace information criterion"Proc.of IBIS'99,1999 Workshop on information-Based Induction Sciences. 93-98 (1999)

    • Related Report
      1999 Annual Research Report
  • [Publications] A.Hirabayashi: "Realization of admissibility of memorization learning with respect to projection learning"Proc.of LWA'99,Lernen,Wissensentdeckung und Adaptivitat. Supplement. 20-31 (1999)

    • Related Report
      1999 Annual Research Report
  • [Publications] A.Nakashima: "How to design a regularization term for improving generalization"Proc.of ICONIP'99,6th International Conference on Neural Information Processing. 1. 222-227 (1999)

    • Related Report
      1999 Annual Research Report
  • [Publications] A.Hirabayashi: "What can memorization learning do from noisy training examples?"Proc.of ICONIP'99,6th International Conference on Neural Information Processing. 1. 228-233 (1999)

    • Related Report
      1999 Annual Research Report
  • [Publications] A.Hirabayashi: "Projection learning of the minimum variance type"Proc.of ICONIP'99,6th International Conference on Neural Information Processing. 3. 1172-1177 (1999)

    • Related Report
      1999 Annual Research Report
  • [Publications] Masashi Sugiyama: "Exact incremental projection learning in neural networks"IEICE Technical Report, Neurocomputing. NC98-97. 149-156 (1999)

    • Related Report
      1999 Annual Research Report
  • [Publications] Eiji Nishi: "Incremental learning with a family of projection learnings for optimal generalization"IEICE Technical Report, Neurocomputing. NC99-55. 7-14 (1999)

    • Related Report
      1999 Annual Research Report
  • [Publications] Masashi Sugiyama: "Incremental active learning in consideration of bias"IEICE Technical Report, Neurocomputing. NC99-56. 15-22 (1999)

    • Related Report
      1999 Annual Research Report
  • [Publications] Masashi Sugiyama: "Bias estimation and model selection"IEICE Technical Report, Neurocomputing. NC99-81. 9-16 (2000)

    • Related Report
      1999 Annual Research Report
  • [Publications] 須賀啓敏: "Adaptive incremental projection learning"IEICE Technical Report, Neurocomputing. NC99-82. 17-24 (2000)

    • Related Report
      1999 Annual Research Report
  • [Publications] Akiko Nakashima: "Projection learning as an extension of best linear unbiased estimation"Proceedings of the 1999 IEICE General Conference. 6. 31-31 (1999)

    • Related Report
      1999 Annual Research Report
  • [Publications] Akira Hirabayashi: "A family of projection learnings and projection learning of the minimum variance type"Proceedings of the 1999 IEICE General Conference. 6. 32-32 (1999)

    • Related Report
      1999 Annual Research Report
  • [Publications] Masashi Sugiyama: "Active learning for trigonometric polynomial neural networks"Proceedings of the 1999 IEICE General Conference. 6. 33-33 (1999)

    • Related Report
      1999 Annual Research Report
  • [Publications] Hidekazu Iwaki: "Restoration of the optimal generalization capability of neural networks with a single disconnection fault"Proceedings of the 1999 IEICE General Conference. 6. 34-34 (1999)

    • Related Report
      1999 Annual Research Report
  • [Publications] Akiko Nakashima: "Optimally regularized learning"Proceedings of the 9th Annual Conference of the Japanese Neural Network Society. 157-158 (1999)

    • Related Report
      1999 Annual Research Report
  • [Publications] Masashi Sugiyama: "Selection of subspace models"Proceedings of the 9th Annual Conference of the Japanese Neural Network Society. 175-176 (1999)

    • Related Report
      1999 Annual Research Report
  • [Publications] S.A.Solla et al. (Eds.): "Advances in Neural Information Processing Systems 12"MIT Press. 1070 (2000)

    • Related Report
      1999 Annual Research Report


Published: 1999-04-01   Modified: 2016-04-21  
