
Learning and understanding based on neural network trees

Research Project

Project/Area Number 17500148
Research Category

Grant-in-Aid for Scientific Research (C)

Allocation Type Single-year Grants
Section General
Research Field Sensitivity informatics/Soft computing
Research Institution University of Aizu

Principal Investigator

ZHAO Qiangfu  University of Aizu, School of Computer Science and Engineering, Professor (90260421)

Co-Investigator(Kenkyū-buntansha) LIU Yong  University of Aizu, School of Computer Science and Engineering, Associate Professor (60325967)
Project Period (FY) 2005 – 2006
Project Status Completed (Fiscal Year 2006)
Budget Amount
¥2,400,000 (Direct Cost: ¥2,400,000)
Fiscal Year 2006: ¥700,000 (Direct Cost: ¥700,000)
Fiscal Year 2005: ¥1,700,000 (Direct Cost: ¥1,700,000)
Keywords Machine learning / Pattern recognition / Learning and understanding / Neural network / Nearest neighbor classifier / Neural network tree / Multivariate decision tree / Decision tree
Research Abstract

In machine learning there are roughly two types of models, namely symbolic and non-symbolic. The former is comprehensible but not good at learning in changing environments. The latter, on the other hand, is good at learning, but the learned results are not comprehensible. The purpose of this research is to propose a model that can learn and understand simultaneously, by combining neural networks (NNs) and decision trees (DTs). The learning model used in this research is called the neural network tree (NNTree). The main problems in using NNTrees are that the induction cost is high and the results are not comprehensible.
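To make the NNTree idea concrete, here is an illustrative sketch (not the authors' implementation): a decision tree whose internal nodes route data with a small neural network instead of a single-attribute test. Each internal node below is reduced to a one-neuron linear unit, and the class names, weights, and thresholds are invented for the example.

```python
# Illustrative NNTree sketch: internal nodes hold a tiny "network"
# (here a single linear unit) that decides which branch a sample takes;
# leaves store class labels.  All weights below are hand-picked.
import numpy as np

class Leaf:
    def __init__(self, label):
        self.label = label
    def predict(self, x):
        return self.label

class NNNode:
    """Internal node: the unit w.x + b >= 0 selects the branch."""
    def __init__(self, w, b, left, right):
        self.w, self.b = np.asarray(w, float), b
        self.left, self.right = left, right
    def predict(self, x):
        branch = self.right if self.w @ np.asarray(x, float) + self.b >= 0 else self.left
        return branch.predict(x)

# A hand-built NNTree solving XOR, which no single univariate split can:
root = NNNode([1, 1], -0.5,                            # x1 + x2 >= 0.5 ?
              Leaf(0),                                 # (0,0) -> class 0
              NNNode([1, 1], -1.5, Leaf(1), Leaf(0)))  # (1,1) -> class 0

print([root.predict(p) for p in [(0, 0), (0, 1), (1, 0), (1, 1)]])
# -> [0, 1, 1, 0]
```

Because each test is a multivariate function of the inputs, such trees can stay much smaller than univariate decision trees, which is the trade-off the abstract describes: better learning, worse comprehensibility.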
In the first year of this project, we proposed a heuristic method for defining the teacher signals for the data assigned to an internal node of the tree. Based on this method, the NNs in the internal nodes can be trained using supervised learning rather than evolutionary learning, as we did before. This greatly reduces the cost of induction. To increase the comprehensibility of the NNTrees, we proposed to use a nearest neighbor classifier (NNC) in each internal node instead of an NN. We call this model the NNC-Tree. In fact, an NNC can provide very comprehensible decision rules if we consider each prototype as a "precedent". To design the NNCs in the internal nodes, we can use the R4-rule proposed by Zhao earlier. Experimental results show that the proposed method can induce accurate, compact, and comprehensible NNC-Trees.
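The routing step of an NNC-Tree node can be sketched as follows. This is a minimal illustration of the "precedent" idea only, with made-up prototypes and labels; it does not implement the R4-rule used in the actual work to place the prototypes.

```python
# NNC-Tree node sketch: each internal node stores a few labeled
# prototypes ("precedents"), and a sample follows the branch of the
# nearest prototype -- i.e. "do what the most similar precedent did".
import numpy as np

class NNCNode:
    def __init__(self, prototypes, branches):
        self.prototypes = np.asarray(prototypes, float)  # one per branch
        self.branches = branches                         # subtrees or labels
    def predict(self, x):
        d = np.linalg.norm(self.prototypes - np.asarray(x, float), axis=1)
        child = self.branches[int(np.argmin(d))]
        return child.predict(x) if isinstance(child, NNCNode) else child

# Two invented precedents standing for two outcomes:
node = NNCNode(prototypes=[[0.0, 0.0], [1.0, 1.0]], branches=["A", "B"])
print(node.predict([0.2, 0.1]))  # nearest precedent is (0,0) -> "A"
print(node.predict([0.9, 0.8]))  # nearest precedent is (1,1) -> "B"
```

The comprehensibility claim rests on this structure: each decision can be explained by pointing at the concrete precedent that was matched, rather than at an opaque weight vector.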
In the second year, we proposed two methods for reducing the induction cost. The first method is the "attentional learning method", and the second is the "dimensionality reduction method". In the first method, we pay more attention to difficult data and skip easy data during the R4-rule based learning. This reduces the cost of NNC-Tree induction by more than 80%. In the second method, we reduce the dimensionality of the problem using principal component analysis. If the dimensionality of the original problem space is very high (e.g., in image recognition), this method can also reduce the cost greatly.
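The second cost-reduction idea can be sketched with plain NumPy: project high-dimensional data onto its leading principal components before inducing the tree. The 95%-variance threshold and the synthetic data below are illustrative choices, not values from the report.

```python
# PCA via SVD: keep only enough principal components to explain a given
# fraction of the variance, so tree induction runs in far fewer dimensions.
import numpy as np

def pca_reduce(X, var_kept=0.95):
    Xc = X - X.mean(axis=0)                        # center the data
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    var = (s ** 2) / (s ** 2).sum()                # variance explained per PC
    k = int(np.searchsorted(np.cumsum(var), var_kept)) + 1
    return Xc @ Vt[:k].T                           # k-dimensional projection

rng = np.random.default_rng(0)
# 100 samples in 50-D, but the signal lives in a 3-D subspace plus noise:
X = rng.normal(size=(100, 3)) @ rng.normal(size=(3, 50)) \
    + 0.01 * rng.normal(size=(100, 50))
Z = pca_reduce(X)
print(X.shape, "->", Z.shape)  # dimensionality drops sharply
```

Since the cost of training each internal-node classifier grows with the input dimensionality, shrinking 50 features to a handful before induction is where the reported savings come from.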
In the future, we would like to apply NNC-Trees to solve problems with incomplete data (data with missing attributes). We would also like to consider the importance and costs of the attributes during induction of the tree. Further, we would like to visualize the induction results and try to make the NNC-Tree more comprehensible.

Report

(3 results)
  • 2006 Annual Research Report   Final Research Report Summary
  • 2005 Annual Research Report
  • Research Products

    (25 results)


Journal Article (23 results)   Patent (Industrial Property Rights) (2 results)

  • [Journal Article] Inducing NNC-Trees with the R4-rule2006

    • Author(s)
      Q.F.Zhao
    • Journal Title

      IEEE Trans. on Systems, Man, and Cybernetics - Part B 36, 3

      Pages: 520-533

    • Description
From the Summary of the Final Research Report (Japanese version)
    • Related Report
      2006 Final Research Report Summary
  • [Journal Article] Inducing NNC-Trees quickly2006

    • Author(s)
      Q.F.Zhao
    • Journal Title

Proc. IEEE International Conference on Systems, Man, and Cybernetics (SMC06)

      Pages: 2784-2789

    • Description
From the Summary of the Final Research Report (Japanese version)
    • Related Report
      2006 Final Research Report Summary
  • [Journal Article] Evolving NNTrees more efficiently2006

    • Author(s)
      H.Hayashi, Q.F.Zhao
    • Journal Title

      Proc. IEEE International Congress on Evolutionary Computation (CEC06)

      Pages: 2638-2643

    • Description
From the Summary of the Final Research Report (Japanese version)
    • Related Report
      2006 Final Research Report Summary
  • [Journal Article] Inducing NNC-Trees with the R4-rule2006

    • Author(s)
      Q.F.Zhao
    • Journal Title

      IEEE Trans. on Systems, Man, and Cybernetics-Part B Vol.36, No.3

      Pages: 520-533

    • Description
From the Summary of the Final Research Report (English version)
    • Related Report
      2006 Final Research Report Summary
  • [Journal Article] Inducing NNC-Trees quickly2006

    • Author(s)
      Q.F.Zhao
    • Journal Title

      Proc.IEEE International Conference on Systems, Man and Cybernetics

      Pages: 2784-2789

    • Description
From the Summary of the Final Research Report (English version)
    • Related Report
      2006 Final Research Report Summary
  • [Journal Article] Evolving NN Trees more efficiently2006

    • Author(s)
      H.Hayashi, Q.F.Zhao
    • Journal Title

      Proc.IEEE International Congress on Evolutionary Computation, Vancouver, BC, Canada

      Pages: 2638-2643

    • Description
From the Summary of the Final Research Report (English version)
    • Related Report
      2006 Final Research Report Summary
  • [Journal Article] Inducing NNC-Trees with the R4-rule2006

    • Author(s)
      Qiangfu Zhao
    • Journal Title

      IEEE Trans. on Systems, Man, and Cybernetics - Part B 36, 3

      Pages: 520-533

    • Related Report
      2006 Annual Research Report
  • [Journal Article] Inducing NNC-Trees quickly2006

    • Author(s)
      Qiangfu Zhao
    • Journal Title

      Proc. IEEE International Conference on Systems, Man, and Cybernetics (SMC06)

      Pages: 2784-2789

    • Related Report
      2006 Annual Research Report
  • [Journal Article] Evolving NNTrees more efficiently2006

    • Author(s)
      Hirotomo Hayashi, Qiangfu Zhao
    • Journal Title

      Proc. IEEE International Congress on Evolutionary Computation (CEC06)

      Pages: 2638-2643

    • Related Report
      2006 Annual Research Report
  • [Journal Article] Inducing NNC-Trees with R4-rule2006

    • Author(s)
      Qiangfu ZHAO
    • Journal Title

IEEE Trans. on SMC-B Vol.36, No.3 (to appear)

    • Related Report
      2005 Annual Research Report
  • [Journal Article] Incremental relearning with neural network trees2005

    • Author(s)
      T.Takeda, Q.F.Zhao, Y.Liu
    • Journal Title

      Neural, Parallel and Scientific Computations 13, 3/4

      Pages: 287-296

    • Description
From the Summary of the Final Research Report (Japanese version)
    • Related Report
      2006 Final Research Report Summary
  • [Journal Article] Learning with data streams - an NNTree based approach2005

    • Author(s)
      Q.F.Zhao
    • Journal Title

      Lecture Notes in Computer Science, Springer (Proc. International Symposium on Ubiquitous Intelligence and Smart World) 3823

      Pages: 519-528

    • Description
From the Summary of the Final Research Report (Japanese version)
    • Related Report
      2006 Final Research Report Summary
  • [Journal Article] A comparative study on GA based and BP based induction of neural network trees2005

    • Author(s)
      T.Hayashi, Q.F.Zhao
    • Journal Title

      Proc. IEEE International Conference on Systems, Man, and Cybernetics (SMC05)

      Pages: 822-826

    • Description
From the Summary of the Final Research Report (Japanese version)
    • Related Report
      2006 Final Research Report Summary
  • [Journal Article] Inducing multivariate decision trees with the R4-rule2005

    • Author(s)
      T.Kawatsure, Q.F.Zhao
    • Journal Title

      Proc. IEEE International Conference on Systems, Man, and Cybernetics (SMC05)

      Pages: 3593-3598

    • Description
From the Summary of the Final Research Report (Japanese version)
    • Related Report
      2006 Final Research Report Summary
  • [Journal Article] Lifetime learning with neural network trees2005

    • Author(s)
      Q.F.Zhao
    • Journal Title

Proc. 8th International Conference on Pattern Recognition and Information Processing (PRIP05), Keynote Speech

      Pages: 359-362

    • Description
From the Summary of the Final Research Report (Japanese version)
    • Related Report
      2006 Final Research Report Summary
  • [Journal Article] Incremental learning with neural network trees2005

    • Author(s)
      T.Takeda, Q.F.Zhao, Y.Liu
    • Journal Title

      Neural, Parallel and Scientific Computations Vol.13, No. 3/4

      Pages: 287-296

    • Description
From the Summary of the Final Research Report (English version)
    • Related Report
      2006 Final Research Report Summary
  • [Journal Article] Learning with data streams-an NNTree based approach2005

    • Author(s)
      Q.F.Zhao
    • Journal Title

      Proc.2nd International Symposium on Ubiquitous Intelligence and Smart Worlds, Nagasaki, (Lecture Notes in Computer Science 3823, Springer).

      Pages: 519-528

    • Description
From the Summary of the Final Research Report (English version)
    • Related Report
      2006 Final Research Report Summary
  • [Journal Article] A comparative study on GA based and BP based induction of neural network trees2005

    • Author(s)
      H.Hayashi, Q.F.Zhao
    • Journal Title

      Proc.IEEE International Conference on Systems, Man and Cybernetics, Waikoloa, Hawaii

      Pages: 822-826

    • Description
From the Summary of the Final Research Report (English version)
    • Related Report
      2006 Final Research Report Summary
  • [Journal Article] Inducing multivariate decision trees with the R4-rule2005

    • Author(s)
      T.Kawatsure, Q.F.Zhao
    • Journal Title

      Proc.IEEE International Conference on Systems, Man and Cybernetics, Waikoloa, Hawaii

      Pages: 3593-3598

    • Description
From the Summary of the Final Research Report (English version)
    • Related Report
      2006 Final Research Report Summary
  • [Journal Article] Lifetime learning with neural network trees,2005

    • Author(s)
      Q.F.Zhao
    • Journal Title

      Proc.8th International Conference on Pattern Recognition and Information Processing, Keynote Speech

      Pages: 359-362

    • Description
From the Summary of the Final Research Report (English version)
    • Related Report
      2006 Final Research Report Summary
  • [Journal Article] Incremental Learning with the Neural Network Trees2005

    • Author(s)
      T.Takeda, Q.F.Zhao, Y.Liu
    • Journal Title

      Neural, Parallel and Scientific Computations Vol.13, No.3-4

      Pages: 287-296

    • Related Report
      2005 Annual Research Report
  • [Journal Article] A comparative study on GA based and BP based induction of neural network trees2005

    • Author(s)
      H.Hayashi, Q.F.Zhao
    • Journal Title

      Proc.IEEE International Conference on Systems, Man and Cybernetics

      Pages: 822-826

    • Related Report
      2005 Annual Research Report
  • [Journal Article] Inducing multivariate decision trees with the R4-rule2005

    • Author(s)
      T.Kawatsure, Q.F.Zhao
    • Journal Title

      Proc.IEEE International Conference on Systems, Man and Cybernetics

      Pages: 3593-3598

    • Related Report
      2005 Annual Research Report
  • [Patent(Industrial Property Rights)] Multivariate decision tree construction system, multivariate decision tree construction method, and program for constructing multivariate decision trees2006

    • Inventor(s)
      ZHAO Qiangfu
    • Industrial Property Rights Holder
      University of Aizu
    • Industrial Property Number
      2006-034343
    • Filing Date
      2006-02-10
    • Description
From the Summary of the Final Research Report (Japanese version)
    • Related Report
      2006 Annual Research Report 2006 Final Research Report Summary
  • [Patent(Industrial Property Rights)] Multivariate test function generation apparatus, multivariate test function generation system, multivariate test function generation method, and program for generating multivariate test functions2006

    • Inventor(s)
      ZHAO Qiangfu
    • Industrial Property Rights Holder
      University of Aizu
    • Industrial Property Number
      2006-034344
    • Filing Date
      2006-02-10
    • Description
From the Summary of the Final Research Report (Japanese version)
    • Related Report
      2006 Annual Research Report 2006 Final Research Report Summary


Published: 2005-04-01   Modified: 2021-04-07  
