
2017 Fiscal Year Research-status Report

High-Order Deep Learning Models: Theoretical Study and Applications

Research Project

Project/Area Number 17K00326
Research Institution Institute of Physical and Chemical Research

Principal Investigator

ZHAO QIBIN  RIKEN (Institute of Physical and Chemical Research), Center for Advanced Intelligence Project, Unit Leader (30599618)

Co-Investigator (Kenkyū-buntansha) CAO Jianting  Saitama Institute of Technology, Faculty of Engineering, Professor (20306989)
Project Period (FY) 2017-04-01 – 2020-03-31
Keywords Tensor decomposition / Deep neural network / PU learning / GAN
Outline of Annual Research Achievements

We study tensor-based deep learning models and algorithms. In traditional deep learning methods, each layer is treated as a vector and the connection between layers as a matrix. However, real-world data are usually represented as higher-order tensors. To this end, we formulate a deep learning framework in which each layer is a tensor and the connections between layers are multilinear operations based on multiple matrices.
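As an illustration of such a multilinear connection, the following NumPy sketch (our own minimal example, not the project's actual implementation; all names and sizes are chosen for illustration) maps an input tensor to an output tensor via one factor matrix per mode (an n-mode product) instead of a single large weight matrix:

```python
import numpy as np

def mode_product(tensor, matrix, mode):
    """n-mode product: multiply `tensor` by `matrix` along axis `mode`."""
    # Move the target mode to the front, apply the matrix, move it back.
    t = np.moveaxis(tensor, mode, 0)
    shape = t.shape
    t = matrix @ t.reshape(shape[0], -1)
    t = t.reshape((matrix.shape[0],) + shape[1:])
    return np.moveaxis(t, 0, mode)

# A "tensor layer": input of shape (4, 5, 6) mapped to output of shape (2, 3, 7)
# by three small factor matrices, one per mode.
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 5, 6))
W1 = rng.standard_normal((2, 4))
W2 = rng.standard_normal((3, 5))
W3 = rng.standard_normal((7, 6))

Y = mode_product(mode_product(mode_product(X, W1, 0), W2, 1), W3, 2)
print(Y.shape)  # (2, 3, 7)

# Parameter comparison with a dense layer mapping the flattened 120 inputs
# to the flattened 42 outputs:
dense_params = 4 * 5 * 6 * 2 * 3 * 7          # 5040
multilinear_params = 2 * 4 + 3 * 5 + 7 * 6    # 65
```

The multilinear parameterization stores a small matrix per mode, so its parameter count grows additively in the mode sizes rather than multiplicatively.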
We developed a new tensor-based generative adversarial network that uses tensors as input and output; its fully connected layers are modeled by multilinear products along each tensor mode. The experimental results show that our method can alleviate the mode-collapse problem of GANs.
We studied the combination of multiple GANs to address the important positive-unlabeled (PU) learning problem. The objective is to learn two generators simultaneously using three discriminators, so that each generator captures the distribution of one class.
We introduced a new type of tensor decomposition model, called tensor ring decomposition. We studied its theoretical foundations and mathematical properties, developed several algorithms to solve the model, and finally applied it to represent fully connected weight parameters, yielding a significant reduction in model complexity.
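A tensor ring (TR) decomposition represents a d-way tensor by d third-order cores whose first and last rank dimensions close into a ring; each entry is the trace of a product of core slices. The following NumPy sketch (our own illustrative reconstruction routine, with sizes chosen arbitrarily) rebuilds the full tensor from its cores:

```python
import numpy as np

def tr_to_tensor(cores):
    """Reconstruct a full tensor from tensor-ring (TR) cores.

    Core k has shape (r_k, n_k, r_{k+1}), with r_d == r_0 so the ring closes.
    Entry T[i1,...,id] = trace(G1[:, i1, :] @ G2[:, i2, :] @ ... @ Gd[:, id, :]).
    """
    result = cores[0]                               # shape (r0, n1, r1)
    for core in cores[1:]:
        # Contract the trailing rank index with the next core's leading rank index.
        result = np.tensordot(result, core, axes=([-1], [0]))
    # result has shape (r0, n1, ..., nd, r0); trace over r0 closes the ring.
    return np.trace(result, axis1=0, axis2=-1)

rng = np.random.default_rng(1)
ranks = [2, 3, 4, 2]                                # ranks[0] == ranks[-1]
dims = [5, 6, 7]
cores = [rng.standard_normal((ranks[k], dims[k], ranks[k + 1]))
         for k in range(len(dims))]

T = tr_to_tensor(cores)
print(T.shape)  # (5, 6, 7)
```

Unlike the tensor-train format, the ring structure places no boundary cores with rank 1, which is one reason TR can be more flexible at the same storage cost.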

Current Status of Research Progress

2: Research has progressed on the whole more than it was originally planned.

Reason

The project is progressing smoothly.

Strategy for Future Research Activity

In the next step, we will investigate how the newly proposed tensor ring decomposition can be applied to deep learning models.

1. We will investigate the low-rank representation ability of tensor ring decomposition by applying it to CNNs, LSTMs, and RNNs. The goal is to reduce model complexity while keeping the same generalization ability.

2. We will study more efficient tensor networks and the corresponding algorithms for modeling the unknown variables in general machine learning methods. This will allow us to develop new machine learning methods with high computational efficiency and compact model complexity.
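To give a rough sense of the compression a TR-factored fully connected layer can offer, the back-of-the-envelope sketch below compares parameter counts (the layer size, mode factorization, and rank are hypothetical choices for illustration only, not figures from the project):

```python
import numpy as np

# Hypothetical fully connected layer with a 4096 x 4096 weight matrix.
# Factor both the input and output dimensions as 16 * 16 * 16 and store the
# weight as three TR cores of shape (r, i_k * o_k, r) with uniform rank r.
in_modes = [16, 16, 16]
out_modes = [16, 16, 16]
r = 8

dense_params = int(np.prod(in_modes)) * int(np.prod(out_modes))  # 4096 * 4096
tr_params = sum(r * (i * o) * r for i, o in zip(in_modes, out_modes))

print(dense_params, tr_params)  # 16777216 49152
```

Under these assumed sizes the TR format stores roughly 340x fewer parameters than the dense matrix; the achievable rank (and hence compression) in practice depends on the accuracy one is willing to trade.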

Causes of Carryover

Some conferences will be held in the next fiscal year; we plan to use the carried-over funds for travel to attend them.

  • Research Products

    (8 results)


All Journal Article (3 results) (of which Int'l Joint Research: 3 results,  Peer Reviewed: 3 results) Presentation (5 results) (of which Int'l Joint Research: 5 results)

  • [Journal Article] Correlated Component Analysis for Enhancing the Performance of SSVEP-Based Brain-Computer Interface2018

    • Author(s)
      Zhang Yangsong、Guo Daqing、Li Fali、Yin Erwei、Zhang Yu、Li Peiyang、Zhao Qibin、Tanaka Toshihisa、Yao Dezhong、Xu Peng
    • Journal Title

      IEEE Transactions on Neural Systems and Rehabilitation Engineering

      Volume: 26 Pages: 948~956

    • DOI

      10.1109/TNSRE.2018.2826541

    • Peer Reviewed / Int'l Joint Research
  • [Journal Article] Completion of High Order Tensor Data with Missing Entries via Tensor-Train Decomposition2017

    • Author(s)
      Yuan Longhao、Zhao Qibin、Cao Jianting
    • Journal Title

      Lecture Notes in Computer Science

      Volume: 10634 Pages: 222~229

    • DOI

10.1007/978-3-319-70087-8_24

    • Peer Reviewed / Int'l Joint Research
  • [Journal Article] Feature Extraction for Incomplete Data via Low-rank Tucker Decomposition2017

    • Author(s)
      Shi Qiquan、Cheung Yiu-ming、Zhao Qibin
    • Journal Title

      Lecture Notes in Computer Science

      Volume: 10534 Pages: 564~581

    • DOI

10.1007/978-3-319-71249-9_34

    • Peer Reviewed / Int'l Joint Research
  • [Presentation] HIGH-ORDER TENSOR COMPLETION FOR DATA RECOVERY VIA SPARSE TENSOR-TRAIN OPTIMIZATION2018

    • Author(s)
      Longhao Yuan, Qibin Zhao, Jianting Cao
    • Organizer
      2018 IEEE International Conference on Acoustics, Speech and Signal Processing
    • Int'l Joint Research
  • [Presentation] Generative Adversarial Positive-Unlabelled Learning2018

    • Author(s)
      Ming Hou, Brahim Chaib-draa, Qibin Zhao
    • Organizer
      27th International Joint Conference on Artificial Intelligence and the 23rd European Conference on Artificial Intelligence
    • Int'l Joint Research
  • [Presentation] TGAN: TENSORIZING GENERATIVE ADVERSARIAL NETS2018

    • Author(s)
      Xingwei Cao, Xuyang Zhao, Qibin Zhao
    • Organizer
      The Third International Conference On Consumer Electronics (ICCE) Asia
    • Int'l Joint Research
  • [Presentation] A Hybrid Brain Computer Interface Based on Audiovisual Stimuli P3002018

    • Author(s)
Xuyang Zhao, Gaochao Cui, Longhao Yuan, Toshihisa Tanaka, Qibin Zhao, Jianting Cao
    • Organizer
      The Third International Conference On Consumer Electronics (ICCE) Asia
    • Int'l Joint Research
  • [Presentation] Brain Image Completion by Bayesian Tensor Decomposition2017

    • Author(s)
      Lihua Gui, Qibin Zhao, Jianting Cao
    • Organizer
      2017 22nd International Conference on Digital Signal Processing (DSP)
    • Int'l Joint Research


Published: 2018-12-17  
