2021 Fiscal Year Annual Research Report
Tensor Network Representation for Machine Learning: Theoretical Study and Algorithms Development
Project/Area Number | 20H04249 |
Research Institution | Institute of Physical and Chemical Research |
Principal Investigator |
ZHAO QIBIN  RIKEN, Center for Advanced Intelligence Project, Team Leader (30599618)
|
Co-Investigator (Kenkyū-buntansha) |
CAO Jianting  Saitama Institute of Technology, Faculty of Engineering, Professor (20306989)
YOKOTA Tatsuya  Nagoya Institute of Technology, Graduate School of Engineering, Associate Professor (80733964)
|
Project Period (FY) | 2020-04-01 – 2024-03-31 |
Keywords | tensor network |
Outline of Annual Research Achievements |
We have developed several new tensor network decomposition and completion algorithms, as well as tensor-network-based neural network models and learning algorithms. These methods have been applied to several computer vision tasks.
Specifically, we developed a tensorized RNN model that achieves long-term memory with a reduced model size; we also studied Bayesian latent factor models to understand how tensor networks achieve model compression. Our proposed tensor fusion layer improves performance on image denoising tasks and can also be applied to multimodal sentiment analysis. In addition, we developed an efficient algorithm for classification on incomplete data samples, which is practically useful when high-quality datasets are difficult to obtain.
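To illustrate the model-compression idea behind a tensorized RNN, the following is a minimal sketch (not the report's implementation) of an input-to-hidden weight matrix stored in tensor-train format; the mode shapes and TT-ranks are illustrative assumptions.

```python
import numpy as np

# Minimal sketch of a tensor-train (TT) parameterized input-to-hidden map,
# as used in tensorized RNNs.  All shapes and ranks below are illustrative.
in_modes  = (8, 8, 8, 8)      # input dim 4096 factorized into 4 modes
out_modes = (4, 4, 4, 4)      # hidden dim 256 factorized into 4 modes
ranks     = (1, 4, 4, 4, 1)   # TT-ranks r_0, ..., r_4

rng = np.random.default_rng(0)
# One 4-way core per mode pair, with shape (r_{k-1}, in_k, out_k, r_k).
cores = [0.1 * rng.standard_normal((ranks[k], in_modes[k], out_modes[k], ranks[k + 1]))
         for k in range(4)]

def tt_matvec(cores, x):
    """Apply the TT-represented 4096 -> 256 linear map to x without ever
    materializing the full weight matrix."""
    res = x.reshape((1,) + in_modes)           # (r_0=1, i_1, i_2, i_3, i_4)
    for G in cores:                            # G: (r_prev, i_k, o_k, r_next)
        # Contract the current rank bond and the current input mode.
        res = np.tensordot(G, res, axes=([0, 1], [0, 1]))
        # res: (o_k, r_next, remaining input modes..., collected output modes...)
        res = np.moveaxis(res, 0, -1)          # keep r_next in front, append o_k
    return res.reshape(-1)                     # flattened over (o_1, ..., o_4)

x = rng.standard_normal(4096)
h_pre = tt_matvec(cores, x)                    # hidden pre-activation, shape (256,)
tt_params = sum(G.size for G in cores)
print(h_pre.shape, tt_params, 4096 * 256)      # (256,) 1280 1048576
```

A recurrent step would then combine this pre-activation with the hidden-to-hidden weights in the usual way; the point of the sketch is that the 4096 x 256 matrix is replaced by about 1,300 core parameters.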
From a theoretical perspective, we studied the tensor nuclear norm and proposed several new definitions of tensor norms that come with exact recovery guarantees for tensors. In addition, we proposed a new type of tensor network, called the fully connected tensor network, which offers great flexibility in modeling complex interactions among tensor modes. The effectiveness of our theory and models is validated extensively on tensor completion tasks.
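As a minimal sketch of the fully connected tensor network idea for a third-order case (the factor layout, bond ranks, and contraction below are illustrative assumptions, not the proposed algorithm), each mode gets one factor and every pair of factors shares a bond index.

```python
import numpy as np

# Illustrative fully connected tensor network for a 3rd-order tensor:
# one factor per mode, and every pair of factors shares a bond index, so the
# topology is fully connected rather than a chain (tensor train) or a ring.
I1, I2, I3 = 10, 12, 14          # tensor dimensions (illustrative)
r12, r13, r23 = 3, 3, 3          # bond ranks between each pair of factors

rng = np.random.default_rng(1)
G1 = rng.standard_normal((I1, r12, r13))    # modes (i1, r12, r13)
G2 = rng.standard_normal((r12, I2, r23))    # modes (r12, i2, r23)
G3 = rng.standard_normal((r13, r23, I3))    # modes (r13, r23, i3)

# Contract all shared bonds to recover the dense tensor.
X = np.einsum('iab,ajc,bck->ijk', G1, G2, G3)
print(X.shape)                                       # (10, 12, 14)
print(G1.size + G2.size + G3.size, I1 * I2 * I3)     # 324 vs. 1680 entries
```

In a completion setting, such factors would be fit to the observed entries only; the factor parameter count grows with the pairwise bond ranks but can remain far below the size of the dense tensor.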
|
Current Status of Research Progress |
2: Research has progressed on the whole more than it was originally planned.
Reason
Some experimental evaluations have been postponed.
|
Strategy for Future Research Activity |
We will further study the representation ability of tensor networks when applied to neural networks.
|