Description
Tensor Learning Team (https://aip.riken.jp/labs/generic_tech/tensor_learn/?lang=en) at RIKEN AIP
Speaker 1 (15:00 – 15:45): Qibin Zhao
Title: Overview of Tensor Networks in Machine Learning
Abstract: In recent years, tensor networks (TNs) have been increasingly applied to machine learning and deep neural networks (DNNs). This talk will give an overview of recent progress in TN techniques for machine learning, covering several aspects, including TNs for data decomposition, model parameter representation, and function representation. Our team's research on this topic is driven by one question: can TNs be developed into a powerful ML model that offers new perspectives?
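For readers unfamiliar with TN decomposition, the following is a minimal, illustrative sketch of a tensor-train decomposition computed by sequential SVDs. It is a generic textbook construction with hypothetical shapes and truncation rank, not a method from the talk.

```python
# Minimal tensor-train (TT) decomposition sketch via sequential SVDs.
# Illustrative only: shapes and max_rank are hypothetical, not from the talk.
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((4, 5, 6, 7))   # a 4-way data tensor
max_rank = 3                            # TT-rank truncation

cores, rank = [], 1
mat = X.reshape(rank * X.shape[0], -1)
for n, dim in enumerate(X.shape[:-1]):
    U, S, Vt = np.linalg.svd(mat, full_matrices=False)
    r_new = min(max_rank, len(S))
    cores.append(U[:, :r_new].reshape(rank, dim, r_new))            # TT core n
    mat = (np.diag(S[:r_new]) @ Vt[:r_new]).reshape(r_new * X.shape[n + 1], -1)
    rank = r_new
cores.append(mat.reshape(rank, X.shape[-1], 1))                      # last core

# Contract the cores back to check the (truncated) reconstruction.
approx = cores[0]
for core in cores[1:]:
    approx = np.tensordot(approx, core, axes=([-1], [0]))
approx = approx.squeeze()
print([c.shape for c in cores])
print('relative error:', np.linalg.norm(X - approx) / np.linalg.norm(X))
```

For random full-rank data the truncation error is large; data with low TT-rank structure would compress with little loss.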
Speaker 2 (15:45 – 16:10): Cesar Caiafa
Title: Sparse Tensor Representations and Applications
Abstract: It has been demonstrated that sparse coding of natural data captures information efficiently (compression), providing a powerful linear model for signal processing and machine learning tasks. In this talk, I will present a generalization of sparse representations to multidimensional data (tensors). The Sparse Tucker (ST) model provides a computationally efficient tool for classical signal processing as well as for modeling diffusion-weighted Magnetic Resonance Images (dMRI), offering a valuable new tool for the construction and validation of macroscopic brain connectomes in neuroscience studies. These results were presented at NIPS 2017 and NeurIPS 2019.
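As a rough illustration of a Tucker model with a sparse core (not the ST algorithm from the cited papers), the sketch below builds a 3-way tensor from a sparse core and three factor matrices; all sizes are hypothetical.

```python
# Illustrative sparse-core Tucker model: X = G x_1 A x_2 B x_3 C,
# where most entries of the core G are zero. Not the authors' ST algorithm.
import numpy as np

I, J, K = 20, 20, 20        # data tensor dimensions (hypothetical)
R1, R2, R3 = 5, 5, 5        # core (multilinear rank) dimensions

rng = np.random.default_rng(0)
A = rng.standard_normal((I, R1))   # mode-1 factor
B = rng.standard_normal((J, R2))   # mode-2 factor
C = rng.standard_normal((K, R3))   # mode-3 factor

# Sparse core: keep only a few large interactions.
G = rng.standard_normal((R1, R2, R3))
G[np.abs(G) < 1.0] = 0.0

# Reconstruct X via the three mode products in one contraction.
X = np.einsum('pqr,ip,jq,kr->ijk', G, A, B, C)
print('tensor shape:', X.shape, '| core nonzeros:', np.count_nonzero(G))
```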
Speaker 3 (16:10 – 16:35): Chao Li
Title: Evolutionary Topology Search for Tensor Network Decomposition
Abstract: Tensor diagram notation is a simple yet powerful framework for rigorously formulating tensor network (TN) decomposition as a graph in a topological space. In this talk, we introduce a genetic algorithm (GA) to search for the (near-)optimal graphical structure of the TN decomposition of a given tensor, and use empirical results to demonstrate that the GA can tackle this search problem at an affordable computational cost. This work was presented at ICML 2020.
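The following is a minimal, generic genetic-algorithm sketch with a placeholder fitness function. It only illustrates the search loop (selection, crossover, mutation) over binary-encoded TN topologies and does not reproduce the paper's actual encoding or objective.

```python
# Generic GA loop over binary strings that stand in for TN edge selections.
# The fitness function is a placeholder, not the paper's objective.
import random

N_EDGES, POP, GENS = 10, 30, 50
random.seed(0)

def fitness(edges):
    # Placeholder: a real objective would trade reconstruction error
    # against the number of TN parameters implied by the topology.
    return -sum(edges)  # toy objective: prefer fewer edges

def crossover(a, b):
    cut = random.randrange(1, N_EDGES)
    return a[:cut] + b[cut:]

def mutate(ind, rate=0.1):
    return [1 - g if random.random() < rate else g for g in ind]

pop = [[random.randint(0, 1) for _ in range(N_EDGES)] for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:POP // 2]                      # selection: keep the fitter half
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP - len(parents))]
    pop = parents + children

best = max(pop, key=fitness)
print('best topology bits:', best, '| fitness:', fitness(best))
```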
Speaker 4 (16:35 – 17:00): Tatsuya Yokota
Title: Nonnegative Matrix Factorization in Application to Dynamic PET Image Reconstruction
Abstract: In this seminar, we will introduce research on applying non-negative matrix factorization (NMF) to the problem of dynamic PET image reconstruction. Noise-robust reconstruction was achieved by a method that combines non-negativity constraints, low-rank constraints, an image prior on the spatial basis, and a smoothness prior on the temporal basis. This result was presented at ICCV 2019.
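As background, the sketch below shows standard NMF with Lee-Seung multiplicative updates on a hypothetical voxels-by-time matrix. The talk's method additionally incorporates the low-rank and smoothness priors specific to dynamic PET, which are not reproduced here.

```python
# Standard NMF with multiplicative updates: V ~ W H, with W, H >= 0.
# Data sizes and the number of components are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
V = rng.random((64, 30))          # e.g. voxels x time frames, nonnegative data
r = 4                             # number of components
W = rng.random((64, r))           # spatial basis
H = rng.random((r, 30))           # temporal basis
eps = 1e-12                       # avoids division by zero

for _ in range(200):
    # Multiplicative updates keep W and H nonnegative by construction.
    H *= (W.T @ V) / (W.T @ W @ H + eps)
    W *= (V @ H.T) / (W @ H @ H.T + eps)

print('relative error:', np.linalg.norm(V - W @ H) / np.linalg.norm(V))
```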