This is an online event. Registration is required.
Speaker: Guillaume Rabusseau (Assistant Professor, MILA) (https://www.iro.umontreal.ca/~grabus/)
Title: Tensor networks for machine learning: from random projections to learning the structure of tensor networks
Abstract: In this talk, I will present two recent works that leverage the powerful formalism of tensor networks for machine learning problems.
In the first part of the talk, I will present a novel random projection technique for efficiently reducing the dimension of very high-dimensional tensors. Building upon classical results on Gaussian random projections and Johnson-Lindenstrauss transforms (JLT), I will present two tensorized random projection maps relying on the tensor train (TT) and CP decomposition formats, respectively. The two maps have very low memory requirements and can be applied efficiently when the inputs are low-rank tensors given in CP or TT format. I will also showcase theoretical and experimental results demonstrating that the TT format is substantially superior to CP in terms of the size of the random projection needed to achieve the same distortion ratio.
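To make the classical baseline concrete, here is a minimal sketch (my own illustration, not the speaker's tensorized maps) of a dense Gaussian Johnson-Lindenstrauss projection applied to flattened tensors; the TT/CP maps discussed in the talk aim for the same distance-preservation guarantee without materializing the full projection matrix. All sizes below are arbitrary choices for the example.

```python
import numpy as np

# Illustrative sketch only: a classical dense Gaussian JL projection,
# the baseline that tensorized (TT/CP) projection maps improve upon.
rng = np.random.default_rng(0)

d = 6 * 6 * 6   # ambient dimension of a flattened 6x6x6 tensor
k = 200         # reduced (target) dimension

# Entries drawn i.i.d. N(0, 1/k), so E[||P x||^2] = ||x||^2 for any fixed x.
P = rng.normal(0.0, 1.0 / np.sqrt(k), size=(k, d))

x = rng.normal(size=(6, 6, 6)).reshape(-1)
y = rng.normal(size=(6, 6, 6)).reshape(-1)

# With high probability the distortion ratio is close to 1.
ratio = np.linalg.norm(P @ (x - y)) / np.linalg.norm(x - y)
```

Note that storing `P` costs k × d floats, which becomes prohibitive for genuinely high-dimensional tensors; this is precisely the memory bottleneck the tensorized maps address.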
In the second part of the talk, I will present a simple greedy algorithm leveraging the tensor network formalism to develop a generic and efficient adaptive algorithm for tensor learning. This simple algorithm can adaptively learn the structure of a tensor network with a small number of parameters that effectively optimizes a given objective function from data. I will present experiments on tensor decomposition and tensor completion tasks with both synthetic and real-world data demonstrating the effectiveness of the proposed algorithm.
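To convey the flavor of greedy tensor learning, here is a minimal sketch (my own illustration under simplifying assumptions, not the algorithm presented in the talk): it greedily adds rank-one terms to a CP-style decomposition of a 3-way tensor and stops when a new term no longer improves the fit.

```python
import numpy as np

def fit_rank_one(T, n_iter=30, seed=0):
    """Fit a rank-one term a (x) b (x) c to T by alternating updates."""
    rng = np.random.default_rng(seed)
    b = rng.normal(size=T.shape[1]); b /= np.linalg.norm(b)
    c = rng.normal(size=T.shape[2]); c /= np.linalg.norm(c)
    for _ in range(n_iter):
        a = np.einsum('ijk,j,k->i', T, b, c)          # scale kept in a
        b = np.einsum('ijk,i,k->j', T, a, c); b /= np.linalg.norm(b)
        c = np.einsum('ijk,i,j->k', T, a, b); c /= np.linalg.norm(c)
    a = np.einsum('ijk,j,k->i', T, b, c)
    return a, b, c

def greedy_cp(T, max_rank=10, tol=1e-6):
    """Greedily add rank-one terms while they reduce the residual norm."""
    scale = np.linalg.norm(T)
    residual, factors, err = T.copy(), [], scale
    for r in range(max_rank):
        if err <= tol * scale:            # residual already negligible
            break
        a, b, c = fit_rank_one(residual, seed=r)
        term = np.einsum('i,j,k->ijk', a, b, c)
        new_err = np.linalg.norm(residual - term)
        if err - new_err <= tol * scale:  # no useful improvement: stop
            break
        residual = residual - term
        factors.append((a, b, c))
        err = new_err
    return factors

# Demo on an orthogonally decomposable rank-2 tensor (a favorable case:
# greedy rank-one deflation is not exact for general CP decompositions).
u = np.array([1.0, 0.0, 0.0, 0.0])
v = np.array([0.0, 1.0, 0.0, 0.0])
T = 3.0 * np.einsum('i,j,k->ijk', u, u, u) + 2.0 * np.einsum('i,j,k->ijk', v, v, v)
factors = greedy_cp(T)
recon = sum(np.einsum('i,j,k->ijk', a, b, c) for a, b, c in factors)
```

The key design point this sketch shares with adaptive structure learning is the stopping rule: model capacity (here, CP rank; in the talk, tensor-network structure) is grown only while it measurably improves the objective.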
October 27, 2020 (Tue) 10:00 - 11:00