Tensor Learning Team Seminar (Talk by Dr. Patrick Gelß, Zuse Institute Berlin).
Title: The tensor-train format and its applications
Speaker: Dr. Patrick Gelß, Zuse Institute Berlin, Germany.
The simulation and analysis of high-dimensional problems is often infeasible due to the curse of dimensionality. In this talk, we investigate the potential of tensor decompositions for mitigating this curse in systems from several application areas. Using tensor-based solvers, we directly compute numerical solutions of high-dimensional master equations and derive algorithms for the data-driven analysis of complex dynamical systems. Furthermore, we show that tensor decompositions can be used for supervised-learning problems such as system recovery and image classification. Since the main focus of our current research is on applications in quantum computing, we take a closer look at the corresponding framework for constructing and simulating quantum algorithms. The results show that the tensor-train format enables us to compute low-rank approximations for various numerical problems and to significantly reduce memory consumption and computational costs compared to classical approaches. We demonstrate that tensor decompositions are a powerful tool for solving high-dimensional problems across a wide range of application areas.
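To make the central object of the talk concrete: the tensor-train (TT) format represents a d-dimensional array as a chain of small three-dimensional cores, and a TT approximation can be computed by sequential truncated SVDs (the TT-SVD scheme). The following is a minimal NumPy sketch of that idea; the function names `tt_svd` and `tt_to_full` are illustrative, not taken from the speaker's software.

```python
import numpy as np

def tt_svd(tensor, max_rank):
    """Decompose a d-dimensional array into tensor-train cores
    via sequential truncated SVDs (a sketch of the TT-SVD scheme)."""
    dims = tensor.shape
    cores = []
    rank = 1
    unfolding = tensor
    for n in dims[:-1]:
        # Unfold: merge the incoming rank with the current mode.
        unfolding = unfolding.reshape(rank * n, -1)
        u, s, vt = np.linalg.svd(unfolding, full_matrices=False)
        new_rank = min(max_rank, len(s))
        # Keep the leading singular vectors as the next TT core.
        cores.append(u[:, :new_rank].reshape(rank, n, new_rank))
        # Push the remaining factor to the right.
        unfolding = s[:new_rank, None] * vt[:new_rank]
        rank = new_rank
    cores.append(unfolding.reshape(rank, dims[-1], 1))
    return cores

def tt_to_full(cores):
    """Contract the TT cores back into the full tensor."""
    result = cores[0]
    for core in cores[1:]:
        result = np.tensordot(result, core, axes=([-1], [0]))
    return result.squeeze(axis=(0, -1))
```

With `max_rank` large enough the reconstruction is exact; choosing a small `max_rank` yields the low-rank approximation whose storage grows linearly in the number of dimensions rather than exponentially, which is the memory saving the abstract refers to.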
Bio: I am a postdoc at the Zuse Institute Berlin, working in the department for AI in Society, Science, and Technology together with Sebastian Pokutta. I lead the research group on Quantum Computing & Optimization, which is engaged in the development and application of quantum optimization algorithms and tensor-based methods. I studied mathematics and physics and defended my PhD thesis in 2017 at the Freie Universität Berlin. Since then, I have worked on various projects concerning complex dynamical systems, transfer operator theory, and quantum information. My research focuses on tensor decompositions, aiming at the development and application of novel concepts and techniques for tasks involving high-dimensional state spaces.
Date: October 12, 2022 (Wed), 16:00 - 17:00