Mathematical Science Team (https://aip.riken.jp/labs/generic_tech/math_sci/?lang=en) at RIKEN AIP
Speaker 1: (5 min) Kenichi Bannai
Title: Overview of the Mathematical Science Team
The Mathematical Science Team consists of pure mathematicians and theoretical physicists who aim to attack mathematical problems arising in artificial intelligence and machine learning. In this talk, we briefly give an overview of the Mathematical Science Team.
Speaker 2: (20 min) Koichi Tojo
Title: A method to construct exponential families by representation theory
Exponential families play an important role in information geometry and statistics. By definition, there are infinitely many exponential families, but only a small subset of them is widely used. We want to give a framework that deals with these “good” families systematically. Motivated by the observation that the sample spaces of most of them are homogeneous spaces of certain Lie groups, we proposed a method to construct exponential families on homogeneous spaces by taking advantage of representation theory. This method generates widely used exponential families such as the normal, gamma, Bernoulli, categorical, Wishart, von Mises-Fisher, and hyperboloid distributions. In this talk, we will explain the method and its properties.
K. Tojo and T. Yoshino, A method to construct exponential families by representation theory, arXiv:1811.01394v3.
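As a minimal illustration of the exponential-family form the abstract refers to (not the representation-theoretic construction of the talk), the Bernoulli distribution can be written as p(x | theta) = exp(theta T(x) - psi(theta)) with sufficient statistic T(x) = x, natural parameter theta = logit(p), and log-partition function psi(theta) = log(1 + e^theta). The following sketch is illustrative; all names are our own:

```python
import numpy as np

# Bernoulli distribution in exponential-family form:
#   p(x | theta) = exp(theta * T(x) - psi(theta)),  x in {0, 1},
# with T(x) = x and psi(theta) = log(1 + exp(theta)).

def log_partition(theta):
    """Log-partition (cumulant) function psi of the Bernoulli family."""
    return np.log1p(np.exp(theta))

def bernoulli_density(x, theta):
    """Density written in exponential-family form."""
    return np.exp(theta * x - log_partition(theta))

# Natural parameter corresponding to success probability p = 0.3.
p = 0.3
theta = np.log(p / (1 - p))

# The exponential-family form reproduces the usual Bernoulli probabilities.
print(bernoulli_density(1, theta))  # ~0.3
print(bernoulli_density(0, theta))  # ~0.7
```

The same pattern (sufficient statistic, natural parameter, log-partition function) covers the other families named in the abstract; the talk's contribution is a systematic way to produce such families on homogeneous spaces.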
Speaker 3: (20 min) Tomotaka Kuwahara
Title: Information-theoretic structure of quantum Boltzmann distribution
The quantum Boltzmann distribution (or quantum Gibbs state) plays a crucial role in quantum algorithms as well as in quantum machine learning. In general, efficiently simulating the quantum Boltzmann distribution is a computationally hard problem. Nevertheless, the quantum Boltzmann distribution exhibits very simple and universal structures from the viewpoint of information theory. In this talk, I will present several of them, in particular the sample complexity of quantum Hamiltonian learning based on the quantum Boltzmann distribution.
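For readers unfamiliar with the object of the talk: the quantum Gibbs state of a Hamiltonian H at inverse temperature beta is rho = exp(-beta H) / Tr[exp(-beta H)]. The following toy computation (a single-qubit example of our own, not material from the talk) constructs it via an eigendecomposition:

```python
import numpy as np

# Toy sketch: build the quantum Gibbs state rho = e^{-beta H} / Z
# of a Hermitian Hamiltonian H via its eigendecomposition.

def gibbs_state(H, beta):
    """Return rho = exp(-beta H) / Tr[exp(-beta H)] for Hermitian H."""
    evals, evecs = np.linalg.eigh(H)
    weights = np.exp(-beta * evals)   # Boltzmann weights on eigenstates
    weights /= weights.sum()          # normalize by the partition function Z
    # Reassemble rho = V diag(weights) V^dagger.
    return (evecs * weights) @ evecs.conj().T

# Single-qubit example: H = Pauli-Z.
H = np.array([[1.0, 0.0], [0.0, -1.0]])
rho = gibbs_state(H, beta=1.0)

print(np.trace(rho))             # ~1: a valid density matrix
print(np.linalg.eigvalsh(rho))   # nonnegative eigenvalues
```

Note the contrast with the abstract: for a 2x2 matrix this is trivial, but the matrix dimension grows exponentially with the number of qubits, which is why efficient simulation is hard in general.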
Speaker 4: (20 min) Masahiro Ikeda
Title: Operator-theoretic approach for time-series data generated by nonlinear dynamical system
Constructing a metric between two series of data is a fundamental problem in machine learning and pattern recognition. In the first study, joint with Ishikawa, Fujii, Hashimoto, and Kawahara, we construct a metric between two time series generated by a “nonlinear” dynamical system.
Given a dynamical system, the Perron-Frobenius operator is defined as a linear operator on a function space. Using this operator on a reproducing kernel Hilbert space (RKHS), we construct the metric, which generalizes several previous results. Moreover, we illustrate our metric empirically with an example of rotation dynamics on the unit disk in the complex plane, and evaluate its performance on real-world time-series data (NeurIPS 2018). If time permits, I will introduce an extension of this result to nonlinear dynamical systems with randomness.
From these studies, a significant mathematical problem arises: the boundedness of the operator on the RKHS associated with the Gaussian kernel, which is widely used in practice. I will introduce our recent result on this boundedness, obtained with Ishikawa and Sawano (arXiv:1911.11992).
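The operator-theoretic viewpoint above can be sketched numerically in a much simpler setting: estimate a transfer (Koopman-type) operator from a trajectory by least squares on a finite dictionary of observables, as in extended dynamic mode decomposition. This is a toy stand-in, not the authors' RKHS construction; the example system and all names are illustrative:

```python
import numpy as np

# Toy sketch: estimate a finite-dimensional approximation of the
# transfer (Koopman-type) operator of a dynamical system from one
# trajectory, by least squares on a dictionary of observables.

def estimate_operator(xs, dictionary):
    """Least-squares K such that dictionary(x_{t+1}) ~ K @ dictionary(x_t)."""
    Psi0 = np.array([dictionary(x) for x in xs[:-1]])  # observables at time t
    Psi1 = np.array([dictionary(x) for x in xs[1:]])   # observables at time t+1
    K, *_ = np.linalg.lstsq(Psi0, Psi1, rcond=None)
    return K.T

# Linear test system x_{t+1} = 0.9 * x_t: with the identity dictionary,
# the estimated operator should recover the multiplier 0.9.
xs = [1.0]
for _ in range(50):
    xs.append(0.9 * xs[-1])

K = estimate_operator(np.array(xs), lambda x: np.array([x]))
print(K)  # ~[[0.9]]
```

The point of the RKHS setting in the talk is that a rich (infinite-dimensional) feature space such as the one induced by the Gaussian kernel can capture genuinely nonlinear dynamics, which is exactly where the boundedness question for the operator becomes delicate.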
Speaker 5: (20 min) Akiyoshi Sannai
Title: Deep learning with symmetry
Symmetry is a fundamental concept in mathematics and physics. In machine learning, many tasks exhibit symmetry, such as redundancy in labeling point clouds and graphs, and the rotational invariance of objects. To handle such tasks, deep models that take symmetry into account have been proposed (NeurIPS 2017, ICML 2020, etc.). In this talk, I will review the construction of symmetric deep models by Zaheer, Maron, and Yarotsky, and explain the mathematical mechanism for deriving the universal approximation theorem.
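The simplest of the symmetric architectures mentioned above is the Deep Sets pattern f(X) = rho(sum_i phi(x_i)): because the elements are combined by a symmetric sum, the output cannot depend on their ordering. A minimal sketch with random, untrained weights (our own illustration, not code from the talk) makes the invariance concrete:

```python
import numpy as np

# Minimal Deep Sets-style permutation-invariant model:
#   f(X) = rho( sum_i phi(x_i) ).
# Weights are random and untrained; this only demonstrates the symmetry.

rng = np.random.default_rng(0)
W_phi = rng.normal(size=(3, 4))   # per-element map phi
W_rho = rng.normal(size=(4, 1))   # set-level map rho

def deep_sets(X):
    """X: (n, 3) array of set elements; output is permutation invariant."""
    phi = np.tanh(X @ W_phi)        # apply phi to each element independently
    pooled = phi.sum(axis=0)        # symmetric (sum) pooling over the set
    return np.tanh(pooled @ W_rho)  # rho on the pooled representation

X = rng.normal(size=(5, 3))
perm = rng.permutation(5)
print(np.allclose(deep_sets(X), deep_sets(X[perm])))  # True: order-independent
```

The universal approximation question the talk addresses is the converse direction: whether every continuous permutation-invariant function can be approximated by models of this restricted form.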
Date: March 24, 2021 (Wed) 15:00 - 17:00