OVERVIEW
This talk will be held in a hybrid format: in person at the AIP Open Space of RIKEN AIP (Nihombashi office) and online via Zoom. AIP Open Space: *available only to AIP researchers.
DATE, TIME & LOCATION
Monday, May 19th, 2025, 11:30 – 13:00, RIKEN AIP Nihombashi Office, Open Space
TITLE
Computational Gradient Flows: The Analysis of Minimizing Relative Entropy for Inference, Sampling, and Optimization
BIO
Jia-Jie Zhu (https://jj-zhu.github.io/) is a machine learner, applied mathematician, and research group leader at the Weierstrass Institute, Berlin. Previously, he worked as a postdoctoral researcher in machine learning at the Max Planck Institute for Intelligent Systems, Tübingen, and received his Ph.D. training in optimization at the University of Florida, USA. He is interested in the intersection of machine learning, analysis, and optimization, on topics such as (PDE) gradient flows of probability measures, optimal transport, and the robustness of learning and optimization algorithms.
ABSTRACT
Many problems in machine learning can be framed as optimization problems that minimize the Kullback–Leibler (KL) divergence between two probability measures. Such problems appear in sampling, variational inference, generative modeling, and reinforcement learning, among others. In this talk, I will focus on the computational aspects of KL minimization, building upon recent advances in the mathematical foundations of optimal transport and PDE analysis, specifically the Hellinger-Kantorovich (a.k.a. Wasserstein-Fisher-Rao) gradient flows and the associated functional inequalities. I will then present concrete computational algorithms derived from WFR gradient flows, with applications to sampling and inference. The analysis also showcases a deeper understanding of the connection between Fisher-Rao gradient flows and kernel methods for machine learning.
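As background for the abstract above, the standard formulations it refers to (not taken from the talk itself) can be sketched as follows, writing $\pi$ for the target measure and $\mu_t$ for the evolving measure:

```latex
% KL-minimization over probability measures, with target \pi:
\min_{\mu \in \mathcal{P}(\mathbb{R}^d)} \; \mathrm{KL}(\mu \,\|\, \pi)
  = \int \log\frac{\mathrm{d}\mu}{\mathrm{d}\pi}\,\mathrm{d}\mu .

% Wasserstein gradient flow of the KL divergence (transport part):
\partial_t \mu_t = \nabla \cdot \Big( \mu_t \, \nabla \log\frac{\mu_t}{\pi} \Big) .

% Fisher-Rao gradient flow of the KL divergence (reaction, or birth-death, part):
\partial_t \mu_t = -\,\mu_t \Big( \log\frac{\mu_t}{\pi}
  - \mathbb{E}_{\mu_t}\Big[\log\frac{\mu_t}{\pi}\Big] \Big) .

% The Wasserstein-Fisher-Rao (Hellinger-Kantorovich) gradient flow
% combines the transport and reaction terms above.
```

Discretizing the transport part in particles recovers Langevin-type sampling dynamics, while the reaction part reweights or duplicates/kills particles; the talk's algorithms build on the combined flow.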
DETAILS
Date & Time | Monday, 2025/05/19, 11:30 – 13:00 |
URL | https://c5dc59ed978213830355fc8978.doorkeeper.jp/events/184460 |