ABI Team Seminar 20250519 talk by Jia-Jie Zhu: Computational Gradient Flows

Description

DATE & TIME: Monday, May 19, 2025, 11:30 – 13:00
LOCATION: RIKEN AIP Nihonbashi Office, Open Space

TITLE:
Computational Gradient Flows: The Analysis of Minimizing Relative Entropy for Inference, Sampling, and Optimization

BIO:
Jia-Jie Zhu is a machine learner, applied mathematician, and research group leader at the Weierstrass Institute, Berlin. Previously, he worked as a postdoctoral researcher in machine learning at the Max Planck Institute for Intelligent Systems, Tübingen, and received his Ph.D. training in optimization at the University of Florida, USA. He is interested in the intersection of machine learning, analysis, and optimization, on topics such as (PDE) gradient flows of probability measures, optimal transport, and the robustness of learning and optimization algorithms.

ABSTRACT:
Many problems in machine learning can be framed as optimization problems that minimize the Kullback–Leibler (KL) divergence between two probability measures. Such problems appear in sampling, variational inference, generative modeling, and reinforcement learning, among others. In this talk, I will focus on the computational aspects of KL minimization, building upon recent advances in the mathematical foundations of optimal transport and PDE analysis, specifically the Hellinger–Kantorovich (a.k.a. Wasserstein–Fisher–Rao, WFR) gradient flows and the associated functional inequalities. I will then present concrete computational algorithms derived from WFR gradient flows, with applications to sampling and inference. The analysis also showcases a deeper understanding of the connection between Fisher–Rao gradient flows and kernel methods for machine learning.
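
As background for the abstract's central object: a standard fact (not specific to this talk) is that the Wasserstein-2 gradient flow of KL(rho || pi) is the Fokker–Planck equation, whose simplest particle discretization is the unadjusted Langevin algorithm. The Python sketch below illustrates only this textbook special case of KL minimization by gradient flow, not the WFR algorithms of the talk; the Gaussian-mixture target, step size, and particle count are illustrative assumptions.

import numpy as np

# Illustrative target: pi(x) = 0.5*N(-2,1) + 0.5*N(2,1); its score is closed form.
def grad_log_pi(x):
    d1 = np.exp(-0.5 * (x + 2.0) ** 2)
    d2 = np.exp(-0.5 * (x - 2.0) ** 2)
    return (-(x + 2.0) * d1 - (x - 2.0) * d2) / (d1 + d2)

rng = np.random.default_rng(0)
x = rng.normal(size=1000)   # particles approximating the evolving measure rho_k
h = 0.05                    # step size (an arbitrary illustrative choice)

for _ in range(2000):
    # Euler-Maruyama step of the Langevin SDE: the particle-level discretization
    # of the Fokker-Planck PDE, i.e. the Wasserstein-2 gradient flow of KL(rho || pi).
    x = x + h * grad_log_pi(x) + np.sqrt(2.0 * h) * rng.normal(size=x.shape)

# Up to discretization bias, the particles approximate pi:
# the mixture has mean 0 and standard deviation ~2.24.
print(x.mean(), x.std())

Roughly speaking, a mass-conserving flow like this only transports particles; the Fisher–Rao and WFR flows discussed in the talk additionally reweight or create and destroy mass, which is what distinguishes the algorithms presented there.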