Speaker: Jiaxin Shi (post-doctoral researcher at Microsoft Research New England)
Title: Sampling with Mirrored Stein Operators
Accurately approximating an unnormalized distribution with a discrete sample is a fundamental challenge in machine learning and probabilistic inference. Particle evolution methods like Stein variational gradient descent tackle this challenge by applying deterministic updates to particles, using operators based on Stein’s method and reproducing kernels to sequentially minimize Kullback-Leibler divergence. However, these methods break down for constrained targets and fail to exploit informative non-Euclidean geometry. In this talk, I will introduce a new family of particle evolution samplers suitable for constrained domains and non-Euclidean geometries. These samplers are derived from a new class of Stein operators and have deep connections with Riemannian Langevin diffusion, mirror descent, and natural gradient descent. We demonstrate that these new samplers yield accurate approximations to distributions on the simplex, deliver valid confidence intervals in post-selection inference, and converge more rapidly than prior methods in large-scale unconstrained posterior inference. Finally, we establish the convergence of our new procedures under verifiable conditions on the target distribution.
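For context, the baseline the talk builds on, Stein variational gradient descent (SVGD), updates each particle with a kernelized Stein direction: an attraction term driven by the score of the target plus a kernel-gradient repulsion term that keeps particles spread out. A minimal sketch of one standard SVGD step (not the mirrored operators of the talk; the RBF bandwidth `h` and step size `eps` are illustrative choices):

```python
import numpy as np

def svgd_step(X, score, eps=0.1, h=0.5):
    """One SVGD update for particles X of shape (n, d).

    score(X) returns the gradient of log p at each particle.
    Uses an RBF kernel k(x, y) = exp(-||x - y||^2 / (2 h^2)).
    """
    n = X.shape[0]
    diff = X[:, None, :] - X[None, :, :]                 # x_j - x_i, shape (n, n, d)
    K = np.exp(-np.sum(diff ** 2, axis=-1) / (2 * h ** 2))  # K[j, i] = k(x_j, x_i)
    grad_K = -diff * K[:, :, None] / h ** 2              # grad wrt x_j of k(x_j, x_i)
    # phi(x_i) = (1/n) sum_j [ k(x_j, x_i) * score(x_j) + grad_{x_j} k(x_j, x_i) ]
    phi = (K.T @ score(X) + grad_K.sum(axis=0)) / n
    return X + eps * phi

# Toy run: drive particles toward a standard normal (score = -x).
X = np.linspace(2.0, 4.0, 50)[:, None]
for _ in range(2000):
    X = svgd_step(X, lambda X: -X)
```

The deterministic update makes plain why unconstrained Euclidean geometry is baked in: nothing stops `X + eps * phi` from leaving a constrained domain such as the simplex, which is the failure the mirrored Stein operators address.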
Jiaxin Shi is a postdoctoral researcher in the Machine Learning and Statistics group at Microsoft Research New England. His research is focused on developing theory and scalable algorithms to answer challenging inferential questions that arise in probabilistic machine learning. He received his Ph.D. (2020) and B.E. (2015) in Computer Science and Technology from Tsinghua University, advised by Jun Zhu. During his Ph.D. he received a Microsoft Research fellowship in the Asia-Pacific region and won a best student paper runner-up award at the 2nd Symposium on Advances in Approximate Bayesian Inference.
Date: August 30, 2021 (Mon) 12:00 - 13:00