November 18, 2021 13:16
Talk by Dr. Ilsang Ohn, a postdoctoral researcher at the University of Notre Dame, on "Adaptive variational Bayes: Optimality, computation and applications"

Description

Speaker: Ilsang Ohn https://sites.google.com/view/iohn

Hosted by Pierre Alquier, Approximate Bayesian Inference Team

Title: Adaptive variational Bayes: Optimality, computation and applications

Abstract: In this work, we explore adaptive inference based on variational Bayes. We propose a novel variational Bayes framework, called adaptive variational Bayes, which can operate on a collection of models with varying dimensions and structures. The proposed framework combines variational posteriors over individual models with certain weights to obtain a variational posterior over the entire model. It turns out that this combined variational posterior minimizes the Kullback-Leibler divergence to the original posterior distribution. We show that the proposed variational posterior achieves optimal contraction rates adaptively under very general conditions. We apply the general results obtained for adaptive variational Bayes to several examples, including deep learning models, and derive new adaptive inference results. Moreover, we consider the use of quasi-likelihood in our framework. We formulate conditions on the quasi-likelihood to ensure adaptive optimality and discuss specific applications to stochastic block models and nonparametric regression with sub-Gaussian errors.
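The weighted combination of per-model variational posteriors described in the abstract can be sketched numerically. This is a minimal illustration, not the paper's implementation: it assumes the combined model weights are proportional to each model's prior mass times the exponential of its attained ELBO (a common construction for mixing variational posteriors), and the function name, ELBO values, and prior are all hypothetical.

```python
import numpy as np

def combine_variational_posteriors(elbos, log_prior):
    """Illustrative sketch: weights for mixing per-model variational posteriors.

    Assumes weight_k is proportional to prior(k) * exp(ELBO_k),
    computed stably in log space via a max shift (log-sum-exp trick).
    """
    log_w = np.asarray(elbos, dtype=float) + np.asarray(log_prior, dtype=float)
    log_w -= np.max(log_w)          # shift for numerical stability
    w = np.exp(log_w)
    return w / w.sum()              # normalize to a probability vector

# Hypothetical example: three candidate models of increasing dimension.
elbos = [-120.0, -100.0, -101.0]     # made-up per-model ELBO values
log_prior = np.log([0.5, 0.3, 0.2])  # made-up prior mass on each model
weights = combine_variational_posteriors(elbos, log_prior)
```

The combined variational posterior would then be the mixture of the individual variational posteriors with these weights; models whose variational bound is far below the best receive exponentially small mass.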

Last updated on September 18, 2024 09:15