2020/8/13 14:59

Summary

This is an online seminar. Registration is required.
https://c5dc59ed978213830355fc8978.doorkeeper.jp/events/110505
We will send instructions for joining the online seminar after registration.

【Date】  2020-09-01 15:00-16:00
【Speaker】 Mr. Dmitry KOPITKOV
【Title】 General Probabilistic Surface Optimization

【Abstract】 Probabilistic inference, such as density (ratio) estimation, is a fundamental and highly important problem that needs to be solved in many domains, including robotics and computer science. Recently, much research has been devoted to solving it by designing various objective functions optimized over neural network (NN) models. Such Deep Learning (DL)-based approaches include unnormalized and energy models, as well as the critics of Generative Adversarial Networks, where DL has shown top approximation performance. In this research we contribute a novel algorithm family that generalizes all of the above and allows us to infer different statistical modalities (e.g., data likelihood and the ratio between densities) from data samples. The proposed unsupervised technique, named Probabilistic Surface Optimization (PSO), views a model as a flexible surface that is pushed by loss-specific virtual stochastic forces, with a dynamical equilibrium reached when the pointwise forces on the surface balance. Concretely, the surface is pushed up and down at points sampled from two different distributions, with the overall up and down forces being functions of these two densities and of force-intensity magnitudes defined by the loss of a particular PSO instance. Upon convergence, the force equilibrium associated with the Euler-Lagrange equation of the loss forces the optimized model to equal various statistical functions, such as the data density, depending on the magnitude functions used. Furthermore, this dynamical-statistical equilibrium is highly intuitive and useful, with many implications and possible uses in probabilistic inference. We connect PSO to numerous existing statistical works that are themselves PSO instances, and derive new PSO-based inference methods to demonstrate PSO's exceptional usability. Additionally, we investigate the impact of the Neural Tangent Kernel (NTK) on the PSO equilibrium. Our study of NTK dynamics during the learning process emphasizes the importance of adapting the model kernel to the specific target function for a good learning approximation.
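
For readers unfamiliar with the push-up/push-down picture, below is a minimal illustrative sketch (not the speaker's implementation) of one simple PSO-style instance: least-squares density-ratio estimation, written in PyTorch. The model f is pushed up with unit force at samples from one distribution and down with force f(x) at samples from the other, so the forces balance where f(x) equals the density ratio. The toy Gaussian data, network size, and hyperparameters are assumptions made purely for illustration.

# Sketch of a PSO-style instance: least-squares density-ratio estimation.
# "Up" samples push the surface f up with unit force; "down" samples push it
# down with force f(x). At equilibrium P_up(x) = P_down(x) * f(x),
# i.e. f(x) = P_up(x) / P_down(x).
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy 1-D data: "up" samples from N(0, 1), "down" samples from N(1, 1).
x_up = torch.randn(4096, 1)
x_down = torch.randn(4096, 1) + 1.0

# Flexible surface: a small fully connected network f: R -> R.
f = nn.Sequential(nn.Linear(1, 64), nn.Tanh(), nn.Linear(64, 1))
opt = torch.optim.Adam(f.parameters(), lr=1e-3)

for step in range(2000):
    # Loss: -E_up[f(x)] + E_down[f(x)^2 / 2].
    # Its pointwise gradient is -P_up(x) + P_down(x) * f(x), so minimizing it
    # drives f toward the density ratio P_up / P_down.
    loss = -f(x_up).mean() + 0.5 * (f(x_down) ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()

# At x = 0.5 the true ratio N(0,1)/N(1,1) equals 1, so f(0.5) should be near 1.
print(f(torch.tensor([[0.5]])).item())

Swapping the two magnitude functions (here the constants 1 and f(x)) yields other members of the family that converge to other statistical targets, such as the log-density; the talk covers the general construction.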

Details

Date/Time 2020/09/01 (Tue) 15:00 - 16:00
URL https://c5dc59ed978213830355fc8978.doorkeeper.jp/events/110505