
Abstracts

Recorded video

Functional Analytic Learning Unit https://aip.riken.jp/labs/generic_tech/funct_anl_learn/

Speaker 1: Minh Ha Quang
Title: Geometry of Gaussian measures, covariance operators, and Gaussian processes – from information geometry to optimal transport
Abstract: Gaussian measures, Gaussian processes, and covariance matrices and operators play important roles in many areas of mathematics and statistics, with numerous applications in science and engineering, including machine learning, brain imaging, and computer vision. Much recent research success has come from exploiting the intrinsic non-Euclidean geometrical structures associated with Gaussian measures and covariance matrices. In this talk, we first survey some recent developments in generalizing the geometrical structures of covariance matrices and finite-dimensional Gaussian measures to the setting of infinite-dimensional covariance operators and Gaussian processes. These include the Fisher-Rao distance from information geometry, the Log-Hilbert-Schmidt distance, and Log-Determinant divergences, including the Kullback-Leibler divergence. In the second half of the talk, we focus on the geometrical structures arising from the theory of optimal transport (OT), in particular the entropic regularization formulation. We discuss recent results on the entropic regularization of OT for Gaussian measures on Euclidean space and present its generalization to infinite-dimensional Gaussian measures on Hilbert space and Gaussian processes. In these settings, the regularized Wasserstein distances and divergences admit closed-form expressions and satisfy many favorable theoretical properties compared with the exact distance. In the RKHS setting, the regularized distances and divergences are expressed explicitly in terms of Gram matrices and interpolate between the Maximum Mean Discrepancy (MMD) and the kernelized Wasserstein distance, with the Sinkhorn divergence enjoying a dimension-independent finite-sample complexity similar to that of the MMD. The mathematical formulations will be accompanied by numerical experiments in computer vision.
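
As a concrete reference point for the quantities above, the sketch below computes the exact (unregularized) 2-Wasserstein distance between two finite-dimensional Gaussians via the well-known Bures-Wasserstein closed form; the entropically regularized distances discussed in the talk generalize this quantity. This is a minimal NumPy/SciPy illustration, not code from the talk, and the function name is ours.

import numpy as np
from scipy.linalg import sqrtm

def gaussian_w2(m0, S0, m1, S1):
    # Closed form: W2^2 = |m0 - m1|^2 + tr(S0) + tr(S1)
    #                     - 2 tr((S0^{1/2} S1 S0^{1/2})^{1/2})
    r = sqrtm(S0)
    cross = sqrtm(r @ S1 @ r)
    w2sq = (np.sum((m0 - m1) ** 2) + np.trace(S0) + np.trace(S1)
            - 2.0 * np.trace(np.real(cross)))  # sqrtm can leave tiny imaginary noise
    return np.sqrt(max(w2sq, 0.0))

# Example: two random 3-dimensional Gaussians.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3)); S0 = A @ A.T + np.eye(3)
B = rng.standard_normal((3, 3)); S1 = B @ B.T + np.eye(3)
print(gaussian_w2(np.zeros(3), S0, np.ones(3), S1))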

Speaker 2: Jean Carlo Guella
Title: Characteristic kernels on metric spaces of negative type

Abstract: Two-sample tests based on the Maximum Mean Discrepancy (MMD) are a widely used tool in machine learning. However, their definition depends on a positive definite characteristic kernel, of which only a few examples are known, mostly on Euclidean spaces or spheres. In this talk, I will present some new examples of characteristic kernels on metric spaces of negative type, with an emphasis on hyperbolic spaces.
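
For concreteness: on a metric space of negative type, the distance-induced kernel k(x, y) = (d(x, o) + d(y, o) - d(x, y)) / 2, for a fixed base point o, is positive definite by a classical result of Schoenberg, and the resulting MMD recovers the energy distance up to a constant factor. The sketch below is our illustration of that construction, not the speaker's code; it uses the Euclidean metric, which is of negative type, and a hyperbolic example would only swap the distance function.

import numpy as np
from scipy.spatial.distance import cdist

def distance_kernel(X, Y, o, metric="euclidean"):
    # k(x, y) = (d(x, o) + d(y, o) - d(x, y)) / 2, positive definite
    # whenever the metric d is of negative type (Schoenberg).
    dxo = cdist(X, o[None, :], metric=metric)[:, 0]
    dyo = cdist(Y, o[None, :], metric=metric)[:, 0]
    return 0.5 * (dxo[:, None] + dyo[None, :] - cdist(X, Y, metric=metric))

def mmd2_unbiased(Kxx, Kyy, Kxy):
    # Unbiased estimate of MMD^2 from the three Gram blocks.
    n, m = Kxx.shape[0], Kyy.shape[0]
    tx = (Kxx.sum() - np.trace(Kxx)) / (n * (n - 1))
    ty = (Kyy.sum() - np.trace(Kyy)) / (m * (m - 1))
    return tx + ty - 2.0 * Kxy.mean()

# Example: samples from two different Gaussians in R^2.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 2))
Y = rng.standard_normal((200, 2)) + 1.0
o = np.zeros(2)
print(mmd2_unbiased(distance_kernel(X, X, o), distance_kernel(Y, Y, o),
                    distance_kernel(X, Y, o)))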

Speaker 3: Hoang Nguyen
Title: Functions on Graphs with Lovász Vectors
Abstract: We study the graph classification problem from the graph homomorphism perspective. We consider homomorphisms from F to G, where G is a graph of interest (e.g., molecules or social networks) and F belongs to some family of graphs (e.g., paths or non-isomorphic trees). We show that graph homomorphism numbers provide natural embedding maps, invariant both under isomorphism and under F, which can be used for graph classification. Viewing the expressive power of a graph classifier through the concept of F-indistinguishability, we prove a universality property of graph homomorphism vectors in approximating F-invariant functions. In practice, by choosing F whose elements have bounded treewidth, we show that the homomorphism method is efficient compared with other methods.
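
For intuition, the simplest instance of this embedding takes F to be the family of paths: hom(P_{t+1}, G) counts the walks of length t in G and equals 1^T A^t 1 for the adjacency matrix A of G, and paths have treewidth 1, so these coordinates are cheap to compute. The sketch below is our illustration of that special case, not code from the paper.

import numpy as np

def path_hom_vector(A, max_steps):
    # hom(P_{t+1}, G) = number of length-t walks in G = 1^T A^t 1,
    # for t = 0, ..., max_steps (P_k denotes the path on k vertices).
    ones = np.ones(A.shape[0])
    vec, power = [], np.eye(A.shape[0])
    for _ in range(max_steps + 1):
        vec.append(float(ones @ power @ ones))
        power = power @ A
    return np.array(vec)

# Example: the triangle and the path on 3 vertices yield different
# vectors, so this embedding distinguishes them.
triangle = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]], dtype=float)
path3 = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
print(path_hom_vector(triangle, 3))  # [3, 6, 12, 24]
print(path_hom_vector(path3, 3))     # [3, 4, 6, 8]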

Details

Date & time 2021/01/27 (Wed) 15:00 - 17:00
URL https://c5dc59ed978213830355fc8978.doorkeeper.jp/events/113722

Related laboratories

Functional Analytic Learning Unit