Prof. Marco Cuturi (http://www.marcocuturi.net/) from ENSAE / CREST (France) will be visiting AIP on April 17 and 18, 2017. He will be available for discussion, so please let me know if you would like to meet him (email me at firstname.lastname@example.org).
We will also have a follow-up talk by Prof. Shunichi Amari on “Wasserstein Statistics”.
Prof. Marco Cuturi ===========================================================
Learning with regularized distances: the case of optimal transport and dynamic time warping.
I will present two different yet related discrepancy functions between structured data types: the dynamic time warping (DTW) divergence for time series, and optimal transport distances (a.k.a. Wasserstein or EMD) for probability measures. Both are defined as the result of an optimization problem: a dynamic program for DTW and a network flow problem for EMD. I will show in this talk that both optimization problems can be regularized, and that such regularization has several benefits. For EMD, an entropic regularization results in algorithms that are orders of magnitude faster. For both EMD and DTW, the regularized versions yield divergences that are differentiable, and whose gradients can be efficiently approximated (EMD) or exactly computed (DTW) using automatic differentiation techniques. We will show how this can be used to compute barycenters of histograms, point clouds, and time series; to carry out more advanced unsupervised learning such as dictionary learning or dimensionality reduction; and to learn structured output inference tools using these differentiable divergences as learning losses.
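The entropic regularization of EMD mentioned above is typically solved with Sinkhorn matrix-scaling iterations. The following is a minimal NumPy sketch of that scheme; the toy histograms, grid, and parameter values (`eps`, `n_iter`) are illustrative assumptions and not taken from the talk.

```python
import numpy as np

def sinkhorn(a, b, C, eps=0.1, n_iter=500):
    """Entropy-regularized optimal transport via Sinkhorn iterations (sketch).

    a, b : source/target histograms (nonnegative, summing to 1)
    C    : pairwise cost matrix between support points
    eps  : entropic regularization strength (assumed value, for illustration)
    """
    K = np.exp(-C / eps)                 # Gibbs kernel
    v = np.ones_like(b)
    for _ in range(n_iter):
        u = a / (K @ v)                  # rescale rows toward marginal a
        v = b / (K.T @ u)                # rescale columns toward marginal b
    P = u[:, None] * K * v[None, :]      # regularized transport plan
    return np.sum(P * C), P              # approximate transport cost

# Toy example: two Gaussian-shaped histograms on a 1-D grid.
x = np.linspace(0.0, 1.0, 50)
a = np.exp(-(x - 0.2) ** 2 / 0.01); a /= a.sum()
b = np.exp(-(x - 0.7) ** 2 / 0.01); b /= b.sum()
C = (x[:, None] - x[None, :]) ** 2       # squared-distance ground cost
cost, P = sinkhorn(a, b, C, eps=0.1)
```

Each iteration is just two matrix-vector products, which is what makes the regularized problem orders of magnitude faster than exact network-flow solvers; because every step is differentiable, the resulting cost can also serve as a loss inside automatic differentiation frameworks, as the abstract describes.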
Prof. Shunichi Amari ===========================================================
Divergences which connect the entropy-relaxed Wasserstein distance and the KL-divergence: Preliminary studies
I am deeply inspired by Cuturi’s work on the entropy-relaxed W distance. The present talk presents informal and preliminary, still half-baked ideas introducing a new type of divergence that connects the W-distance and the KL-divergence. The talk includes the possibility of constructing a unification of W-statistics and KL-based likelihood statistics.
Date & Time: April 18, 2017 (Tue), 11:00 - 13:00