Yian Ma: Bridging MCMC and Optimization
In this talk, I will discuss three ingredients of optimization theory in the context of MCMC: Non-convexity, Acceleration, and Stochasticity.
I will focus on a class of non-convex objective functions arising from mixture models. For that class of objective functions, I will demonstrate that the computational complexity of a simple MCMC algorithm scales linearly with the model dimension, while the corresponding optimization problem is NP-hard.
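As a concrete illustration of the kind of "simple MCMC algorithm" referred to above (the specific algorithm and target are my own choices, not taken from the talk), here is a minimal unadjusted Langevin sampler applied to a two-component Gaussian mixture, a toy instance of a multimodal, non-convex log-density:

```python
import numpy as np

def langevin_sample(grad_log_p, x0, step=1e-2, n_steps=5000, rng=None):
    """Unadjusted Langevin algorithm:
    x_{k+1} = x_k + step * grad log p(x_k) + sqrt(2 * step) * N(0, I)."""
    rng = np.random.default_rng(0) if rng is None else rng
    x = np.array(x0, dtype=float)
    samples = np.empty((n_steps, x.size))
    for k in range(n_steps):
        x = x + step * grad_log_p(x) + np.sqrt(2 * step) * rng.standard_normal(x.size)
        samples[k] = x
    return samples

def grad_log_p(x, mu=3.0):
    """Gradient of log p for p(x) ∝ 0.5 N(x; -mu 1, I) + 0.5 N(x; +mu 1, I),
    computed with a max-shift for numerical stability."""
    la = -0.5 * np.sum((x + mu) ** 2)
    lb = -0.5 * np.sum((x - mu) ** 2)
    m = max(la, lb)
    wa, wb = np.exp(la - m), np.exp(lb - m)
    return (-(x + mu) * wa - (x - mu) * wb) / (wa + wb)

samples = langevin_sample(grad_log_p, x0=np.zeros(2))
```

Each iteration costs one gradient evaluation, so per-step cost grows linearly in the dimension of `x`; by contrast, globally optimizing such a multimodal objective is the problem that becomes intractable.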
I will then study MCMC algorithms as optimization over the KL-divergence in the space of measures. By incorporating a momentum variable, I will discuss an algorithm that performs "accelerated gradient descent" over the KL-divergence. Using ideas from optimization, I will construct a suitable Lyapunov function to prove that the algorithm attains an accelerated convergence rate.
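A standard way to add such a momentum variable to a sampler is underdamped Langevin dynamics; the sketch below (an Euler discretization on a standard Gaussian target, my own illustrative choice rather than the scheme analyzed in the talk) shows the auxiliary velocity playing the role of momentum:

```python
import numpy as np

def underdamped_langevin(grad_log_p, x0, gamma=2.0, step=1e-2, n_steps=5000, rng=None):
    """Euler discretization of underdamped Langevin dynamics:
        dx = v dt
        dv = (grad log p(x) - gamma * v) dt + sqrt(2 * gamma) dW
    The velocity v is the momentum variable; gamma is the friction."""
    rng = np.random.default_rng(1) if rng is None else rng
    x = np.array(x0, dtype=float)
    v = np.zeros_like(x)
    samples = np.empty((n_steps, x.size))
    for k in range(n_steps):
        v = v + step * (grad_log_p(x) - gamma * v) \
              + np.sqrt(2 * gamma * step) * rng.standard_normal(x.size)
        x = x + step * v
        samples[k] = x
    return samples

# Standard Gaussian target: grad log p(x) = -x; start far from the mode.
samples = underdamped_langevin(lambda x: -x, x0=np.full(2, 5.0))
```

The (x, v) dynamics can be read as gradient descent with momentum on the log-density plus injected noise, which is the sense in which momentum "accelerates" the sampler.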
Finally, I will present a general recipe for constructing stochastic gradient MCMC algorithms that reduces the task of finding a valid sampler to choosing two matrices. I will then describe how stochastic gradient MCMC algorithms can be applied to problems involving temporally dependent data, where the challenge is to break the dependencies when forming minibatches of observations.
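The best-known member of this family is stochastic gradient Langevin dynamics (SGLD), which replaces the full-data gradient with an unbiased minibatch estimate; the toy example below (i.i.d. Gaussian data, my own illustrative setup, not the temporally dependent case discussed in the talk) shows the basic mechanics:

```python
import numpy as np

def sgld(grad_log_prior, grad_log_lik, data, theta0,
         step=1e-3, batch=32, n_steps=2000, rng=None):
    """Stochastic gradient Langevin dynamics: each step uses an unbiased
    minibatch estimate of the full-data gradient, rescaled by n / batch,
    plus Gaussian noise scaled to the step size."""
    rng = np.random.default_rng(2) if rng is None else rng
    theta = np.array(theta0, dtype=float)
    n = len(data)
    samples = np.empty((n_steps, theta.size))
    for k in range(n_steps):
        idx = rng.choice(n, size=batch, replace=False)  # i.i.d. minibatch
        g = grad_log_prior(theta) \
            + (n / batch) * sum(grad_log_lik(theta, data[i]) for i in idx)
        theta = theta + 0.5 * step * g + np.sqrt(step) * rng.standard_normal(theta.size)
        samples[k] = theta
    return samples

# Toy posterior: prior N(0, 10), likelihood N(x_i; theta, 1), 500 observations.
data = np.random.default_rng(3).normal(loc=1.0, scale=1.0, size=(500, 1))
samples = sgld(lambda t: -t / 10.0,          # gradient of the log-prior
               lambda t, x: x - t,           # per-datum gradient of the log-likelihood
               data, theta0=np.zeros(1))
```

The unbiasedness of the minibatch gradient rests on subsampling the observations i.i.d.; with temporally dependent data that step fails naively, which is exactly the difficulty the abstract points to.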
- Category: Applied Math and Analysis
- Duration: 01:24:48
- Date: September 4, 2019 at 11:55 AM
- Tags: seminar, Applied Math And Analysis Seminar