
Yian Ma: Bridging MCMC and Optimization


In this talk, I will discuss three ingredients of optimization theory in the context of MCMC: non-convexity, acceleration, and stochasticity.

I will focus on a class of non-convex objective functions arising from mixture models. For that class of objective functions, I will demonstrate that the computational complexity of a simple MCMC algorithm scales linearly with the model dimension, whereas the corresponding optimization problems are NP-hard.
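As a concrete illustration of sampling from a mixture-model objective (a hedged sketch, not the specific algorithm analyzed in the talk), the following runs an unadjusted Langevin sampler on a symmetric two-component Gaussian mixture in d dimensions; the component means ±mu, the step size, and the iteration count are all illustrative assumptions:

```python
import numpy as np

def grad_U(x, mu):
    """Gradient of U(x) = -log(0.5 N(x; mu, I) + 0.5 N(x; -mu, I))."""
    # Posterior responsibility of the +mu component, computed stably:
    # w = exp(-||x-mu||^2/2) / (exp(-||x-mu||^2/2) + exp(-||x+mu||^2/2))
    delta = 0.5 * (np.sum((x + mu) ** 2) - np.sum((x - mu) ** 2))
    w = 1.0 / (1.0 + np.exp(-delta))
    return w * (x - mu) + (1 - w) * (x + mu)

def langevin_sampler(d=10, n_steps=5000, step=0.05, seed=0):
    rng = np.random.default_rng(seed)
    mu = np.ones(d)                      # illustrative component means
    x = rng.standard_normal(d)
    samples = np.empty((n_steps, d))
    for t in range(n_steps):
        # Unadjusted Langevin update: gradient step plus Gaussian noise
        x = x - step * grad_U(x, mu) + np.sqrt(2 * step) * rng.standard_normal(d)
        samples[t] = x
    return samples

samples = langevin_sampler()
```

The per-iteration cost here is a single gradient evaluation, which is linear in the dimension d; the claim in the talk concerns how the number of iterations needed also scales with dimension for this class of targets.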

I will then study MCMC algorithms as optimization over the KL-divergence in the space of measures. I will discuss an algorithm that incorporates a momentum variable and thereby performs "accelerated gradient descent" over the KL-divergence. Using ideas from optimization, I will construct a suitable Lyapunov function to prove that the algorithm attains an accelerated convergence rate.
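A standard example of a momentum-augmented sampler of this flavor is underdamped Langevin dynamics, which pairs the position with a velocity variable. The sketch below (an assumption for illustration, not the talk's exact scheme) discretizes it with a simple Euler step for a standard Gaussian target U(x) = ||x||²/2; the friction gamma and step size are made-up values:

```python
import numpy as np

def underdamped_langevin(d=2, n_steps=20000, step=0.01, gamma=2.0, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(d)
    v = np.zeros(d)                      # momentum / velocity variable
    samples = np.empty((n_steps, d))
    for t in range(n_steps):
        grad = x                         # grad U for the standard Gaussian
        # Velocity update: friction, gradient force, and injected noise
        v = v - step * (gamma * v + grad) \
            + np.sqrt(2 * gamma * step) * rng.standard_normal(d)
        x = x + step * v                 # position follows the velocity
        samples[t] = x
    return samples

samples = underdamped_langevin()
```

Compared with the overdamped dynamics, the velocity variable lets the chain retain direction across steps, which is the mechanism behind the "acceleration" analogy.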

Finally, I will present a general recipe for constructing stochastic gradient MCMC algorithms that translates the task of finding a valid sampler into one of choosing two matrices. I will then describe how stochastic gradient MCMC algorithms can be applied to problems involving temporally dependent data, where the challenge is to break the temporal dependencies when forming minibatches of observations.
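The simplest instance of such a two-matrix recipe (choosing an identity diffusion matrix and a zero curl matrix) recovers stochastic gradient Langevin dynamics (SGLD). The sketch below runs SGLD for the posterior mean of a 1-D Gaussian model with an N(0, 1) prior; the data model, batch size, and step size are illustrative assumptions, and the data here are i.i.d. rather than the temporally dependent case discussed in the talk:

```python
import numpy as np

def sgld(data, n_steps=2000, batch_size=10, step=1e-3, seed=0):
    rng = np.random.default_rng(seed)
    n = len(data)
    theta = 0.0                          # mean parameter, prior N(0, 1)
    trace = np.empty(n_steps)
    for t in range(n_steps):
        batch = rng.choice(data, size=batch_size, replace=False)
        # Unbiased stochastic estimate of the gradient of the
        # negative log posterior: prior term + rescaled minibatch term
        grad = theta + (n / batch_size) * np.sum(theta - batch)
        theta = theta - step * grad + np.sqrt(2 * step) * rng.standard_normal()
        trace[t] = theta
    return trace

data = np.random.default_rng(1).normal(loc=2.0, scale=1.0, size=100)
trace = sgld(data)
```

Rescaling the minibatch gradient by n / batch_size keeps the stochastic gradient unbiased under i.i.d. subsampling; it is exactly this unbiasedness that temporal dependence breaks, motivating the modified minibatching schemes mentioned above.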
