Quasi-Stationary Distributions (QSDs) describe the long-time behaviour of killed Markov processes. The Fleming-Viot particle system provides a particle representation for the QSD of a Markov process killed upon contact with the boundary of its domain. Whereas previous work has dealt with killed Markov processes, we consider killed McKean-Vlasov processes. We show that the Fleming-Viot particle system with McKean-Vlasov dynamics provides a particle representation for the corresponding QSDs. Joint work with James Nolen.
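The particle representation mentioned above can be illustrated in the simplest classical setting (plain Brownian motion killed at a boundary, without the McKean-Vlasov interaction of the talk). The sketch below, with hypothetical names, runs N particles in (0, 1); when a particle is killed at the boundary, it branches onto the position of a uniformly chosen surviving particle, and the empirical measure approximates the QSD.

```python
import random

def fleming_viot(n_particles=200, n_steps=2000, dt=1e-3, seed=0):
    """Minimal Fleming-Viot sketch for Brownian motion killed on
    the boundary of (0, 1).  Not the speaker's implementation; a toy
    illustration of the branching mechanism only."""
    rng = random.Random(seed)
    pos = [0.5] * n_particles          # all particles start in the interior
    for _ in range(n_steps):
        for i in range(n_particles):
            # Euler step of Brownian motion
            pos[i] += rng.gauss(0.0, dt ** 0.5)
            if pos[i] <= 0.0 or pos[i] >= 1.0:
                # killed: resample from a uniformly chosen other particle
                j = rng.randrange(n_particles - 1)
                if j >= i:
                    j += 1
                pos[i] = pos[j]
    return pos                          # empirical approximation of the QSD
```

For Brownian motion on (0, 1) the QSD has density proportional to sin(pi x), so the empirical distribution of the returned positions should concentrate symmetrically around 1/2.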
An emerging way to protect privacy is to replace true data by synthetic data. Medical records of artificial patients, for example, could retain meaningful statistical information while preserving privacy of the true patients. But what is synthetic data, and what is privacy? How do we define these concepts mathematically? Is it possible to make synthetic data that is both useful and private? I will tie these questions to a simple-looking problem in probability theory: how much information about a random vector X is lost when we take conditional expectation of X with respect to some sigma-algebra? This talk is based on a series of papers with March Boedihardjo and Thomas Strohmer.
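The question of how much information conditional expectation destroys can be seen already in a finite toy example (my illustration, not from the talk): averaging X over the cells of a partition preserves the mean but strictly contracts the variance, and the within-cell fluctuation of X is exactly what is lost.

```python
import statistics

# Toy example: X uniform on {1, 2, 3, 4}; the sigma-algebra G is
# generated by the partition {{1, 2}, {3, 4}}.
x = [1.0, 2.0, 3.0, 4.0]

# E[X | G] replaces X by its average over each cell of the partition.
cond_exp = [1.5, 1.5, 3.5, 3.5]

# Conditional expectation preserves the mean (tower property) ...
assert statistics.mean(cond_exp) == statistics.mean(x)

# ... but is a contraction in L^2: the within-cell variability of X
# is irrecoverably lost.
print(statistics.pvariance(x), statistics.pvariance(cond_exp))
```

Here the population variance drops from 1.25 to 1.0; the gap is the expected within-cell variance, a crude measure of the "information about X" given up in exchange for coarsening.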
I will introduce and discuss a canonical notion of Brownian motion in the random geometry of Liouville quantum gravity, called Liouville Brownian motion. I will explain the construction and discuss some of its basic properties, for instance related to its heat kernel and to the time spent in the thick points of the Gaussian Free Field. Time permitting I will also discuss a derivation of the KPZ formula based on the Liouville heat kernel (joint work with C. Garban, R. Rhodes and V. Vargas).

In this talk I will revisit the idea of viewing the Bayesian update as a variational problem. I will show how the variational interpretation is helpful in establishing the convergence of Bayesian models, and in defining and analysing diffusion processes that have the posterior as invariant measure. I will illustrate the former by proving a consistency result for graph-based Bayesian semi-supervised learning in the large unlabelled data-set regime, and the latter by suggesting new optimality criteria for the choice of metric in Riemannian MCMC.
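One standard way to state the variational interpretation (the notation here is mine, not necessarily the speaker's) is the Gibbs/Donsker-Varadhan variational principle: with prior \(\mu_0\), data \(y\), and negative log-likelihood \(\Phi(\cdot\,; y)\), the posterior is the unique minimiser of a free-energy functional,

```latex
\mu^{y} \;=\; \operatorname*{arg\,min}_{\nu \ll \mu_0}
  \left\{ \int \Phi(u; y)\,\nu(\mathrm{d}u)
        \;+\; D_{\mathrm{KL}}\!\left(\nu \,\middle\|\, \mu_0\right) \right\},
```

which trades expected data misfit against Kullback-Leibler divergence from the prior. Stability of this minimisation problem under perturbations of \(\Phi\) or \(\mu_0\) is one route to the convergence results mentioned above.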
Markov chain Monte Carlo methods based on non-reversible piecewise deterministic Markov processes (PDMPs) have recently attracted growing attention, thanks to the performance gains they typically bring. Beyond their numerical efficiency, the non-reversible and piecewise deterministic character of these processes raises interesting questions, for instance regarding proofs of ergodicity and convergence bounds. In this talk I will focus on the results obtained, and the problems left open, when considering PDMP dynamics for particle systems, in both equilibrium and out-of-equilibrium settings. Hard-core particle systems have been a testbed of choice since the first implementations of Markov chain Monte Carlo in the 1950s. Even today, the entropic barriers they exhibit still resist state-of-the-art MCMC sampling methods. I will review recent developments in sampling such systems and discuss the dynamical bottlenecks that remain to be overcome.
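The flavour of a non-reversible PDMP sampler can be conveyed by the one-dimensional zig-zag process targeting a standard Gaussian (a textbook special case, not the particle systems of the talk; all names are illustrative). The state moves at constant velocity v in {-1, +1} and flips v at rate max(0, v x), the directional derivative of the potential U(x) = x^2/2; for this target the integrated rate can be inverted in closed form, so event times are exact.

```python
import math
import random

def zigzag_gaussian(n_events=5000, seed=1):
    """Minimal 1-D zig-zag sketch targeting N(0, 1).

    Velocity flips occur at rate max(0, v * x).  Between events the
    motion is deterministic and linear, so time averages along the
    trajectory can be accumulated in closed form.
    """
    rng = random.Random(seed)
    x, v = 0.0, 1.0
    total_t = mean_acc = m2_acc = 0.0
    for _ in range(n_events):
        e = -math.log(rng.random())          # Exp(1) variate
        a = v * x                            # rate at time 0 is max(0, a)
        if a >= 0.0:
            # solve  a*t + t^2/2 = e  for the next event time
            t = -a + math.sqrt(a * a + 2.0 * e)
        else:
            # rate is zero until time -a, then grows linearly
            t = -a + math.sqrt(2.0 * e)
        # path integrals of x and x^2 over the linear flight x + v s
        total_t += t
        mean_acc += x * t + 0.5 * v * t * t
        m2_acc += x * x * t + x * v * t * t + t ** 3 / 3.0
        x += v * t                           # deterministic flight
        v = -v                               # the non-reversible flip
    return mean_acc / total_t, m2_acc / total_t
```

The trajectory (not the event positions) is stationary for N(0, 1), so the time-averaged first and second moments should approach 0 and 1; the deterministic flights between events are what distinguish PDMP samplers from diffusion-based MCMC.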