I'll discuss work on shift invariance in a half-space setting. These are non-trivial symmetries that allow certain observables of integrable models with a boundary to be shifted while preserving their joint distribution. The starting point is the colored stochastic six-vertex model in a half space, from which we obtain results on the asymmetric simple exclusion process and, via a fusion procedure, on the beta polymer, both in a half-space setting. An application to the asymptotics of a half-space analogue of the oriented swap process is also given.
Mariana Olvera-Cravioto : Opinion dynamics on complex networks: From mean-field limits to sparse approximations
In a world of polarized opinions on many cultural issues, we propose a model for the evolution of opinions on a large complex network. Our model is akin to the popular Friedkin-Johnsen model, with the added complexity of vertex-dependent media signals and confirmation bias, both of which help explain some of the most important factors leading to polarization. The analysis of the model is done on a directed random graph, capable of replicating highly inhomogeneous real-world networks with various degrees of assortativity and community structure. Our main results give the stationary distribution of opinions on the network, including explicitly computable formulas for the conditional means and variances for the various communities. Our results span the entire range of inhomogeneous random graphs, from the sparse regime, where the expected degrees are bounded, all the way to the dense regime, where a graph having n vertices has order n^2 edges.
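To fix ideas, a Friedkin-Johnsen-style update with vertex-dependent media signals can be sketched in a few lines. Everything below (the weights, susceptibilities, and media terms) is an illustrative stand-in rather than the talk's actual specification, and confirmation bias is omitted:

```python
import numpy as np

# Sketch of a Friedkin-Johnsen-type opinion update with vertex-dependent
# media signals. All parameter choices are illustrative, not from the talk.
rng = np.random.default_rng(0)
n = 50
W = rng.random((n, n))
W /= W.sum(axis=1, keepdims=True)        # row-stochastic influence weights
lam = rng.uniform(0.2, 0.8, size=n)      # susceptibility to neighbors
media = rng.uniform(-1.0, 1.0, size=n)   # vertex-dependent media signal
x = rng.uniform(-1.0, 1.0, size=n)       # initial opinions in [-1, 1]

# Iterate the update x <- lam * (W x) + (1 - lam) * media toward its
# stationary opinion profile; each step is a convex combination, so
# opinions remain in [-1, 1].
for _ in range(200):
    x = lam * (W @ x) + (1.0 - lam) * media
```

Since each susceptibility is strictly below 1, the iteration is a contraction and `x` converges to the unique stationary profile; the talk's results concern the analogous stationary distribution on inhomogeneous random graphs.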
We consider an ensemble of N interacting particles modeled by a system of N stochastic differential equations (SDEs). The coefficients of the SDEs are taken to be such that as N approaches infinity, the system undergoes Kac’s propagation of chaos, and is well-approximated by the solution to a McKean-Vlasov Equation. Rare but possible deviations of the behavior of the particles from this limit may reflect a catastrophe, and computing the probability of such rare events is of high interest in many applications. In this talk, we design an importance sampling scheme which allows us to numerically compute statistics related to these rare events with high accuracy and efficiency for any N. Standard Monte Carlo methods behave exponentially poorly as N increases for such problems. Our scheme is based on subsolutions of a Hamilton-Jacobi-Bellman (HJB) Equation on Wasserstein Space which arises in the theory of mean-field control. This HJB Equation is seen to be connected to the large deviations rate function for the empirical measure on the ensemble of particles. We identify conditions under which our scheme is provably asymptotically optimal in N in the sense of log-efficiency. We also provide evidence, both analytical and numerical, that with sufficient regularity of the solution to the HJB Equation, our scheme can have vanishingly small relative error as N increases.
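For intuition about the baseline object, here is a bare-bones Euler-Maruyama simulation of N weakly interacting particles. The mean-reverting interaction through the empirical mean is an illustrative choice of coefficients, not the ones from the talk, and no importance sampling is performed:

```python
import numpy as np

# Minimal Euler-Maruyama sketch of N weakly interacting particles whose
# drift pulls each particle toward the empirical mean (illustrative
# coefficients). As N grows, each particle's law approaches that of the
# corresponding McKean-Vlasov SDE.
rng = np.random.default_rng(1)
N, T, dt = 500, 1.0, 0.01
sigma = 0.5
X = rng.normal(0.0, 1.0, size=N)          # initial ensemble

for _ in range(int(T / dt)):
    m = X.mean()                          # interaction via the empirical measure
    X = X + (m - X) * dt + sigma * np.sqrt(dt) * rng.normal(size=N)
```

Estimating the probability that the empirical measure deviates far from the McKean-Vlasov limit by naive Monte Carlo on such a system is exactly where the exponential degradation in N appears, motivating the HJB-subsolution-based change of measure.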
Jake Madrid : Stochastic Extinction Events in Large Populations Prior to Entering the Metastable State
We will explore the role of demographic stochasticity in triggering extinction events in models of large finite populations. While prior works have focused on large fluctuations from quasi-stationary distributions, we instead consider extinction events occurring before entering a metastable state. Since such extinction events require only slight deviations from the mean-field trajectories, we can derive the approximating extinction probability PDE with a modified Robin-type boundary condition. We then investigate the utility of this approximation by comparing to the Lotka-Volterra model as well as the Lotka-Volterra model with logistic growth.
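The event in question can be illustrated with a toy logistic birth-death chain: starting from a few individuals, does demographic noise kill the population before it first reaches its metastable level? The rates below are illustrative, and only the embedded jump chain is simulated (jump times are irrelevant for absorption probabilities):

```python
import random

# Hedged sketch: a logistic birth-death chain, simulated via its embedded
# jump chain, estimating the chance of extinction *before* the population
# first reaches its metastable level K. Rates are illustrative.
def extinct_before_metastable(n0, K, b=2.0, d=1.0, rng=None):
    rng = rng or random.Random()
    n = n0
    while 0 < n < K:
        birth = b * n
        death = d * n + (b - d) * n * n / K   # logistic death pressure
        if rng.random() < birth / (birth + death):
            n += 1
        else:
            n -= 1
    return n == 0                             # absorbed at 0 before hitting K

rng = random.Random(42)
trials = 2000
p_hat = sum(extinct_before_metastable(3, 50, rng=rng) for _ in range(trials)) / trials
```

Starting from a small population, this probability is governed by only slight deviations from the mean-field trajectory, which is why the PDE approximation with a modified Robin-type boundary condition is the natural tool.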
This talk is an overview of my thesis work, which consists of three projects exploring the effect of multiscale structure on a class of interacting particle systems called weakly interacting diffusions. In the absence of multiscale structure, we have a collection of N particles, the dynamics of each being described by the solution to a stochastic differential equation (SDE) whose coefficients depend on that particle's state and the empirical measure of the full particle configuration. It is well known in this setting that as N approaches infinity, the particle system undergoes the "propagation of chaos," and its corresponding sequence of empirical measures converges to the law of the solution to an associated McKean-Vlasov SDE. Meanwhile, in our multiscale setting, the coefficients of the SDEs may also depend on a process evolving on a timescale of order 1/\epsilon faster than the particles. As \epsilon approaches 0, the effect of the fast process on the particles' dynamics becomes deterministic via stochastic homogenization. We study the interplay between homogenization and the propagation of chaos by establishing large deviations and moderate deviations results for the multiscale particles' empirical measure in the combined limit as N approaches infinity and \epsilon approaches 0. Along the way, we derive rates of homogenization for slow-fast McKean-Vlasov SDEs.
In the late 20th century, statistical physicists introduced a chemical reaction model called ballistic annihilation. In it, particles are placed randomly throughout the real line and then proceed to move at independently sampled velocities. Collisions result in mutual annihilation. Many results were inferred by physicists, but it wasn’t until recently that mathematicians joined in. I will describe my trajectory through this model. Expect tantalizing open questions.
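The model is simple enough to simulate directly. Below is an event-driven sketch of the standard three-speed setup (velocities -1, 0, +1, with the middle speed taken with probability p); the parameters are illustrative, and adjacent approaching pairs are removed in order of collision time:

```python
import random

# Event-driven sketch of ballistic annihilation on the line: particles move
# at constant velocity and the earliest-colliding adjacent pair annihilates.
# Three-speed setup with stationary particles of density p (illustrative).
def surviving_fraction(n, p, rng):
    xs = sorted(rng.uniform(0.0, float(n)) for _ in range(n))
    vs = [0 if rng.random() < p else rng.choice((-1, 1)) for _ in range(n)]
    parts = list(zip(xs, vs))
    while True:
        best = None
        for i in range(len(parts) - 1):
            (x1, v1), (x2, v2) = parts[i], parts[i + 1]
            if v1 > v2:                       # approaching pair
                t = (x2 - x1) / (v1 - v2)     # meeting time along straight-line paths
                if best is None or t < best[0]:
                    best = (t, i)
        if best is None:
            return len(parts) / n             # no collisions remain
        parts = parts[: best[1]] + parts[best[1] + 2 :]  # mutual annihilation

rng = random.Random(7)
frac = surviving_fraction(200, 0.3, rng)
```

Processing the globally earliest adjacent collision first is valid because any non-adjacent pair that would meet sooner must first trigger an adjacent collision between them; experiments with this sketch around p = 1/4 are a cheap way to see the conjectured phase transition.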
This will be the last lecture in his introductory sequence on hypocoercivity for Langevin dynamics. For those who have not attended the previous lectures but are familiar with Langevin dynamics, the talk should be accessible. We will continue our discussion of convergence to equilibrium for second-order Langevin dynamics using the Poincaré approach. We'll recap convergence in H^1(\mu) and then discuss the direct L^2(\mu) method of Dolbeault, Mouhot, and Schmeiser, also called the DMS approach.
An emerging way to protect privacy is to replace true data by synthetic data. Medical records of artificial patients, for example, could retain meaningful statistical information while preserving privacy of the true patients. But what is synthetic data, and what is privacy? How do we define these concepts mathematically? Is it possible to make synthetic data that is both useful and private? I will tie these questions to a simple-looking problem in probability theory: how much information about a random vector X is lost when we take conditional expectation of X with respect to some sigma-algebra? This talk is based on a series of papers with March Boedihardjo and Thomas Strohmer.
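The question about conditional expectation has a concrete numerical face: condition a sample on a coarse sigma-algebra and measure how much variance (one crude proxy for information) the conditional expectation retains versus averages away. This toy check of the law of total variance is illustrative only:

```python
import numpy as np

# Toy illustration: replace data X by its conditional expectation given the
# sigma-algebra generated by integer bins, and verify the law of total
# variance  Var X = Var E[X|F] + E Var(X|F)  empirically.
rng = np.random.default_rng(0)
X = rng.normal(size=100000)
bins = np.floor(X)                        # atoms of the coarse sigma-algebra
condE = np.empty_like(X)
for b in np.unique(bins):
    mask = bins == b
    condE[mask] = X[mask].mean()          # E[X | F] is constant on each atom

explained = condE.var()                   # variance retained by the coarsened data
residual = (X - condE).var()              # variance averaged away ("lost")
```

The split is exact (up to floating-point rounding) because the residual X - E[X|F] is empirically uncorrelated with E[X|F]; quantifying such losses rigorously, in metrics relevant to privacy and utility, is the subject of the talk.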
Pratima Hebbar, Probability Seminar on October 21, 2021
David Aldous, Probability Seminar Sept 30, 2021 TITLE: Can one prove existence of an infectiousness threshold (for a pandemic) in very general models of disease spread? ABSTRACT: Intuitively, in any kind of disease transmission model with an infectiousness parameter, there should exist a critical value of the parameter separating a very likely from a very unlikely resulting pandemic. But even formulating a general conjecture is challenging. In the most simplistic model (SI) of transmission, one can prove this for an essentially arbitrary large weighted contact network. The proof for SI depends on a simple lemma concerning hitting times for increasing set-valued Markov processes. Can one extend to SIR or SIS models over similarly general networks, where the lemma is no longer applicable?
SEPC 2021 in honor of Elizabeth Meckes. Slides from the talks and more information are available <a href="https://services.math.duke.edu/~rtd/SEPC2021/SEPC2021.html">at this link</a>.
Description of some work with Elizabeth Meckes at SEPC 2021
There are a number of situations in which rescaled interacting particle systems have been shown to converge to a reaction diffusion equation (RDE) with a bistable reaction term. These RDEs have traveling wave solutions. When the speed of the wave is nonzero, block constructions have been used to prove the existence or nonexistence of nontrivial stationary distributions. Here, we follow the approach in a paper by Etheridge, Freeman, and Pennington to show that in a wide variety of examples when the RDE limit has a bistable reaction term and traveling waves have speed 0, one can run time faster and further rescale space to obtain convergence to motion by mean curvature. This opens up the possibility of proving that the sexual reproduction model with fast stirring has a discontinuous phase transition, and that in Region 2 of the phase diagram for the nonlinear voter model studied by Molofsky et al. there are two nontrivial stationary distributions.
A key question in population biology is understanding the conditions under which the species of an ecosystem persist or go extinct. Theoretical and empirical studies have shown that persistence can be facilitated or negated by both biotic interactions and environmental fluctuations. We study the dynamics of n interacting species that live in a stochastic environment. Our models are described by n dimensional piecewise deterministic Markov processes. These are processes (X(t), r(t)) where the vector X denotes the density of the n species and r(t) is a finite state space process which keeps track of the environment. In any fixed environment the process follows the flow given by a system of ordinary differential equations. The randomness comes from the changes or switches in the environment, which happen at random times. We give sharp conditions under which the populations persist as well as conditions under which some populations go extinct exponentially fast. As an example we look at the competitive exclusion principle from ecology, which says in its simplest form that two species competing for one resource cannot coexist, and show how the random switching can facilitate coexistence.
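The mechanics of such a process are easy to sketch in one dimension: between environmental switches the density follows an ODE exactly, and the environment flips at exponential times. The parameters below (two environments, a logistic flow whose growth rate changes sign with the environment) are illustrative, not the talk's model:

```python
import math
import random

# Sketch of a one-species piecewise deterministic Markov process: the
# density follows a logistic ODE whose growth rate depends on a two-state
# environment switching at exponential times (illustrative parameters).
def simulate_pdmp(x0, T, rates=(1.0, 1.0), growth=(1.0, -0.5), K=10.0, seed=0):
    rng = random.Random(seed)
    x, t, env = x0, 0.0, 0
    while t < T:
        tau = min(rng.expovariate(rates[env]), T - t)   # time to next switch
        r = growth[env]
        # Exact logistic flow x' = r x (1 - x/K) over a window of length tau
        if r != 0.0:
            e = math.exp(r * tau)
            x = K * x * e / (K + x * (e - 1.0))
        t += tau
        env = 1 - env                                    # switch environment
    return x

x_final = simulate_pdmp(1.0, 50.0)
```

Here environment 0 pushes the density toward the carrying capacity while environment 1 pushes it toward 0, so whether the species persists depends on the interplay of the flows and the switching rates, which is exactly what the sharp persistence/extinction conditions of the talk quantify.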
Quasi-Stationary Distributions (QSDs) describe the long-time behaviour of killed Markov processes. The Fleming-Viot particle system provides a particle representation for the QSD of a Markov process killed upon contact with the boundary of its domain. Whereas previous work has dealt with killed Markov processes, we consider killed McKean-Vlasov processes. We show that the Fleming-Viot particle system with McKean-Vlasov dynamics provides a particle representation for the corresponding QSDs. Joint work with James Nolen.
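The particle representation has a very concrete form, sketched below for a plain random walk killed at the boundary of an interval; this is an illustrative stand-in for the talk's setting, which additionally has McKean-Vlasov dynamics:

```python
import random

# Minimal Fleming-Viot sketch for a random walk killed on leaving
# {1, ..., L-1}: whenever a particle hits 0 or L, it is restarted at the
# position of another, uniformly chosen surviving particle. For large N
# and large times, the empirical distribution approximates the QSD.
def fleming_viot(N=200, L=20, steps=5000, seed=3):
    rng = random.Random(seed)
    pos = [L // 2] * N
    for _ in range(steps):
        i = rng.randrange(N)
        pos[i] += rng.choice((-1, 1))
        if pos[i] <= 0 or pos[i] >= L:       # killed: branch off a survivor
            j = rng.randrange(N)
            while j == i:
                j = rng.randrange(N)
            pos[i] = pos[j]
    return pos

pos = fleming_viot()
```

The branching step is what keeps the population alive and encodes the conditioning on survival; the talk's contribution is showing the analogous representation holds when the killed dynamics themselves depend on the law of the process.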
Data lying in a high dimensional ambient space are commonly thought to have a much lower intrinsic dimension. In particular, the data may be concentrated near a lower-dimensional subspace or manifold. There is an immense literature focused on approximating the unknown subspace and the unknown density, and exploiting such approximations in clustering, data compression, and building of predictive models. Most of the literature relies on approximating subspaces and densities using a locally linear, and potentially multiscale, dictionary with Gaussian kernels. In this talk, we propose a simple and general alternative, which instead uses pieces of spheres, or spherelets, to locally approximate the unknown subspace. I will also introduce a curved kernel called the Fisher–Gaussian (FG) kernel which outperforms multivariate Gaussians in many cases. Theory is developed showing that spherelets can produce lower covering numbers and mean square errors for many manifolds, as well as the posterior consistency of the Dirichlet process mixture of the FG kernels. Time permitting, I will also talk about an ongoing project about stochastic differential geometry.
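The core geometric fit behind spherelets can be illustrated with an algebraic least-squares sphere fit: rewriting |x - c|^2 = r^2 as a system linear in the center c and an offset. The data and parameters below are illustrative, not from the talk:

```python
import numpy as np

# Hedged sketch of the fitting step behind spherelets: fit a sphere to a
# local patch of data by algebraic least squares. Each point x gives the
# linear equation  2 x . c - d = |x|^2,  where d = |c|^2 - r^2.
def fit_sphere(X):
    A = np.hstack([2.0 * X, -np.ones((X.shape[0], 1))])
    b = (X ** 2).sum(axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    c, d = sol[:-1], sol[-1]
    r = np.sqrt(max(c @ c - d, 0.0))       # recover the radius from the offset
    return c, r

# Noisy samples from a circle of radius 2 centered at (1, -1)
rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, 300)
X = np.stack([1 + 2 * np.cos(theta), -1 + 2 * np.sin(theta)], axis=1)
X += 0.01 * rng.normal(size=X.shape)
c, r = fit_sphere(X)
```

Fitting spheres rather than tangent planes captures local curvature, which is the intuition behind the lower covering numbers and mean square errors established in the talk.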