public 01:44:51

Ruth Williams : Control of Stochastic Processing Networks

  -   Probability ( 188 Views )

Stochastic processing networks (SPNs) are a significant generalization of conventional queueing networks that allow for flexible scheduling through dynamic sequencing and alternate routing. SPNs arise naturally in a variety of applications in operations management, and their control and analysis present challenging mathematical problems. One approach to these problems, via approximate diffusion control problems, has been outlined by J. M. Harrison. Various aspects of this approach have been developed mathematically, including a reduction in dimension of the diffusion control problem. However, other aspects have been less explored, especially the solution of the diffusion control problem, the derivation of policies by interpreting such solutions, and limit theorems that establish optimality of such policies in a suitable asymptotic sense. In this talk, for a concrete class of networks called parallel server systems, which arise in service-network and computer-science applications, we explore previously undeveloped aspects of Harrison's scheme and illustrate the use of the approach in obtaining simple control policies that are nearly optimal. Identification of a graphical structure for the network, an invariance principle, and properties of local times of reflecting Brownian motion will feature in our analysis. The talk will conclude with a summary of the current status and a description of open problems associated with the further development of control of stochastic processing networks. This talk will draw on aspects of joint work with M. Bramson, M. Reiman, W. Kang and V. Pesic.
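As a concrete illustration of the kind of parallel server system and simple threshold control mentioned in the abstract, the following minimal sketch simulates an "N-model": class 1 jobs can be served by a dedicated server 1 or a flexible server 2, while class 2 jobs can only be served by server 2. The threshold rule and all parameter values are illustrative assumptions for this sketch, not the specific policies or data from the talk.

```python
import random

# Minimal CTMC (Gillespie) simulation of an "N-model" parallel server system.
# Class 1 jobs can be served by server 1 (rate MU11) or server 2 (rate MU21);
# class 2 jobs only by server 2 (rate MU22).  Server 2 helps class 1 only when
# the class-1 queue exceeds THRESHOLD -- an illustrative threshold rule, a
# stand-in for the nearly optimal policies derived from the diffusion control
# problem.  All numerical values are made up for illustration.

LAM1, LAM2 = 0.9, 0.4              # Poisson arrival rates for classes 1 and 2
MU11, MU21, MU22 = 1.0, 0.8, 0.7   # service rates: (server 1, class 1), (server 2, class 1), (server 2, class 2)
C1, C2 = 3.0, 1.0                  # holding costs per job per unit time
THRESHOLD = 5                      # server 2 switches to class 1 above this queue length
T_END = 100_000.0                  # simulated time horizon

def average_cost(seed=0):
    rng = random.Random(seed)
    q1 = q2 = 0                    # class-1 / class-2 jobs in the system
    t = cost = 0.0
    while t < T_END:
        server2_on_class1 = q1 > THRESHOLD          # THRESHOLD >= 2, so both servers have class-1 work
        rates = [
            ("arr1", LAM1),
            ("arr2", LAM2),
            ("svc11", MU11 if q1 >= 1 else 0.0),
            ("svc21", MU21 if server2_on_class1 else 0.0),
            ("svc22", MU22 if (not server2_on_class1 and q2 >= 1) else 0.0),
        ]
        total = sum(r for _, r in rates)
        dt = rng.expovariate(total)                 # time to the next event
        cost += (C1 * q1 + C2 * q2) * dt            # accumulate holding cost
        t += dt
        u = rng.random() * total                    # pick which event occurred
        for name, r in rates:
            u -= r
            if u <= 0.0:
                if name == "arr1":
                    q1 += 1
                elif name == "arr2":
                    q2 += 1
                elif name in ("svc11", "svc21"):
                    q1 -= 1
                else:
                    q2 -= 1
                break
    return cost / t

if __name__ == "__main__":
    print("long-run average holding cost:", average_cost())
```

Varying THRESHOLD in such a simulation gives a quick, informal sense of how a one-parameter threshold policy trades off the two holding costs.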

public 01:34:47

Zachary Bezemek : Interacting particle systems in multiscale environments: asymptotic analysis

  -   Probability ( 70 Views )

This talk is an overview of my thesis work, which consists of three projects exploring the effect of multiscale structure on a class of interacting particle systems called weakly interacting diffusions. In the absence of multiscale structure, we have a collection of N particles, the dynamics of each being described by the solution to a stochastic differential equation (SDE) whose coefficients depend on that particle's state and the empirical measure of the full particle configuration. It is well known in this setting that as N approaches infinity, the particle system undergoes "propagation of chaos," and the corresponding sequence of empirical measures converges to the law of the solution to an associated McKean-Vlasov SDE. In our multiscale setting, by contrast, the coefficients of the SDEs may also depend on a process evolving on a time scale of order 1/ε faster than the particles. As ε approaches 0, the effect of the fast process on the particles' dynamics becomes deterministic via stochastic homogenization. We study the interplay between homogenization and the propagation of chaos by establishing large deviations and moderate deviations results for the multiscale particles' empirical measure in the combined limit as N approaches infinity and ε approaches 0. Along the way, we derive rates of homogenization for slow-fast McKean-Vlasov SDEs.
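A minimal sketch of the slow-fast structure described above: N particles interacting through their empirical mean, each driven by a fast Ornstein-Uhlenbeck environment running on the time scale 1/ε, discretized by Euler-Maruyama. The specific coefficients (linear attraction, a sin(y) modulation) are illustrative assumptions, not the coefficients studied in the thesis.

```python
import numpy as np

# Euler-Maruyama sketch of N weakly interacting diffusions in a fast
# Ornstein-Uhlenbeck environment.  The drift of particle i depends on its own
# state, the empirical mean of all particles (the simplest mean-field
# interaction), and a fast process Y_i on the time scale 1/EPS.
# All coefficient and parameter choices are illustrative.

N = 500          # number of particles
EPS = 0.01       # time-scale separation parameter
T, DT = 1.0, 1e-4
KAPPA = 1.0      # strength of the mean-field attraction

rng = np.random.default_rng(0)
x = rng.standard_normal(N)        # slow particles X_i
y = rng.standard_normal(N)        # fast environment Y_i

steps = int(T / DT)
for _ in range(steps):
    m = x.mean()                                  # empirical measure enters through its mean
    drift_x = -x + KAPPA * (m - x) + np.sin(y)    # slow drift, modulated by the fast process
    x = x + drift_x * DT + np.sqrt(DT) * rng.standard_normal(N)
    # fast OU process: O(1/EPS) drift and O(1/sqrt(EPS)) noise
    y = y - (y / EPS) * DT + np.sqrt(DT / EPS) * rng.standard_normal(N)

# As N grows and EPS shrinks, the empirical measure of x is expected to
# concentrate around the law of a homogenized McKean-Vlasov SDE.
print("empirical mean and variance at time T:", x.mean(), x.var())
```

Re-running the sketch for several values of N and EPS shows, informally, the two limits whose interplay the large and moderate deviations results quantify.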

public 01:02:33

Zack Bezemek : Large Deviations and Importance Sampling for Weakly Interacting Diffusions

  -   Probability ( 40 Views )

We consider an ensemble of N interacting particles modeled by a system of N stochastic differential equations (SDEs). The coefficients of the SDEs are taken to be such that as N approaches infinity, the system undergoes Kac’s propagation of chaos and is well approximated by the solution to a McKean-Vlasov equation. Rare but possible deviations of the particles' behavior from this limit may reflect a catastrophe, and computing the probability of such rare events is of high interest in many applications. In this talk, we design an importance sampling scheme that allows us to compute statistics related to these rare events numerically, with high accuracy and efficiency, for any N. For such problems, standard Monte Carlo methods perform exponentially poorly as N increases. Our scheme is based on subsolutions of a Hamilton-Jacobi-Bellman (HJB) equation on Wasserstein space which arises in the theory of mean-field control. This HJB equation is connected to the large deviations rate function for the empirical measure of the ensemble of particles. We identify conditions under which our scheme is provably asymptotically optimal in N in the sense of log-efficiency. We also provide evidence, both analytical and numerical, that with sufficient regularity of the solution to the HJB equation, our scheme can have vanishingly small relative error as N increases.
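To make the change-of-measure mechanics concrete, here is a minimal Girsanov-based importance sampling sketch for a rare event of the empirical mean of N interacting diffusions. The constant tilting drift U is a crude stand-in for the subsolution-based control constructed in the talk; the model, the event, and all parameter values are illustrative assumptions.

```python
import numpy as np

# Importance sampling sketch: estimate the probability that the empirical
# mean of N weakly interacting OU-type particles exceeds LEVEL at time T.
# We simulate under a tilted measure Q (extra drift U on every particle) and
# reweight each sample by the likelihood ratio dP/dQ from Girsanov's theorem.
# A constant U stands in for the subsolution-gradient control from the talk.

N = 50            # number of particles
T, DT = 1.0, 5e-3
KAPPA = 0.5       # mean-field interaction strength
LEVEL = 0.4       # rare event: empirical mean at time T exceeds LEVEL
M = 1000          # Monte Carlo replications
U = LEVEL / (1.0 - np.exp(-T))   # crude constant control steering the mean toward LEVEL

rng = np.random.default_rng(1)
steps = int(T / DT)
estimates = np.empty(M)

for k in range(M):
    x = np.zeros(N)
    log_weight = 0.0
    for _ in range(steps):
        m = x.mean()
        drift = -x + KAPPA * (m - x)                  # original (uncontrolled) drift
        dw = np.sqrt(DT) * rng.standard_normal(N)     # Brownian increments under the tilted measure Q
        x = x + (drift + U) * DT + dw                 # controlled dynamics under Q
        # accumulate log dP/dQ = -sum_i ( U dW_i^Q + 0.5 U^2 dt )
        log_weight += -U * dw.sum() - 0.5 * U * U * DT * N
    estimates[k] = np.exp(log_weight) if x.mean() >= LEVEL else 0.0

p_hat = estimates.mean()
rel_err = estimates.std(ddof=1) / (np.sqrt(M) * p_hat) if p_hat > 0 else float("inf")
print(f"estimated probability ~ {p_hat:.3e}, relative error ~ {rel_err:.2f}")
```

In the scheme from the talk, the constant U would be replaced by a state- and measure-dependent control obtained from a subsolution of the HJB equation on Wasserstein space, which is what yields log-efficiency uniformly in N.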