Allan Seheult : Bayesian Forecasting and Calibration for Complex Phenomena Using Multi-level Computer Codes
- Other Meetings and Events ( 40 Views )We describe a general Bayesian approach for using computer codes for a complex physical system to assist in forecasting actual system outcomes. Our approach is based on expert judgements and experiments on fast versions of the computer code. These are combined to construct models for the relationships between the code's inputs and outputs, respecting the natural space/time features of the physical system. The resulting beliefs are systematically updated as we make evaluations of the code for varying input sets and calibrate the input space against past data on the system. The updated beliefs are then used to construct forecasts for future system outcomes. While the approach is quite general, it has been developed particularly to handle problems with high-dimensional input and output spaces, for which each run of the computer code is expensive. The methodology will be applied to problems in uncertainty analysis for hydrocarbon reservoirs.
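As a loose illustration of the emulation idea (my own sketch, not the speaker's multi-level Bayesian framework; the squared-exponential prior and all function names are assumptions), one can stand in for an expensive code with a cheap function, run it at a few design inputs, and update a Gaussian-process prior on the input-output relationship to forecast the output, with uncertainty, at untried inputs:

```python
import numpy as np

# Minimal Gaussian-process "emulator" sketch (illustrative only).  We treat
# an expensive simulator f as an unknown function, evaluate it at a few
# design inputs, and condition a GP prior on those runs to get a mean
# forecast and a variance at new inputs.

def sq_exp_kernel(a, b, length=1.0, var=1.0):
    """Squared-exponential covariance between 1-D input arrays a and b."""
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

def gp_posterior(x_train, y_train, x_new, jitter=1e-8):
    """Posterior mean and pointwise variance of the GP given code runs."""
    K = sq_exp_kernel(x_train, x_train) + jitter * np.eye(len(x_train))
    Ks = sq_exp_kernel(x_new, x_train)
    Kss = sq_exp_kernel(x_new, x_new)
    alpha = np.linalg.solve(K, y_train)
    mean = Ks @ alpha
    cov = Kss - Ks @ np.linalg.solve(K, Ks.T)
    return mean, np.diag(cov)

f = lambda x: np.sin(x)              # cheap stand-in for the expensive code
x_train = np.array([0.0, 1.0, 2.0, 3.0])
y_train = f(x_train)
mean, var = gp_posterior(x_train, y_train, np.array([1.5]))
```

The emulator interpolates the observed runs exactly (up to jitter) and reports zero uncertainty there; between runs, the variance quantifies how much the forecast depends on the prior rather than on data.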
Lillian Pierce : Carleson operators of Radon type
- Other Meetings and Events ( 40 Views )A celebrated theorem of Carleson shows that the Fourier series of an L^2 function converges pointwise almost everywhere. At the heart of this work lies an L^2 estimate for a particular type of maximal singular integral operator, which has since become known as a Carleson operator. In the past 40 years, a number of important results have been proved for generalizations of the original Carleson operator. In this talk we will introduce the Carleson operator and survey its generalizations, and then describe new joint work with Po Lam Yung on Carleson operators with a certain type of polynomial phase that also incorporate the behavior of Radon transforms.
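For concreteness (one standard formulation; the notation is mine, not taken from the talk), the Carleson operator on the real line can be written as

```latex
% One standard form of the Carleson operator: the maximal partial
% Fourier integral of f.
\[
  \mathcal{C}f(x) \;=\; \sup_{N>0}\,
  \Bigl|\, \int_{-N}^{N} \widehat{f}(\xi)\, e^{2\pi i x \xi}\, d\xi \,\Bigr|,
\]
% and the key estimate behind Carleson's theorem is the L^2 bound
\[
  \| \mathcal{C}f \|_{L^2(\mathbb{R})} \;\lesssim\; \| f \|_{L^2(\mathbb{R})},
\]
% which yields almost-everywhere convergence of the partial Fourier
% integrals (and, by transference, of Fourier series of L^2 functions).
```

Equivalently, up to bounded errors, \(\mathcal{C}\) is a maximally modulated Hilbert transform, which is the maximal singular integral form referred to in the abstract.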
Richard Kenyon : Random maps from Z^{2} to Z
- Other Meetings and Events ( 40 Views )One of the most basic objects in probability theory is the simple random walk, which one can think of as a random map from Z to Z mapping adjacent points to adjacent points. A similar theory for random maps from Z^{2} to Z had until recently remained elusive to mathematicians, despite being known (non-rigorously) to physicists. In this talk we discuss some natural families of random maps from Z^{2} to Z. We can explicitly compute both the local and the large-scale behavior of these maps. In particular we construct a "scaling limit" for these maps, in a similar sense in which Brownian motion is a scaling limit for the simple random walk. The results are in accord with physics.
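The one-dimensional object the abstract starts from is easy to make concrete (a minimal sketch; the function name and parameters are my own illustration):

```python
import random

# A simple random walk: a random map S from {0, 1, ..., n} to Z with
# S(0) = 0 and each increment equal to +1 or -1, so adjacent points in
# the domain are sent to adjacent integers.  (The Z^2 -> Z analogue
# discussed in the talk is a random height function, which is far
# subtler to construct and analyze.)
def simple_random_walk(n, seed=0):
    rng = random.Random(seed)
    path = [0]
    for _ in range(n):
        path.append(path[-1] + rng.choice([-1, 1]))
    return path

walk = simple_random_walk(1000)
```

Rescaling the domain by \(n\) and the values by \(\sqrt{n}\) gives Brownian motion in the limit, which is the sense of "scaling limit" the abstract generalizes to two-dimensional domains.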
Wendy Zhang : Drop breakup: asymmetric cones in viscous flow
- Other Meetings and Events ( 36 Views )Dynamic singularities are ubiquitous. They arise in mathematical models of phenomena as grand as star formation or as familiar as the breakup of a thread of honey as it is added to tea. Drop breakup allows one to study dynamics close to a singularity in a simple context that is also accessible to experiments. Recent work has revealed that a viscous liquid drop close to breakup looks self-similar: the drop profile looks the same if the length scales are rescaled appropriately. A new numerical strategy is developed to capture the drop breakup dynamics, and the computed profiles show good agreement with experimental measurements. Surprisingly, the presence of even small amounts of viscous dissipation in the surrounding fluid can dramatically alter the self-similar profile. In particular, when no exterior viscous dissipation is present, the thread profile is symmetric about the point of pinch-off; when small amounts of exterior viscous dissipation are present, the profile becomes severely asymmetric. An understanding of the final breakup process is crucial in elucidating the mechanisms underlying the formation of satellite drops, an issue relevant to the development of ink-jet printing technologies and emulsification processes.
Jonathan Mattingly : Ergodicity of Stochastically Forced PDEs
- Other Meetings and Events ( 35 Views )Stochastic PDEs have become important models for many phenomena. Nonetheless, many fundamental questions about their behavior remain poorly understood. Often such SPDEs contain different processes active at different scales. Not only does such structure give rise to beautiful mathematics and phenomena, but I submit that it also contains the key to answering many seemingly unrelated questions, such as ergodicity and mixing. Given a stochastically forced dissipative PDE, such as the 2D Navier-Stokes equations, the Ginzburg-Landau equations, or a reaction-diffusion equation: is the system ergodic? If so, at what rate does the system equilibrate? Is the convergence qualitatively different at different physical scales? Answers to these and similar questions are basic assumptions of many physical theories, such as theories of turbulence. I will try both to convince you why these questions are interesting and to explain how to address them. The analysis will suggest strategies to explore other properties of these SPDEs as well as numerical methods. In particular, I will show that the stochastically forced 2D Navier-Stokes equations converge exponentially to a unique invariant measure. I will discuss under what minimal conditions one should expect ergodic behavior. The central ideas will be illustrated with simple model systems. Along the way I will explain how to exploit the different scales in the problem and how to overcome the fact that the problem is an extremely degenerate diffusion on an infinite-dimensional function space. The analysis points to a class of operators between STRICTLY ELLIPTIC and HYPOELLIPTIC operators, which I call EFFECTIVELY ELLIPTIC. The techniques use a representation of the process on a finite-dimensional space with memory. I will also touch on a novel coupling construction used to prove exponential convergence to equilibrium.
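A toy finite-dimensional analogue of the convergence statement (my illustration; the talk's setting is an infinite-dimensional, degenerately forced system): the Ornstein-Uhlenbeck SDE dX = -X dt + dW is dissipative and stochastically forced, and it converges exponentially fast to its unique Gaussian invariant measure N(0, 1/2):

```python
import numpy as np

# Euler-Maruyama simulation of an ensemble of independent OU paths
# dX = -X dt + dW.  Starting far from equilibrium, by a moderate time
# the ensemble law is close to the invariant measure N(0, 1/2).
def ou_ensemble(x0, t_final, dt, n_paths, seed=0):
    rng = np.random.default_rng(seed)
    x = np.full(n_paths, float(x0))
    for _ in range(int(t_final / dt)):
        x += -x * dt + np.sqrt(dt) * rng.standard_normal(n_paths)
    return x

x = ou_ensemble(x0=3.0, t_final=5.0, dt=0.01, n_paths=20000)
```

The distance to equilibrium decays like e^{-t} here; the content of the talk is that a comparable exponential rate can be established for genuinely infinite-dimensional systems such as the stochastically forced 2D Navier-Stokes equations.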
Dean Oliver : Sampling the Posterior Distribution for Reservoir Properties Conditional to Production Data
- Other Meetings and Events ( 29 Views )A major problem in petroleum engineering is the prediction of future oil and water production from a reservoir whose properties are inferred from measurements along well paths and from observations of pressure, production, and fluid saturations at well locations. If the properties of the porous material were known at all locations, and all boundary conditions were specified, the production rates of fluids could be computed from the numerical solution of a set of partial differential equations governing mass conservation and flow. Rock properties are known to be heterogeneous on many scales, however, and the measurements are always insufficient to determine the properties throughout the reservoir. In the petroleum and groundwater fields, rock properties (permeability and porosity) are modeled as spatial random fields whose auto-covariances and cross-covariances are known from observations of outcrops and cores. Uncertainty in future production is characterized by the empirical distribution from a suite of realizations of rock properties. The problem in assessing uncertainty in reservoir production or groundwater remediation predictions is that, while valid procedures for sampling the posterior PDF are available, the computational cost of generating the necessary number of samples from such procedures is prohibitive. An increase in computer speed is unlikely to solve this problem, as the trend has been to build more complex numerical models of the reservoir as computer capability increases. Most recent effort has gone into approximate methods of sampling. In this talk, I will describe our experience with the use of Markov chain Monte Carlo methods and with approximate sampling methods.
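For readers unfamiliar with the sampling procedures mentioned, here is a minimal random-walk Metropolis sketch (entirely illustrative: the stand-in forward model g(m) = m**2 and all names are my assumptions; in the reservoir setting the forward model is a full flow simulation, which is exactly why each posterior sample is so costly):

```python
import math
import random

# Random-walk Metropolis for a 1-D toy posterior: a Gaussian prior on a
# rock property m, and a Gaussian misfit between an observed production
# value d_obs and the forward-model prediction g(m).
def log_post(m, d_obs=1.2, prior_var=1.0, noise_var=0.1):
    g = m ** 2                       # stand-in for the flow simulator
    return -0.5 * m ** 2 / prior_var - 0.5 * (g - d_obs) ** 2 / noise_var

def metropolis(n, step=0.5, seed=0):
    rng = random.Random(seed)
    m, lp = 0.0, log_post(0.0)
    chain = []
    for _ in range(n):
        m_new = m + rng.gauss(0.0, step)
        lp_new = log_post(m_new)
        if math.log(rng.random()) < lp_new - lp:   # accept/reject step
            m, lp = m_new, lp_new
        chain.append(m)
    return chain

chain = metropolis(5000)
```

Each iteration requires one evaluation of `log_post`, i.e. one run of the forward model; with a reservoir simulator in that role, the tens of thousands of iterations a well-mixed chain needs become the prohibitive cost the abstract describes.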