public 01:24:47

Franca Hoffmann : Gradient Flows: From PDE to Data Analysis.

  -   Applied Math and Analysis ( 184 Views )

Certain diffusive PDEs can be viewed as infinite-dimensional gradient flows. This fact has led to the development of new tools in various areas of mathematics ranging from PDE theory to data science. In this talk, we focus on two different directions: model-driven approaches and data-driven approaches. In the first part of the talk we use gradient flows for analyzing non-linear and non-local aggregation-diffusion equations when the corresponding energy functionals are not necessarily convex. Moreover, the gradient flow structure enables us to make connections to well-known functional inequalities, revealing possible links between the optimizers of these inequalities and the equilibria of certain aggregation-diffusion PDEs. We present recent results on properties of these equilibria and long-time asymptotics of solutions in the setting where attractive and repulsive forces are in competition. In the second part, we use and develop gradient flow theory to design novel tools for data analysis. We draw a connection between gradient flows and Ensemble Kalman methods for parameter estimation. We introduce the Ensemble Kalman Sampler - a derivative-free methodology for model calibration and uncertainty quantification in expensive black-box models. The interacting particle dynamics underlying our algorithm can be approximated by a novel gradient flow structure in a modified Wasserstein metric which reflects particle correlations. The geometry of this modified Wasserstein metric is of independent theoretical interest.
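As a rough illustration of the kind of interacting-particle dynamics involved, here is a minimal sketch of ensemble-covariance-preconditioned Langevin sampling in the spirit of the Ensemble Kalman Sampler, run on a toy Gaussian target. The explicit gradient, the parameter values, and the omission of derivative-free and finite-ensemble corrections are simplifying assumptions of this sketch, not details from the talk.

import numpy as np

# Toy Gaussian target: Phi(theta) = 0.5 * (theta - mu)^T A (theta - mu)
rng = np.random.default_rng(0)
d, J = 2, 100                                  # parameter dimension, ensemble size
A = np.array([[2.0, 0.3], [0.3, 1.0]])         # assumed precision matrix (toy choice)
mu = np.array([1.0, -1.0])

def grad_phi(theta):
    # row-wise gradient of Phi; A is symmetric, so (theta - mu) @ A works per particle
    return (theta - mu) @ A

theta = rng.normal(size=(J, d))                # initial ensemble
dt, n_steps = 1e-2, 5000
for _ in range(n_steps):
    mean = theta.mean(axis=0)
    C = (theta - mean).T @ (theta - mean) / J  # empirical ensemble covariance
    drift = -grad_phi(theta) @ C               # preconditioning by C couples the particles
    noise = rng.normal(size=(J, d)) @ np.linalg.cholesky(2.0 * dt * C).T
    theta = theta + dt * drift + noise

print("ensemble mean (should be near mu):", theta.mean(axis=0))
print("ensemble covariance (should be near inv(A)):")
print(np.cov(theta, rowvar=False))

The preconditioning by the empirical covariance is what connects these particle dynamics to a gradient flow in a modified Wasserstein metric reflecting particle correlations.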

public 01:34:49

Xiantao Li : The Mori-Zwanzig formalism for the reduction of complex dynamics models

  -   Applied Math and Analysis ( 128 Views )

Mathematical models of complex physical processes often involve a large number of degrees of freedom as well as events occurring on different time scales, so direct simulations based on these models face tremendous challenges. The focus of this talk is the Mori-Zwanzig (MZ) projection formalism for reducing the dimension of a complex dynamical system. The goal is to mathematically derive a reduced model with far fewer variables that still captures the essential properties of the system. In many cases, this formalism also eliminates fast modes and makes it possible to explore events over longer time scales. The models derived directly from the MZ projection are typically too abstract to be practically implemented. We will first discuss cases where the model can be simplified to generalized Langevin equations (GLE). Furthermore, we introduce systematic numerical approximations to the GLE in which the fluctuation-dissipation theorem (FDT) is automatically satisfied. More importantly, these approximations lead to a hierarchy of reduced models with increasing accuracy, which would also be useful for adaptive model refinement (AMR). Examples, including the NLS, atomistic models of materials defects, and molecular models of proteins, will be presented to illustrate the potential applications of the methods.
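To make the GLE step concrete, here is a minimal sketch (a generic toy example, not the speaker's construction) of a generalized Langevin equation with a single exponential memory kernel K(t) = (gamma/tau) * exp(-t/tau), rewritten as a Markovian system with one auxiliary variable; the noise strength is chosen so that the fluctuation-dissipation theorem holds by construction.

import numpy as np

# GLE for a particle in a harmonic well; memory is handled by an auxiliary
# Ornstein-Uhlenbeck variable z, and sigma is chosen so that the random force
# satisfies <R(t)R(s)> = kBT * K(|t-s|) (fluctuation-dissipation theorem).
rng = np.random.default_rng(1)
m, k, gamma, tau, kBT = 1.0, 1.0, 2.0, 0.5, 1.0
dt, n_steps = 1e-3, 200_000
sigma = np.sqrt(2.0 * kBT * gamma) / tau       # FDT-consistent noise strength

x, v, z = 1.0, 0.0, 0.0
v_samples = []
for step in range(n_steps):
    # Euler-Maruyama update of the extended (x, v, z) system
    x += v * dt
    v += (-k * x + z) / m * dt
    z += (-z / tau - (gamma / tau) * v) * dt + sigma * np.sqrt(dt) * rng.normal()
    if step > n_steps // 2:
        v_samples.append(v)

print("<v^2> =", np.mean(np.square(v_samples)), "; equipartition target kBT/m =", kBT / m)

Approximating the memory kernel by a sum of several such exponential modes, each carrying its own auxiliary variable, is one way to build a hierarchy of reduced models of increasing accuracy.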

public 01:14:42

Rongjie Lai : Understanding Manifold-structured Data via Geometric Modeling and Learning

  -   Applied Math and Analysis ( 113 Views )

Analyzing and inferring the underlying global intrinsic structure of data from local information is critical in many fields. In practice, coherent structure in data allows us to model data as low-dimensional manifolds, represented as point clouds, in a possibly high-dimensional space. Unlike image and signal processing, which handle functions on flat domains with well-developed tools for processing and learning, manifold-structured data sets are far more challenging due to their complicated geometry. For example, the same geometric object can take very different coordinate representations due to the variety of embeddings, transformations, or representations (imagine that the same human body shape can appear in different poses, each a nearly isometric embedding of the same shape). These ambiguities form an infinite-dimensional group of isometries and make higher-level tasks in manifold-structured data analysis and understanding even more challenging. To overcome these ambiguities, I will first discuss modeling-based methods. This approach uses geometric PDEs to adapt to the intrinsic manifold structure of data and extracts various invariant descriptors to characterize and understand data through solutions of differential equations on manifolds. Inspired by recent developments in deep learning, I will also discuss our recent work on a new way of defining convolution on manifolds and demonstrate its potential for geometric deep learning on manifolds. This geometric way of defining convolution provides a natural combination of modeling and learning on manifolds. Combined with recent advances in deep learning, it enables further applications in comparing, classifying, and understanding manifold-structured data.
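As background for how an embedding-invariant convolution can be defined at all, here is a minimal sketch of one standard construction, a graph-Laplacian spectral filter on a point cloud. It is illustrative only, is not necessarily the convolution introduced in the talk, and all function names and parameters are assumptions of this sketch.

import numpy as np
from scipy.spatial import cKDTree

def knn_graph_laplacian(points, k=8, eps=0.3):
    # k-nearest-neighbour graph with Gaussian edge weights; the resulting
    # Laplacian approximates the intrinsic Laplace-Beltrami operator and is
    # unaffected by rigid motions or re-embeddings of the point cloud.
    tree = cKDTree(points)
    dists, idx = tree.query(points, k=k + 1)   # first neighbour is the point itself
    n = len(points)
    W = np.zeros((n, n))
    for i in range(n):
        for dist, j in zip(dists[i, 1:], idx[i, 1:]):
            w = np.exp(-dist**2 / eps**2)
            W[i, j] = W[j, i] = max(W[i, j], w)
    D = np.diag(W.sum(axis=1))
    return D - W                               # unnormalized graph Laplacian

def spectral_conv(L, signal, coeffs):
    # "Convolution" as a polynomial filter sum_k c_k L^k applied to the signal;
    # it depends only on the graph, i.e. on intrinsic geometry, not the embedding.
    out, Lk = np.zeros_like(signal), signal.copy()
    for c in coeffs:
        out += c * Lk
        Lk = L @ Lk
    return out

# Toy point cloud: a noisy circle embedded in 3D; smooth a function defined on it.
t = np.linspace(0.0, 2.0 * np.pi, 200, endpoint=False)
pts = np.stack([np.cos(t), np.sin(t),
                0.1 * np.random.default_rng(2).normal(size=t.size)], axis=1)
L = knn_graph_laplacian(pts)
f = np.sin(3 * t) + 0.3 * np.random.default_rng(3).normal(size=t.size)
f_smooth = spectral_conv(L, f, coeffs=[1.0, -0.1, 0.003])  # attenuates high graph frequencies
print(f_smooth[:5])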

public 01:14:47

Cynthia Rudin : 1) Regulating Greed Over Time: An Important Lesson For Practical Recommender Systems and 2) Prediction Uncertainty and Optimal Experimental Design for Learning Dynamical Systems

  -   Applied Math and Analysis ( 113 Views )

I will present work from these two papers: 1) Regulating Greed Over Time, by Stefano Traca and Cynthia Rudin, a finalist for the 2015 IBM Service Science Best Student Paper Award; and 2) Prediction Uncertainty and Optimal Experimental Design for Learning Dynamical Systems, by Benjamin Letham, Portia A. Letham, Cynthia Rudin, and Edward Browne, Chaos, 2016.
There is an important aspect of practical recommender systems that we noticed while competing in the ICML Exploration-Exploitation 3 data mining competition. The goal of the competition was to build a better recommender system for Yahoo!'s Front Page, which provides personalized news article recommendations. The main strategy we used was to carefully control the balance between exploiting good articles and exploring new ones in the multi-armed bandit setting. This strategy was based on our observation that there were clear trends over time in the click-through rates of the articles. At certain times, we should explore new articles more often, and at certain times, we should reduce exploration and just show the best articles available. This led to dramatic performance improvements.
As it turns out, the observation we made in the Yahoo! data is in fact pervasive in settings where recommender systems are currently used. This observation is simply that certain times are more important than others for correct recommendations to be made. This affects the way exploration and exploitation (greed) should change in our algorithms over time. We thus formalize a setting where regulating greed over time can be provably beneficial. This is captured through regret bounds and leads to principled algorithms. The end result is a framework for bandit-style recommender systems in which certain times are more important than others for making a correct decision.
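The following is only a toy illustration of the qualitative idea, not the paper's algorithm (which comes with regret bounds): an epsilon-greedy recommender whose exploration rate is modulated by an assumed importance signal over time, so that it explores during low-stakes periods and exploits the best-known article when correct recommendations matter most.

import numpy as np

# Toy time-varying epsilon-greedy bandit for article recommendation.
rng = np.random.default_rng(4)
true_ctr = np.array([0.04, 0.05, 0.07])        # hypothetical click-through rates
counts, clicks = np.zeros(3), np.zeros(3)

def importance(t, horizon):
    # assumed importance profile: a peak-traffic window in the middle of the horizon
    return 1.0 if 0.4 * horizon < t < 0.6 * horizon else 0.2

horizon, weighted_clicks = 20_000, 0.0
for t in range(horizon):
    eps = 0.2 * (1.0 - importance(t, horizon)) # regulate greed over time
    if counts.min() == 0 or rng.random() < eps:
        arm = rng.integers(3)                  # explore
    else:
        arm = int(np.argmax(clicks / counts))  # exploit the empirically best article
    click = rng.random() < true_ctr[arm]
    counts[arm] += 1
    clicks[arm] += click
    weighted_clicks += click * importance(t, horizon)  # clicks at peak times count more

print("importance-weighted clicks:", weighted_clicks)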
If time permits, I will discuss work on measuring uncertainty in parameter estimation for dynamical systems. I will present "prediction deviation," a new metric of uncertainty that determines the extent to which observed data have constrained the model's predictions. This is accomplished by solving an optimization problem that searches for a pair of models that each provide a good fit for the observed data, yet have maximally different predictions. We develop a method for estimating a priori the impact that additional experiments would have on the prediction deviation, allowing the experimenter to design a set of experiments that would most reduce uncertainty.
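A minimal sketch of the prediction-deviation idea, under simplifying assumptions (a toy exponential-decay model, a penalty formulation, and tolerances chosen here purely for illustration): search for two parameter vectors that both fit the observed data acceptably while their predictions at an unobserved time differ as much as possible.

import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(5)

def model(theta, t):
    a, b = theta
    return a * np.exp(-b * t)                  # toy dynamical model

t_obs = np.linspace(0.0, 1.0, 6)               # only early times are observed
y_obs = model([2.0, 1.5], t_obs) + 0.05 * rng.normal(size=t_obs.size)
t_query = 4.0                                  # extrapolation point of interest
tol = 1.5 * np.sum((y_obs - model([2.0, 1.5], t_obs)) ** 2)  # acceptable fit level

def objective(params, penalty=1e3):
    th1, th2 = params[:2], params[2:]
    fit1 = np.sum((y_obs - model(th1, t_obs)) ** 2)
    fit2 = np.sum((y_obs - model(th2, t_obs)) ** 2)
    deviation = abs(model(th1, t_query) - model(th2, t_query))
    # maximize the prediction gap subject to both fits being acceptable (penalty form)
    return -deviation + penalty * (max(fit1 - tol, 0.0) + max(fit2 - tol, 0.0))

res = minimize(objective, x0=[2.0, 1.5, 2.0, 1.5], method="Nelder-Mead",
               options={"maxiter": 5000, "xatol": 1e-6, "fatol": 1e-6})
th1, th2 = res.x[:2], res.x[2:]
print("prediction deviation at t = 4:", abs(model(th1, t_query) - model(th2, t_query)))

A large deviation indicates that the observed data leave the extrapolation essentially unconstrained, which is exactly the situation additional experiments should target.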