## Tristan Leger: Global existence, scattering, and propagation of moments for inhomogeneous kinetic equations

- Applied Math and Analysis

The derivation of kinetic equations has seen many impressive advances recently, yet the well-posedness and dynamics of these equations remain poorly understood. In this talk I will address such questions and present a method to prove global existence, scattering, and propagation of moments for inhomogeneous kinetic equations. The method uses dispersive estimates for free transport, combined with kinetic-theory techniques, to deal with the specific difficulties brought by the structure of the equation under consideration (e.g. its cross section and the degree of its nonlinearity). I will discuss its concrete implementation for the kinetic wave and Boltzmann equations. This is based on joint work with Ioakeim Ampatzoglou.
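To fix notation (my addition, with conventions that may differ from the speaker's), the equations in question combine free transport with a collision operator, and the dispersive mechanism is the spreading of the free flow:

```latex
% Inhomogeneous kinetic equation: free transport plus a collision operator Q
% whose cross section and degree of nonlinearity depend on the model
% (quadratic for Boltzmann, cubic for the kinetic wave equation).
\partial_t f + v \cdot \nabla_x f = Q[f], \qquad f = f(t, x, v) \ge 0.

% Dispersive estimate for the free flow f(t, x, v) = f_0(x - tv, v):
% the spatial density decays at the rate t^{-d},
\rho_f(t, x) = \int_{\mathbb{R}^d} f_0(x - tv, v)\, dv
  = t^{-d} \int_{\mathbb{R}^d} f_0\!\left(y, \tfrac{x - y}{t}\right) dy
  \le t^{-d} \,\big\| \textstyle\sup_v |f_0(\cdot, v)| \big\|_{L^1_x}.
```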

## Thomas Weighill: Optimal transport methods for visualizing redistricting plans

- Applied Math and Analysis

Ensembles of redistricting plans can be challenging to analyze and visualize because every plan is an unordered set of shapes, and therefore non-Euclidean in at least two ways. I will describe two methods designed to address this challenge: barycenters for partitioned datasets, and a novel dimension-reduction technique based on the Gromov-Wasserstein distance. I will cover some of the theory behind these methods and show how they can help us untangle redistricting ensembles to find underlying trends. This is joint work with Ranthony A. Clark and Tom Needham.
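A minimal sketch of the embedding step, not the authors' pipeline: in their setting the input would be a matrix of Gromov-Wasserstein distances between plans (computable with, e.g., the POT library); here a toy Euclidean distance matrix stands in, and `classical_mds` is my own helper name for classical multidimensional scaling.

```python
import numpy as np

def classical_mds(D, dim=2):
    """Classical multidimensional scaling: embed points in R^dim from a
    matrix D of pairwise distances (double centering, then the top
    eigenpairs of the resulting Gram matrix)."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n       # centering matrix
    B = -0.5 * J @ (D ** 2) @ J               # Gram matrix of centered points
    vals, vecs = np.linalg.eigh(B)
    idx = np.argsort(vals)[::-1][:dim]        # largest eigenvalues first
    return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0.0))

# Toy stand-in: four "plans" with known planar coordinates; in the talk's
# setting D would instead hold Gromov-Wasserstein distances between plans.
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 2.0], [3.0, 4.0]])
D = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)

coords = classical_mds(D, dim=2)
# MDS recovers the configuration up to a rigid motion, so the embedded
# pairwise distances match D.
D_emb = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
print(np.allclose(D_emb, D))  # True
```

With genuinely non-Euclidean inputs such as GW distances, the Gram matrix acquires negative eigenvalues and the embedding is only approximate, which is part of what makes such ensembles interesting to visualize.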

## Xiaoqian Xu: Mixing flow and advection-diffusion-reaction equations

- Applied Math and Analysis ( 0 Views )In the study of incompressible fluid, one fundamental phenomenon that arises in a wide variety of applications is dissipation enhancement by so-called mixing flow. In this talk, I will give a brief introduction to the idea of mixing flow and the role it plays in the field of advection-diffusion-reaction equation. I will also discuss about the examples of such flows in this talk.
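For orientation (my addition, with one standard convention for "enhanced dissipation"; the talk's precise setup may differ), the basic object is a passive scalar advected by a divergence-free field:

```latex
% Passive scalar \theta advected by a divergence-free velocity field u,
% with molecular diffusivity \kappa > 0:
\partial_t \theta + u \cdot \nabla \theta = \kappa \Delta \theta,
\qquad \nabla \cdot u = 0.

% "Dissipation enhancement" by a mixing flow: the L^2 decay rate
% \lambda(\kappa) beats the purely diffusive rate as \kappa \to 0,
\| \theta(t) \|_{L^2} \le C e^{-\lambda(\kappa) t} \| \theta_0 \|_{L^2},
\qquad \lambda(\kappa) \gg \kappa.
```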

## Xiang Cheng: Transformers learn in-context by (functional) gradient descent

- Applied Math and Analysis

Motivated by the in-context learning phenomenon, we investigate how the Transformer neural network can implement learning algorithms in its forward pass. We show that a linear Transformer naturally learns to implement gradient descent, which enables it to learn linear functions in-context. More generally, we show that a non-linear Transformer can implement functional gradient descent with respect to some RKHS metric, which allows it to learn a broad class of functions in-context. Additionally, we show that the RKHS metric is determined by the choice of attention activation, and that the optimal choice of attention activation depends in a natural way on the class of functions that need to be learned. I will end by discussing some implications of our results for the choice and design of Transformer architectures.
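The linear case can be checked in a few lines. This is a hedged numerical sketch of the standard construction (not necessarily the talk's exact parameterization): a single linear-attention head with keys $k_i = x_i$, values $v_i = \eta y_i$, and query $q = x_q$ reproduces the prediction of one gradient-descent step on the in-context least-squares loss, started from $w = 0$.

```python
import numpy as np

rng = np.random.default_rng(0)

# In-context linear regression: n examples (x_i, y_i) with y_i = w_true . x_i,
# plus one query x_q whose label the model must predict.
d, n, lr = 3, 8, 0.1
w_true = rng.normal(size=d)
X = rng.normal(size=(n, d))
y = X @ w_true
x_q = rng.normal(size=d)

# One gradient-descent step on L(w) = 0.5 * sum_i (w . x_i - y_i)^2
# starting from w = 0: the gradient there is -X^T y.
w_gd = lr * X.T @ y
pred_gd = w_gd @ x_q

# One linear-attention head (scores q . k_i, no softmax) with
# keys k_i = x_i, values v_i = lr * y_i, query q = x_q:
scores = X @ x_q                   # q . k_i for each in-context example
pred_attn = scores @ (lr * y)      # sum_i (q . k_i) * v_i

# Both routes compute lr * sum_i y_i (x_i . x_q).
assert np.isclose(pred_gd, pred_attn)
print("predictions agree")
```

Stacking such layers corresponds to taking multiple gradient steps, which is the sense in which depth buys optimization iterations in this picture.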

## Hongkai Zhao: Mathematical and numerical understanding of neural networks: from representation to learning dynamics

- Applied Math and Analysis

In this talk I will present mathematical and numerical analysis, as well as experiments, to study a few basic computational issues in using neural networks to approximate functions: (1) the numerical error that can be achieved given a finite machine precision, (2) the learning dynamics and computational cost of achieving a certain accuracy, and (3) structured and balanced approximation. These issues are investigated for both approximation and optimization, in asymptotic and non-asymptotic regimes.

## Sanchit Chaturvedi: Phase mixing in astrophysical plasmas with an external Kepler potential

- Applied Math and Analysis

In Newtonian gravity, a self-gravitating gas around a massive object such as a star or a planet is modeled via the Vlasov-Poisson equation with an external Kepler potential. The presence of this attractive potential allows for bounded trajectories along which the gas neither falls in toward the object nor escapes to infinity. We focus on this regime and first prove a linear phase-mixing result in 3D, outside symmetry, with the exact Kepler potential. We then prove a long-time nonlinear phase-mixing result in spherical symmetry. The mechanism is phenomenologically similar to Landau damping on a torus, but mathematically the situation is considerably more complex. This is based on an upcoming joint work with Jonathan Luk at Stanford.
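For reference (my addition, with one common sign convention for the gravitational case; the authors' normalization may differ), the model is:

```latex
% Vlasov--Poisson with an attractive external Kepler potential of mass M:
\partial_t f + v \cdot \nabla_x f
  - \nabla_x\!\left( \phi_f + \phi_{\mathrm{ext}} \right) \cdot \nabla_v f = 0,
\qquad \phi_{\mathrm{ext}}(x) = -\frac{M}{|x|},

% with the self-consistent gravitational potential generated by the gas:
\Delta_x \phi_f = \rho_f, \qquad
\rho_f(t, x) = \int_{\mathbb{R}^3} f(t, x, v)\, dv.
```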

## Vakhtang Poutkaradze: Lie-Poisson Neural Networks (LPNets): Data-Based Computing of Hamiltonian Systems with Symmetries

- Applied Math and Analysis

Physics-Informed Neural Networks (PINNs) have received much attention recently due to their potential for high-performance computation for complex physical systems, including data-based computing, systems with unknown parameters, and others. The idea of PINNs is to approximate the equations and the boundary and initial conditions through a loss function for a neural network. PINNs combine the efficiency of data-based prediction with the accuracy and insight provided by physical models. However, applying these methods to predict the long-term evolution of systems with little friction, such as many systems encountered in space exploration, oceanography/climate, and other fields, requires extra care, as the errors tend to accumulate and the results may quickly become unreliable. We provide a solution to the problem of data-based computation of Hamiltonian systems utilizing symmetry methods. Many Hamiltonian systems with symmetry can be written as Lie-Poisson systems, in which the underlying symmetry defines the Poisson bracket. For data-based computing of such systems, we design Lie-Poisson neural networks (LPNets). We consider the Poisson bracket structure primary and require it to be satisfied exactly, whereas the Hamiltonian, known only from physics, can be satisfied approximately. By design, the method preserves all special integrals of the bracket (Casimirs) to machine precision. LPNets yield an efficient and promising computational method for many particular cases, such as rigid body or satellite motion (the group SO(3)), Kirchhoff's equations for an underwater vehicle (the group SE(3)), and others. This is joint work with Chris Eldred (Sandia National Lab), Francois Gay-Balmaz (CNRS and ENS, France), and Sophia Huraka (U Alberta). The work was partially supported by an NSERC Discovery grant.
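To illustrate the exact-bracket idea in the SO(3) case (this is a classical splitting integrator for the free rigid body, shown here as a hand-coded analogue of what LPNets learn from data, not the authors' network): composing flows that are each an exact rotation of the angular momentum $\Pi$ preserves the Casimir $|\Pi|^2$ to machine precision, whatever the step size.

```python
import numpy as np

I = np.array([1.0, 2.0, 3.0])      # principal moments of inertia (toy values)

def axis_rotation(Pi, axis, angle):
    """Rotate Pi about a coordinate axis: an exact Poisson map on so(3)*,
    so it preserves |Pi|^2 exactly."""
    c, s = np.cos(angle), np.sin(angle)
    j, k = (axis + 1) % 3, (axis + 2) % 3
    out = Pi.copy()
    out[j] = c * Pi[j] - s * Pi[k]
    out[k] = s * Pi[j] + c * Pi[k]
    return out

def lie_poisson_step(Pi, dt):
    """Splitting of H = sum_i Pi_i^2 / (2 I_i): the sub-flow of each term
    rotates Pi about axis i by angle Pi_i * dt / I_i (Pi_i is constant
    during its own sub-flow), so the Casimir is conserved exactly."""
    for axis in range(3):
        Pi = axis_rotation(Pi, axis, Pi[axis] * dt / I[axis])
    return Pi

Pi = np.array([0.3, 0.5, 0.8])
casimir0 = Pi @ Pi
for _ in range(10000):
    Pi = lie_poisson_step(Pi, 0.01)
print(abs(Pi @ Pi - casimir0))     # machine-precision drift only
```

The Hamiltonian is only approximately conserved by the splitting, mirroring the paper's design choice: the bracket (and hence the Casimirs) is exact, while the energy is the approximated ingredient.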

## Cole Graham: Fisher-KPP traveling waves in the half-space

- Applied Math and Analysis

Reaction-diffusion equations are widely used to model spatial propagation, and constant-speed "traveling waves" play a central role in their dynamics. These waves are well understood in "essentially 1D" domains like cylinders, but much less is known about waves with noncompact transverse structure. In this direction, we will consider traveling waves of the Fisher-KPP reaction-diffusion equation in the Dirichlet half-space. We will see that minimal-speed waves are unique (unlike faster waves) and exhibit curious asymptotics. The arguments rest on the theory of conformal maps and a powerful connection with the probabilistic system known as branching Brownian motion.

This is joint work with Julien Berestycki, Yujin H. Kim, and Bastien Mallein.
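For concreteness (my addition, in a standard normalization that may differ from the talk's), the setting is:

```latex
% Fisher--KPP equation in the half-space \{(x, y) : y > 0\}
% with Dirichlet condition on the boundary:
\partial_t u = \Delta u + u(1 - u), \qquad u(t, x, 0) = 0,
\qquad 0 \le u \le 1.

% Traveling waves move parallel to the boundary with constant speed c:
u(t, x, y) = \varphi(x - ct, y),
% where, in this normalization, the classical 1D minimal KPP speed is
c_* = 2.
```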

## Zane Li: Interpreting a classical argument for Vinogradov's Mean Value Theorem in decoupling language

- Applied Math and Analysis

There are two proofs of Vinogradov's Mean Value Theorem (VMVT): the harmonic-analysis decoupling proof by Bourgain, Demeter, and Guth from 2015, and the number-theoretic efficient congruencing proof by Wooley from 2017. While there has been some work illustrating the relation between these two methods, VMVT has been around since 1935. It is then natural to ask: what does previous partial progress on VMVT look like in harmonic-analysis language? How similar or different does it look from current decoupling proofs? We discuss a classical argument due to Karatsuba that proves VMVT "asymptotically" and interpret it in decoupling language. This is joint work with Brian Cook, Kevin Hughes, Olivier Robert, Akshat Mudgal, and Po-Lam Yung.
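For readers outside the area (my addition), the statement being proved is the sharp count for the Vinogradov system:

```latex
% J_{s,k}(N) counts integer solutions of the system
x_1^j + \cdots + x_s^j = y_1^j + \cdots + y_s^j
\qquad (1 \le j \le k), \qquad 1 \le x_i, y_i \le N.

% VMVT (Bourgain--Demeter--Guth; Wooley): for every \varepsilon > 0,
J_{s,k}(N) \lesssim_{\varepsilon}
  N^{\varepsilon} \left( N^{s} + N^{2s - \frac{k(k+1)}{2}} \right).
```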