## Zhenyi Chen : A-infinity Sabloff Duality via the LSFT Algebra

- Geometry and Topology. The Chekanov-Eliashberg dga is a powerful invariant for Legendrian links. Using augmentations of this dga, one can truncate its differential to produce linearized contact homology. About two decades ago, Sabloff established a duality in this setting, closely linked to the Poincaré duality of Lagrangian fillings. This truncation has since been generalized into a unital A-infinity category, Aug_+. In this talk, I will present new results that extend Sabloff duality from the level of cochain complexes to A-infinity bimodules over Aug_+. The key tool in this extension is Ng's LSFT algebra, which enlarges the Chekanov-Eliashberg dga. If time permits, I will also discuss how the LSFT algebra encodes additional homotopy coherent data, providing further insights into Sabloff duality.

## Xiaoqian Xu : Mixing flow and advection-diffusion-reaction equations

- Applied Math and Analysis. In the study of incompressible fluids, a fundamental phenomenon that arises in a wide variety of applications is dissipation enhancement by so-called mixing flows. In this talk, I will give a brief introduction to the idea of mixing flows and the role they play in the study of advection-diffusion-reaction equations. I will also discuss examples of such flows.
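
For orientation, here is the standard model equation in this area (a generic formulation, not specific to this talk): a passive scalar $\theta$ transported by a divergence-free velocity field $u$, diffusing with strength $\kappa$, with an optional reaction term $f$:

$$
\partial_t \theta + u \cdot \nabla \theta = \kappa \Delta \theta + f(\theta), \qquad \nabla \cdot u = 0.
$$

Dissipation enhancement refers to the fact that a suitably mixing $u$ transfers $\theta$ to small spatial scales, where diffusion acts faster, so that $\|\theta(t)\|_{L^2}$ can decay much faster than the unmixed heat rate.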

## Calvin McPhail-Snyder : Towards quantum complex Chern-Simons theory

- Geometry and Topology. I will discuss recent joint work (with N. Reshetikhin) defining invariants of knot (and link and tangle) exteriors equipped with flat SL_2(C) connections. The construction is via a geometric version of the Reshetikhin-Turaev construction: it is algebraic and relies on the representation theory of quantum groups. In this talk I will instead focus on the properties of these invariants and explain why I think they are a good candidate for quantum Chern-Simons theory with noncompact gauge group SL_2(C). I will also discuss a connection with (and a generalization of) the Volume Conjecture.

## Sergey Cherkis : Gravitational Instantons: the Tesseron Landscape

- Geometry and Topology. Since their introduction in Euclidean quantum gravity in the mid-1970s, hyperkähler gravitational instantons (aka tesserons) have found use in string theory and in supersymmetric quantum field theory. Their classification was recently completed, and now their parameter space is being explored. We propose a systematic program of realizing each of these spaces as a moduli space of monopoles: the monopolization program. Monopolization reveals the combinatorial and geometric structure of the parameter space of all these spaces, equips each space with various natural structures (tautological bundles, Dirac-type operators, etc.), and connects different types of integrable systems associated to these gravitational instantons.

## Theo McKenzie : Eigenvalue rigidity for random regular graphs

- Probability. Random regular graphs form a ubiquitous model for chaotic systems. However, the spectral properties of their adjacency matrices have proven difficult to analyze because of the strong dependence between different entries. In this talk, I will describe recent work showing that, despite this, the fluctuations of the eigenvalues of the adjacency matrix are of the same order as for Gaussian matrices. This gives an optimal error term for Friedman's theorem that the second eigenvalue of the adjacency matrix of a random regular graph converges to the spectral radius of the infinite regular tree. Crucial ingredients are a tight analysis of the Green's function of the adjacency operator and an analysis of how the Green's function changes after a random edge switch. This is based on joint work with Jiaoyang Huang and Horng-Tzer Yau.
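
For context, two standard objects referenced above (stated here for orientation; these are not the talk's new results). Friedman's theorem says that for a random $d$-regular graph, with high probability

$$
\lambda_2 \le 2\sqrt{d-1} + o(1),
$$

where $2\sqrt{d-1}$ is the spectral radius of the adjacency operator of the infinite $d$-regular tree, and the Green's function in question is the resolvent of the adjacency matrix $A$,

$$
G(z) = (A - z)^{-1}, \qquad z \in \mathbb{C} \setminus \mathbb{R}.
$$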

## Xiang Cheng : Transformers learn in-context by (functional) gradient descent

- Applied Math and Analysis. Motivated by the in-context learning phenomenon, we investigate how the Transformer neural network can implement learning algorithms in its forward pass. We show that a linear Transformer naturally learns to implement gradient descent, which enables it to learn linear functions in-context. More generally, we show that a non-linear Transformer can implement functional gradient descent with respect to some RKHS metric, which allows it to learn a broad class of functions in-context. Additionally, we show that the RKHS metric is determined by the choice of attention activation, and that the optimal choice of attention activation depends in a natural way on the class of functions that need to be learned. I will end by discussing some implications of our results for the choice and design of Transformer architectures.
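
A minimal numpy sketch of the linear case (an illustration in the spirit of the result, with hand-chosen rather than learned projection matrices; the names and dimensions below are mine, not the paper's): a single unnormalized linear self-attention head acting on context tokens $(x_i, y_i)$ and a query token $(x_q, 0)$ reproduces the prediction of one gradient-descent step on the in-context least-squares loss.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n, lr = 5, 64, 1.0

# In-context linear regression data: y_i = <w_true, x_i>.
w_true = rng.normal(size=d)
X = rng.normal(size=(n, d))
y = X @ w_true
x_q = rng.normal(size=d)

# One gradient-descent step from w0 = 0 on L(w) = (1/2n) sum_i (<w, x_i> - y_i)^2
# gives w1 = (lr/n) X^T y, hence the prediction <w1, x_q>.
w1 = (lr / n) * X.T @ y
pred_gd = w1 @ x_q

# A single linear attention head with hand-chosen projections reproduces this.
# Tokens z_i = (x_i, y_i); query z_q = (x_q, 0). Keys/queries read off the
# x-part of each token, values read off the y-part; attention is unnormalized.
Z = np.hstack([X, y[:, None]])           # (n, d+1) context tokens
z_q = np.concatenate([x_q, [0.0]])       # query token
W_kq = np.eye(d + 1)[:d]                 # (d, d+1): projects a token to its x-part
scores = (Z @ W_kq.T) @ (W_kq @ z_q)     # <x_i, x_q> for each context token
attn_out = (lr / n) * scores @ Z[:, -1]  # (lr/n) * sum_i <x_i, x_q> y_i

assert np.isclose(pred_gd, attn_out)
print(f"GD prediction {pred_gd:.6f} == attention output {attn_out:.6f}")
```

Training such a layer on random regression prompts recovers weights of this form, which is the sense in which a linear Transformer "implements" gradient descent in its forward pass.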

## Laura Wakelin : Finding characterising slopes for all knots

- Geometry and Topology. A slope p/q is characterising for a knot K if the oriented homeomorphism type of the 3-manifold obtained by performing Dehn surgery of slope p/q on K uniquely determines the knot K. For any knot K, there exists a bound C(K) such that any slope p/q with |q| ≥ C(K) is characterising for K. This bound has previously been constructed for certain classes of knots, including torus knots, hyperbolic knots and composite knots. In this talk, I will give an overview of joint work with Patricia Sorya in which we complete this realisation problem for all remaining knots.
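
In symbols (restating the definition above): $p/q$ is characterising for $K \subset S^3$ if, for every knot $K'$,

$$
S^3_{p/q}(K) \cong S^3_{p/q}(K') \ \text{(as oriented manifolds)} \implies K' = K,
$$

and the theorem asserts that for every knot $K$ there is a constant $C(K)$ such that every slope $p/q$ with $|q| \ge C(K)$ is characterising for $K$.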

## Benjamin Seeger : Equations on the Wasserstein space and applications

- Probability. The purpose of this talk is to give an overview of recent work involving differential equations posed on spaces of probability measures and their use in analyzing controlled multi-agent systems. The study of such systems has seen increased interest in recent years, due to their ubiquity in applications coming from macroeconomics, social behavior, and telecommunications. When the number of agents becomes large, the model can be formally replaced by one involving a mean-field description of the population, analogously to similar models in statistical physics. Justifying this continuum limit is often nontrivial and is sensitive to the type of stochastic noise influencing the population, i.e., idiosyncratic or systemic. We will describe settings for which the convergence to mean field stochastic control problems can be resolved through the analysis of a certain Hamilton-Jacobi-Bellman equation posed on Wasserstein spaces. In particular, we develop new stability and regularity results for the equations. These allow for new convergence results for more general problems, for example, zero-sum stochastic differential games of mean-field type. We conclude with a discussion of some further problems for which the techniques for equations on Wasserstein space may be amenable.
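
One representative form of such an equation (a simplified model with controlled drift and idiosyncratic noise of intensity $\sigma$ only; the talk's setting is more general): the value function $V : [0,T] \times \mathcal{P}_2(\mathbb{R}^d) \to \mathbb{R}$ of a mean-field control problem solves

$$
\partial_t V(t,\mu) + \int_{\mathbb{R}^d} \inf_{a \in A} \Big\{ b(x,a) \cdot D_\mu V(t,\mu)(x) + L(x,a) \Big\} \, \mu(dx) + \frac{\sigma^2}{2} \int_{\mathbb{R}^d} \mathrm{tr}\big( D_x D_\mu V(t,\mu)(x) \big) \, \mu(dx) = 0,
$$

with $V(T,\cdot)$ given by the terminal cost, where $D_\mu$ denotes the Lions derivative on the Wasserstein space $\mathcal{P}_2(\mathbb{R}^d)$.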

## Hongkai Zhao : Mathematical and numerical understanding of neural networks: from representation to learning dynamics

- Applied Math and Analysis. In this talk I will present mathematical and numerical analysis, as well as experiments, to study a few basic computational issues in using neural networks to approximate functions: (1) the numerical error that can be achieved given finite machine precision, (2) the learning dynamics and computational cost of achieving a given accuracy, and (3) structured and balanced approximation. These issues are investigated for both approximation and optimization, in asymptotic and non-asymptotic regimes.
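
As a toy illustration of issue (1) (my example, not from the talk): evaluating the same target function in low and high precision isolates the round-off floor below which no network, however expressive, can certify approximation error.

```python
import numpy as np

# Toy illustration: finite machine precision puts a floor under the
# approximation error any network evaluation can achieve. Comparing a
# low-precision evaluation of sin against a float64 reference isolates
# pure round-off error, which no amount of training can reduce.
x = np.linspace(0.0, 1.0, 10_001)
ref = np.sin(x)  # float64 reference values
for dtype in (np.float16, np.float32):
    lowp = np.sin(x.astype(dtype)).astype(np.float64)
    floor = np.abs(lowp - ref).max()
    print(f"{np.dtype(dtype).name}: eps = {np.finfo(dtype).eps:.1e}, "
          f"round-off floor ~ {floor:.1e}")
```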