Paul Aspinwall : String Theory and Geometry

  -   Graduate/Faculty Seminar ( 200 Views )

Karin Leiderman : Mathematical Modeling of Thrombosis

  -   Mathematical Biology ( 107 Views )

Lek-Heng Lim : Multilinear Algebra and Its Applications

  -   Applied Math and Analysis ( 109 Views )

In mathematics, the study of multilinear algebra is largely limited to properties of a whole space of tensors --- tensor products of k vector spaces, modules, vector bundles, Hilbert spaces, operator algebras, etc. There is also a tendency to take an abstract, coordinate-free approach. In most applications, instead of a whole space of tensors, we are often given just a single tensor from that space; it usually takes the form of a hypermatrix, i.e., a k-dimensional array of numerical values that represents the tensor with respect to some coordinates/bases determined by the units and nature of the measurements. How, then, does one analyze this single tensor? If the order of the tensor is k = 2, then the hypermatrix is just a matrix and we have access to a rich collection of tools: rank, determinant, norms, singular values, eigenvalues, condition number, etc. This talk is about the case when k > 2. We will see that one may often define higher-order analogues of common matrix notions rather naturally: tensor ranks, hyperdeterminants, tensor norms (Hilbert-Schmidt, spectral, Schatten, Ky Fan, etc.), tensor eigenvalues and singular values, and so on. We will discuss both the utility and the difficulties of various tensorial analogues of matrix problems. In particular, we shall look at how tensors arise in a variety of applications, including computational complexity, control engineering, mathematical biology, neuroimaging, quantum computing, signal processing, spectroscopy, and statistics.
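As a concrete illustration of one of the tensorial analogues mentioned in the abstract (not taken from the talk itself), the sketch below uses the alternating higher-order power method to approximate the largest singular value of a 3rd-order hypermatrix, i.e., its spectral norm, together with the corresponding singular vectors. It assumes only NumPy; the function and variable names are illustrative.

```python
# Minimal sketch: alternating higher-order power iteration for the best
# rank-1 approximation sigma * (u outer v outer w) of a 3rd-order tensor T.
# The resulting sigma is a tensor singular value; its maximum over unit
# vectors is the spectral norm of T.
import numpy as np

def rank1_power_method(T, iters=200, seed=0):
    rng = np.random.default_rng(seed)
    n1, n2, n3 = T.shape
    # random unit-norm starting vectors, one per mode
    u = rng.standard_normal(n1); u /= np.linalg.norm(u)
    v = rng.standard_normal(n2); v /= np.linalg.norm(v)
    w = rng.standard_normal(n3); w /= np.linalg.norm(w)
    for _ in range(iters):
        # contract T against two of the three vectors, then renormalize
        u = np.einsum('ijk,j,k->i', T, v, w); u /= np.linalg.norm(u)
        v = np.einsum('ijk,i,k->j', T, u, w); v /= np.linalg.norm(v)
        w = np.einsum('ijk,i,j->k', T, u, v); w /= np.linalg.norm(w)
    sigma = np.einsum('ijk,i,j,k->', T, u, v, w)
    return sigma, u, v, w

# Example: a 3 x 4 x 5 hypermatrix of (random) measurements
T = np.random.default_rng(1).standard_normal((3, 4, 5))
sigma, u, v, w = rank1_power_method(T)
residual = np.linalg.norm(T - sigma * np.einsum('i,j,k->ijk', u, v, w))
print(sigma, residual)
```

Unlike the k = 2 case, where the power method recovers the true largest singular value, for k > 2 this iteration is only guaranteed to converge to a locally optimal rank-1 approximation, and computing the spectral norm exactly is NP-hard in general; this is one instance of the difficulties of tensorial analogues alluded to in the abstract.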