# Sayan Mukherjee : Geometric Perspectives on Supervised Dimension Reduction

The statistical problem of supervised dimension reduction (SDR) is the following: given observations of high-dimensional explanatory variables and a univariate response variable, find a submanifold or subspace of the explanatory variables that predicts the response. It is generally assumed that the data are concentrated on a low-dimensional manifold in the high-dimensional space of explanatory variables.

The gradient of the manifold will be shown to be a central quantity in the problem of SDR. We will present a regularization algorithm for inferring the gradient given data. We will prove that the rate of convergence of the gradient estimate to the true gradient on the manifold is of the order of the dimension of the manifold, not of the much larger ambient space.

The second part of the talk will rephrase the problem of SDR in a classical probabilistic (Bayesian) setting of mixture models of multivariate normals. An interesting result of this procedure is that the subspaces relevant to prediction are drawn from a posterior distribution on Grassmannian manifolds. For both methods, efficacy on simulated and real data will be shown.
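The central role of the gradient can be illustrated numerically. The sketch below (not the speaker's algorithm; the synthetic model, bandwidth, and all names are illustrative assumptions) estimates pointwise gradients of the regression function by kernel-weighted local linear fits and recovers a predictive direction from the averaged gradient outer product matrix, whose top eigenvectors span the estimated SDR subspace.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data (assumed model): the response depends on 10-D inputs
# only through a single direction b.
n, p = 500, 10
b = np.zeros(p)
b[0] = 1.0                                  # true index direction
X = rng.normal(size=(n, p))
y = np.sin(X @ b) + 0.05 * rng.normal(size=n)

def local_gradients(X, y, bandwidth=2.0):
    """Estimate the gradient of the regression function at each sample
    point via a Gaussian-kernel-weighted local linear regression."""
    n, p = X.shape
    grads = np.zeros((n, p))
    for i in range(n):
        d = X - X[i]                        # displacements from the i-th point
        w = np.exp(-np.sum(d**2, axis=1) / (2 * bandwidth**2))
        sw = np.sqrt(w)                     # sqrt weights for weighted least squares
        A = np.hstack([np.ones((n, 1)), d]) # intercept + linear terms
        coef, *_ = np.linalg.lstsq(A * sw[:, None], y * sw, rcond=None)
        grads[i] = coef[1:]                 # slope part approximates the gradient
    return grads

G = local_gradients(X, y)
M = G.T @ G / n                             # empirical gradient outer product matrix
eigvals, eigvecs = np.linalg.eigh(M)
v = eigvecs[:, -1]                          # top eigenvector: estimated SDR direction

print(abs(v @ b))                           # close to 1 if the direction is recovered
```

Because every true gradient of `sin(x @ b)` points along `b`, the outer product matrix is approximately rank one, and its leading eigenvector aligns with the predictive direction even though the ambient dimension is ten.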

**Category**: Applied Math and Analysis
**Duration**: 01:34:53
**Date**: October 19, 2009 at 4:25 PM
**Views**: 117
**Tags:** seminar, Elliptic Curves Working Seminar
