
Sayan Mukherjee: Geometric Perspectives on Supervised Dimension Reduction (Oct 19, 2009 4:25 PM)

The statistical problem of supervised dimension reduction (SDR) is the following: given observations of high-dimensional explanatory variables and a univariate response variable, find a submanifold or subspace of the explanatory variables that predicts the response. It is generally assumed that the data are concentrated on a low-dimensional manifold in the high-dimensional space of explanatory variables.

The gradient of the regression function on the manifold will be shown to be a central quantity in the problem of SDR. We will present a regularization algorithm for inferring the gradient given data, and we will prove that the gradient estimate converges to the gradient of the true function on the manifold at a rate governed by the dimension of the manifold and not the much larger dimension of the ambient space.

The second part of the talk rephrases the problem of SDR in a classical probabilistic (Bayesian) setting of mixture models of multivariate normals. An interesting result of this procedure is that the subspaces relevant to prediction are drawn from a posterior distribution on Grassmannian manifolds. For both methods, efficacy on simulated and real data will be shown.
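To make the gradient-based idea concrete, here is a minimal sketch of a gradient outer product approach to SDR: estimate the gradient of the regression function at each sample with a plain local linear fit over nearest neighbours, average the gradient outer products, and take the top eigenvectors as the predictive subspace. This uses an unregularized local-linear estimator as a stand-in for the regularization algorithm discussed in the talk; the function name, neighbourhood size `k`, and target dimension `d` are illustrative assumptions, not the talk's actual method.

```python
import numpy as np

def gradient_outer_product_subspace(X, y, k=15, d=2):
    """Sketch: estimate a d-dimensional predictive subspace for y given X.

    (1) At each sample, estimate the gradient of the regression function
        by a least-squares local linear fit over its k nearest neighbours.
    (2) Average the gradient outer products over all samples.
    (3) Return the top-d eigenvectors of the averaged matrix; they span
        the estimated subspace relevant to predicting the response.
    """
    n, p = X.shape
    G = np.zeros((p, p))
    for i in range(n):
        dists = np.linalg.norm(X - X[i], axis=1)
        nbrs = np.argsort(dists)[1:k + 1]      # k nearest neighbours (excluding self)
        dX = X[nbrs] - X[i]                    # local coordinates around X[i]
        dy = y[nbrs] - y[i]
        # local linear model: dy ≈ dX @ grad, solved by least squares
        grad, *_ = np.linalg.lstsq(dX, dy, rcond=None)
        G += np.outer(grad, grad) / n
    evals, evecs = np.linalg.eigh(G)           # eigenvalues in ascending order
    return evecs[:, ::-1][:, :d]               # top-d directions
```

On data where the response depends on a single direction of the explanatory variables, the leading estimated direction should align closely with that direction.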
