Many modern problems in data science aim to efficiently and accurately extract important features and make predictions from high-dimensional, large data sets. Naturally occurring structure in the data underpins the success of many contemporary approaches, but large gaps between theory and practice remain. In this talk, I will present recent progress on two methods for nonparametric regression, each of which can be viewed as the projection of a lifted formulation of the problem with a simple stochastic or convex geometric description; the projection then encapsulates the structure in the data. In particular, I will first describe how the theory of stationary random tessellations in stochastic geometry can address the computational and theoretical challenges of random decision forests with non-axis-aligned splits. Second, I will present a new approach to convex regression that returns non-polyhedral convex estimators compatible with semidefinite programming. These works open many directions of future work at the intersection of stochastic and convex geometry, machine learning, and optimization.
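To make the contrast with ordinary random forests concrete, here is a minimal sketch of a non-axis-aligned (oblique) split: instead of thresholding a single coordinate, the data are split by thresholding a projection onto a random direction. This is only an illustration of the splitting rule mentioned above, not the speaker's method; the function name and the median threshold are assumptions for the example.

```python
import numpy as np

def random_oblique_split(X, rng):
    """Split rows of X by the sign of a projection onto a random direction.

    An axis-aligned split thresholds one coordinate of X; an oblique split
    thresholds a random linear combination of coordinates instead.
    (Illustrative sketch; the median threshold is an arbitrary choice.)
    """
    d = X.shape[1]
    direction = rng.standard_normal(d)
    direction /= np.linalg.norm(direction)   # unit vector defining the cut
    projections = X @ direction              # project each point onto it
    threshold = np.median(projections)       # split at the median projection
    left = projections <= threshold
    return left, ~left

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 2))
left, right = random_oblique_split(X, rng)
print(left.sum(), right.sum())  # median split -> two halves of 50 points
```

Recursively applying such splits partitions space into general convex polytopes rather than axis-aligned boxes, which is where the connection to stationary random tessellations arises.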