Many modern problems in data science aim to efficiently and accurately extract important features and make predictions from high-dimensional, large-scale data sets. Naturally occurring structure in the data underpins the success of many contemporary approaches, but large gaps between theory and practice remain. In this talk, I will present recent progress on two different methods for nonparametric regression, each of which can be viewed as the projection of a lifted formulation of the problem with a simple stochastic or convex-geometric description, allowing the projection to encapsulate the structure of the data. In particular, I will first describe how the theory of stationary random tessellations in stochastic geometry can address the computational and theoretical challenges of random decision forests with non-axis-aligned splits. Second, I will present a new approach to convex regression that returns non-polyhedral convex estimators compatible with semidefinite programming. These works open many directions for future research at the intersection of stochastic and convex geometry, machine learning, and optimization.
We call a non-trivial homology 3-sphere a Kirby-Ramanujam sphere if it bounds a homology plane, that is, a smooth complex algebraic surface with the same homology groups as the complex plane. In this talk, we present several infinite families of Kirby-Ramanujam spheres bounding Mazur-type 4-manifolds, i.e., compact contractible smooth 4-manifolds built with only 0-, 1-, and 2-handles. Such an interplay between complex surfaces and 4-manifolds was first observed by Ramanujam and Kirby around the 1980s. This is upcoming joint work with Rodolfo Aguilar Aguilar.