Geometry and Topology in Dimension Reduction
Sayan Mukherjee, Duke University
Fine Hall 214
In the first part of the talk we describe how learning the gradient of a regression function can be used for supervised dimension reduction (SDR). We provide an algorithm for learning gradients from high-dimensional data, theoretical guarantees for the algorithm, and a statistical interpretation, and we present comparisons to other methods on real and simulated data. In the second part of the talk we present preliminary results on using the Laplacian on forms for dimension reduction. This involves understanding higher-order versions of the isoperimetric inequality for both manifolds and abstract simplicial complexes.
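
As an illustration of the gradient-based approach to SDR (a minimal sketch, not the specific estimator or guarantees of the talk), the following Python example estimates gradients of the regression function by local linear fits and takes the top eigenvectors of the resulting gradient outer product matrix as the reduction directions. The function names, the neighbourhood size k, and the toy data are hypothetical choices for the example.

import numpy as np

def local_linear_gradients(X, y, k=20):
    """Estimate the gradient of the regression function at each sample
    by a local linear fit over its k nearest neighbours (illustrative,
    not the estimator used in the talk)."""
    n, p = X.shape
    grads = np.zeros((n, p))
    for i in range(n):
        dist = np.linalg.norm(X - X[i], axis=1)
        idx = np.argsort(dist)[:k]              # k nearest neighbours of x_i
        Z = np.hstack([np.ones((k, 1)), X[idx] - X[i]])
        coef, *_ = np.linalg.lstsq(Z, y[idx], rcond=None)
        grads[i] = coef[1:]                     # slope of the local fit
    return grads

def sdr_directions(X, y, d=2, k=20):
    """Supervised dimension reduction from estimated gradients: the top
    eigenvectors of the gradient outer product matrix span the subspace."""
    G = local_linear_gradients(X, y, k)
    Gamma = G.T @ G / len(X)                    # (1/n) sum_i grad_i grad_i^T
    w, V = np.linalg.eigh(Gamma)
    return V[:, np.argsort(w)[::-1][:d]]        # columns = SDR directions

# Toy usage: the response depends on X only through two coordinates.
rng = np.random.default_rng(0)
X = rng.standard_normal((300, 10))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.1 * rng.standard_normal(300)
B = sdr_directions(X, y, d=2)
print(B.shape)                                  # (10, 2): basis of the estimated subspace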
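
As background for the second part (standard material, not a statement of the talk's new results), the Laplacian on k-forms and its combinatorial analogue on a simplicial complex can be written as follows; for k = 0, Cheeger's inequality is the classical link between the spectrum of the Laplacian and the isoperimetric constant.

% Hodge Laplacian on k-forms of a Riemannian manifold M
\Delta_k = d_k^{*} d_k + d_{k-1} d_{k-1}^{*},
\qquad d_k : \Omega^k(M) \to \Omega^{k+1}(M).

% Combinatorial analogue on a simplicial complex,
% with boundary maps \partial_k on k-chains
L_k = \partial_{k+1} \partial_{k+1}^{\top} + \partial_k^{\top} \partial_k.

% k = 0: Cheeger's inequality bounds the first nonzero eigenvalue
% \lambda_1 of \Delta_0 by the isoperimetric (Cheeger) constant h
\lambda_1 \ge \frac{h^2}{4}.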