A New Approach to Robust and Flexible High-dimensional Statistics
Friday, February 4, 3PM
Dimensionality reduction is core to modern statistics and learning; many methods are based on structural assumptions on the data. Popular examples of such structural models include sparsity (e.g., in the LASSO, compressed sensing, etc.), rank deficiency (in PCA and derivative methods), sparse Markov structure, and so on. This talk introduces dimensionality reduction via the simultaneous use of more than one structural model. Our approach yields methods that are much more widely applicable, and significantly more robust, than existing ones, often with only slightly larger computational complexity. We present new methods, and corresponding analytical results, for (a) PCA in the presence of arbitrary outliers, (b) PCA in the presence of grossly corrupted data, (c) robust collaborative filtering, (d) multiple sparse regression with partially shared sparsity, and (e) graph/correlation clustering. Our methods are based on convex optimization. The talk aims to be self-contained.
Host: K. Ren