Estimating High-dimensional Matrices: Statistical Limits and Computational Barriers
Prof. Yihong Wu
Statistical inference of large matrices arises frequently in the analysis of massive datasets, in applications such as principal component analysis, matrix completion, and functional genomics. In this talk I introduce a new machinery for studying estimation of high-dimensional matrices, which yields tight non-asymptotic minimax rates for a large collection of loss functions in a variety of problems. Based on the convex geometry of finite-dimensional Banach spaces, the minimax rate of the oracle (unconstrained) denoising problem is determined for all unitarily invariant norms. This result is then extended to denoising with submatrix sparsity (noisy biclustering), where the excess risk depends on the sparsity constraints in a completely different manner. Moreover, the approach is applicable to matrix completion under the low-rank constraint and extends far beyond the normal mean model. In the final part of the talk, I will give an example where attaining the minimax rate is provably hard in a complexity-theoretic sense. This observation reveals that there can exist a significant gap between the statistical fundamental limit and what can be achieved by computationally efficient procedures. This talk is based on joint work with Zongming Ma (Penn).
Yihong Wu received the B.E. degree from Tsinghua University, Beijing, China, in 2006 and the M.A. and Ph.D. degrees from Princeton University in 2008 and 2011, respectively, all in electrical engineering. Prior to joining the ECE department at the University of Illinois in 2013 as an assistant professor, he was a postdoctoral fellow with the Statistics Department at the Wharton School of the University of Pennsylvania. His research interests are in information theory, high-dimensional statistics, stochastic control, and optimal transportation.