Estimating low-rank matrices using principal components analysis (PCA) is ubiquitous in data analysis workflows throughout science and engineering. Somewhat surprisingly, simple variants of PCA that incorporate structure, side information, or model flexibility face significant information-theoretic and computational challenges. Crucially, we do not understand whether we can incorporate structure in matrix estimation reliably and efficiently using low-complexity algorithms.
I will illustrate how we can overcome these challenges using tools and insights from the theory of mean-field spin glasses in statistical physics. As concrete examples, I will use two special cases of matrix estimation: the hidden clique problem and stochastic block models. In this context, I will show how these tools yield state-of-the-art low-complexity algorithms for inference. I will also demonstrate how they uncover new information-theoretic and computational threshold phenomena.
Yash Deshpande received a B.Tech. in Electrical Engineering from the Indian Institute of Technology, Bombay in 2011, and an M.S. and Ph.D. in Electrical Engineering from Stanford University in 2016. He is presently a Schramm Fellow at the MIT Department of Mathematics and Microsoft Research New England.