Discrete structure recovery is an important topic in modern high-dimensional inference. Examples of discrete structure include clustering labels, ranks of players, and signs of variables in a regression model. In the first part of my talk, I will discuss a general algorithmic framework that efficiently learns the discrete structure of the data. The general algorithm is a simple iterative procedure that provably converges to an error rate that is minimax optimal. When applied to specific examples, it recovers important algorithms such as Lloyd's iteration and iterative feature matching. In the second part of my talk, I will discuss a hypothesis testing problem for discrete structure inference. The detection boundary of this problem has five different regions, and it can be achieved adaptively by a variant of the higher criticism test.
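As a concrete illustration of the kind of iterative procedure the abstract mentions, the sketch below implements classical Lloyd's iteration for clustering: alternate between estimating cluster centers from the current labels and reassigning each point to its nearest center. This is a generic textbook version for context only, not the specific framework or analysis from the talk; the function name and the data in the usage are illustrative choices.

```python
import numpy as np

def lloyd(X, z0, n_iter=50):
    """Classical Lloyd's iteration for k-means clustering.

    X  : (n, d) array of data points
    z0 : length-n array of initial cluster labels in {0, ..., k-1}
    Returns the final labels and the estimated cluster centers.
    """
    z = z0.copy()
    k = int(z.max()) + 1
    centers = None
    for _ in range(n_iter):
        # Update step: each center is the mean of the points currently assigned to it
        centers = np.array([X[z == c].mean(axis=0) for c in range(k)])
        # Assignment step: move each point to its nearest center
        dist = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        z_new = dist.argmin(axis=1)
        if np.array_equal(z_new, z):  # labels stable: converged
            break
        z = z_new
    return z, centers
```

Starting from labels that are only partially correct, the procedure typically converges in a few iterations on well-separated clusters, which is the regime in which sharp error-rate guarantees for such iterations are usually stated.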
Chao Gao is an assistant professor at the University of Chicago. He obtained his PhD from Yale in 2016 under the supervision of Harrison Zhou. His research areas are nonparametric and high-dimensional statistics, network analysis, Bayesian theory, and robust statistics.