Abstract: Distributional limits in large systems are often the key to understanding fundamental limits or to designing algorithms for inference and learning problems. Yet such distributional limit properties are sometimes not easily recognized, or exist only in indirect forms. I would like to discuss two pieces of work with this flavor.
One piece of work (with Jain, Koehler, and Mossel, COLT 2019) concerns inference on regular trees. It is shown that in any near-optimal recursive reconstruction algorithm, the posterior expectation must tend to a Gaussian in the Wasserstein-2 distance, even when the reconstruction nodes are memory-limited. This resolves in the positive a conjecture of Peres (circa 2000) about the reconstruction threshold in the presence of memory limits.
The second piece of work (with Rigollet, NeurIPS 2019) proposes a framework for evaluating and comparing methods of constructing "knockoff variables" in the false discovery rate (FDR) control procedure of Barber and Candès. The key idea is to leverage and extend Gaussian limit results for the LASSO estimator. Gaussian limits of this type originally appeared in the replica analysis of multiuser detection, and I would like to give a slightly different, information-theoretic explanation of the genesis of this Gaussianity.
Bio: Jingbo Liu is a Wiener postdoc at the MIT Institute for Data, Systems, and Society (IDSS). He obtained his PhD from Princeton University and his BE from Tsinghua University, both in electrical engineering. As a postdoc at MIT, he has worked on several topics in statistical inference, including fundamental limits and algorithms for structured data (graphical models); inference under system constraints (memory, communication complexity); and methods of false discovery rate control for feature selection. During his PhD at Princeton, he worked on various topics related to information theory, including security, multiuser, interactive, and non-asymptotic information theory, and presented semi-plenary and TPC choice talks at ISIT. His thesis proposed novel approaches to information-theoretic converses using methods from high-dimensional probability and functional analysis, and was awarded the Thomas M. Cover Dissertation Award of the IEEE Information Theory Society. His undergraduate thesis, on the robustness of nonconvex optimization, won the best undergraduate thesis award of Tsinghua University.