Reconstructing probability distributions from projections is a fundamental problem in many scientific applications. Geometric and information-theoretic inequalities provide important mathematical tools for understanding the behavior of such projections---in particular, for characterizing extremal distributions with respect to different lower-dimensional properties of interest. This talk will consist of two parts: First, we introduce new methods to bound the size of an unseen geometric object using information derived from its lower-dimensional projections. Second, we present a new information inequality that relates the entropy of a random variable to that of its lower-dimensional marginals. Both parts highlight the advantages of working with information inequalities instead of their equivalent geometric or functional formulations.
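As a point of reference for the two themes above (an illustration, not a claim about the talk's new results), the classical Loomis--Whitney inequality bounds the volume of a body by its coordinate projections, and Han's inequality is its well-known entropy analogue:

```latex
% Loomis--Whitney: for a compact body K in R^n with coordinate-hyperplane
% projections \pi_i(K), the volume satisfies
\[
  |K|^{\,n-1} \;\le\; \prod_{i=1}^{n} \bigl|\pi_i(K)\bigr|.
\]
% Entropy analogue (Han's inequality): writing X_{\setminus i} for the
% marginal of (X_1,\dots,X_n) with coordinate i omitted,
\[
  (n-1)\, H(X_1,\dots,X_n) \;\le\; \sum_{i=1}^{n} H\bigl(X_{\setminus i}\bigr).
\]
```

These standard inequalities exemplify the geometric/information-theoretic correspondence the abstract refers to.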
Varun Jog received his B.Tech. degree in Electrical Engineering from IIT Bombay in 2010, and his Ph.D. in Electrical Engineering and Computer Sciences (EECS) from UC Berkeley in 2015. Since 2016, he has been an Assistant Professor in the Electrical and Computer Engineering Department and a fellow at the Grainger Institute for Engineering at the University of Wisconsin–Madison. His research interests include information theory, machine learning, and network science. He is a recipient of the Eli Jury Award from the EECS Department at UC Berkeley (2015) and the Jack Keil Wolf student paper award at ISIT 2015.
This seminar is supported with funds from the Korhammer Lecture Series.