Princeton University

School of Engineering & Applied Science

Why and How to Estimate Mutual Information

Kartik Venkat
E-Quad, B205
Monday, February 16, 2015 - 4:30pm

Mutual information emerged in the latter half of the twentieth century as the answer to the most fundamental questions in compression and communication. Since then, however, it has been adopted and widely used for inference in a variety of other disciplines spanning science and engineering. The first part of the talk will be dedicated to a recent set of results that justify and explain the wide use of mutual information as an inferential tool. The subsequent and main part of the talk will be dedicated to a new approach to estimating mutual information between random objects whose distributions reside in high-dimensional spaces. The resulting estimators enjoy strong theoretical performance guarantees. At the same time, their potential for boosting inferential power when used in lieu of traditional estimators appears to be significant, as will be demonstrated with timely applications on real and simulated data. The last part will be dedicated to the speaker's additional related recent and ongoing work, and some future directions.

Kartik Venkat is a Ph.D. candidate in the Department of Electrical Engineering at Stanford University. His research interests include statistical inference, information theory, machine learning, and their applications in genomics, wireless networks, neuroscience, and quantitative finance. Kartik received a Bachelor's degree in Electrical Engineering from the Indian Institute of Technology, Kanpur in 2010, and a Master's degree in Electrical Engineering from Stanford University in 2012. His honors include a Stanford Graduate Fellowship for Engineering and Sciences, the Numerical Technologies Founders Prize, and a Jack Keil Wolf ISIT Student Paper Award at the 2012 International Symposium on Information Theory.