Princeton University

School of Engineering & Applied Science

Approximation Algorithms for Large-Scale Data Analysis

Speaker: Ludwig Schmidt, Massachusetts Institute of Technology
Location: E-Quad, B205
Date/Time: Thursday, March 16, 2017, 4:30pm to 5:30pm

Abstract:
Over the past decade, the scale of computational problems in machine learning has grown tremendously. At the same time, the slowdown of Moore’s Law is making it harder to apply existing algorithms to large data sets. Since these trends will only continue for the foreseeable future, we need faster algorithms for many computational problems in machine learning.
 
In this talk, I will show how ideas from approximation algorithms can be used to speed up multiple key tasks in machine learning. Since the underlying statistical problems are inherently noisy, computing a good approximate solution often leads to significantly faster algorithms that match the statistical performance of their exact counterparts. One such connection is in the area of constrained estimation, which underlies many problems such as compressive sensing, sparse linear regression, and matrix completion. By introducing approximate projections, I will connect classical tools from theoretical computer science with these estimation tasks.
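To make the role of the projection step concrete, here is a minimal sketch (not the speaker's algorithm or code) of projected gradient descent for sparse linear regression. The projection here is plain hard thresholding onto k-sparse vectors; in the structured settings the talk refers to, an exact projection of this kind can be too expensive, and a fast approximate projection would take its place. All names and parameters below are illustrative assumptions.

```python
# Minimal sketch: projected gradient descent for sparse linear regression,
# y ~ A x with x assumed k-sparse.  Hard thresholding stands in for the
# projection; the talk's approximate projections would replace this step
# for constraint sets where exact projection is too slow.
import numpy as np

def hard_threshold(x, k):
    """Keep the k largest-magnitude entries of x, zero out the rest."""
    out = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-k:]
    out[idx] = x[idx]
    return out

def projected_gradient(A, y, k, step=None, iters=100):
    """Estimate a k-sparse x from y = A x + noise via projected gradient descent."""
    n = A.shape[1]
    if step is None:
        step = 1.0 / np.linalg.norm(A, 2) ** 2   # conservative step size
    x = np.zeros(n)
    for _ in range(iters):
        grad = A.T @ (A @ x - y)                 # gradient of 0.5 * ||A x - y||^2
        x = hard_threshold(x - step * grad, k)   # (approximate) projection step
    return x

# Toy usage: 100 noisy measurements of a 10-sparse signal in R^400.
rng = np.random.default_rng(0)
m, n, k = 100, 400, 10
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
y = A @ x_true + 0.01 * rng.standard_normal(m)
x_hat = projected_gradient(A, y, k)
print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```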
 
In the second part of the talk, I will turn to classification and give strong evidence that approximation is inherently necessary in order to achieve sub-quadratic running time for widely used learning algorithms such as kernel methods and neural networks. On the other hand, I will show how ideas from nearest neighbor algorithms can exploit problem structure and speed up networks with a large number of output classes.
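As one illustration of the nearest-neighbor connection (an assumption-laden sketch, not the speaker's method): predicting the top class in a network with many output classes amounts to a maximum inner product search (MIPS) over the class vectors of the last layer, and a standard reduction turns MIPS into Euclidean nearest-neighbor search, where fast (approximate) nearest-neighbor indexes can replace the full scan over all classes. The sizes and names below are hypothetical.

```python
# Minimal sketch: the last layer picks argmax_c <w_c, h>, a maximum inner
# product search over class vectors w_c.  Appending one coordinate makes
# Euclidean nearest-neighbor search equivalent to that argmax, so any
# (approximate) NN index could be used instead of scoring every class.
import numpy as np

rng = np.random.default_rng(1)
num_classes, dim = 20_000, 64
W = rng.standard_normal((num_classes, dim))   # class vectors w_c of the output layer
h = rng.standard_normal(dim)                  # hidden representation of one input

# Brute force: score every class (what a dense output layer effectively does).
brute_force = int(np.argmax(W @ h))

# MIPS -> NN reduction: pad each w_c so all augmented vectors have equal norm M;
# then the Euclidean nearest neighbor of the padded query is the argmax class.
norms = np.linalg.norm(W, axis=1)
M = norms.max()
W_aug = np.hstack([W, np.sqrt(M**2 - norms**2)[:, None]])
h_aug = np.append(h, 0.0)
# An exact or approximate nearest-neighbor index could now be built on W_aug;
# here the reduction is just verified with a direct distance computation.
nn = int(np.argmin(np.linalg.norm(W_aug - h_aug, axis=1)))

print(brute_force == nn)  # True: both approaches pick the same class
```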
 
Bio:
Ludwig Schmidt is a PhD student at MIT, advised by Prof. Piotr Indyk. Ludwig’s research interests revolve around algorithmic aspects of machine learning, statistics, and signal processing. Ludwig received a Google PhD Fellowship in machine learning, a Simons-Berkeley research fellowship, and a best paper award at the International Conference on Machine Learning (ICML).