Princeton University

School of Engineering & Applied Science

Good Margins Make Good Neighbors

Aryeh Kontorovich, Ben-Gurion University of the Negev
Engineering Quadrangle B205
Thursday, September 18, 2014 - 4:30pm to 5:30pm

<strong>Abstract:</strong> Although well known by practitioners to be an effective classification tool, nearest-neighbor methods have been somewhat neglected by learning theory of late. The goal of this talk is to revive interest in this time-tested technique by recasting it in a modern perspective. We will present a paradigm of margin-regularized 1-nearest neighbor classification which: (i) is Bayes-consistent; (ii) yields simple, usable finite-sample error bounds; (iii) provides for very efficient algorithms with a principled speed-accuracy tradeoff; and (iv) allows for near-optimal sample compression. Further extensions include multiclass classification, regression, and metric dimensionality reduction. I will argue that the regularized 1-nearest neighbor is superior to k-nearest neighbors in several crucial statistical and computational aspects.
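To give a rough flavor of the ideas above (and only a flavor — this is a hypothetical toy sketch, not the margin-regularized algorithm presented in the talk), one can pair the classical 1-NN rule with a greedy margin-based sample compression: discard any training point that lies within a margin `gamma` of an already-kept point of the same label, then classify by the nearest kept point. The function names and the `gamma` parameter below are illustrative assumptions.

```python
# Toy illustration (an assumption, not the talk's algorithm): greedy
# margin-based compression of the training sample, followed by the
# plain 1-nearest-neighbor classification rule.
from math import dist


def compress(points, labels, gamma):
    """Keep a subset of training points such that every discarded point
    lies within distance gamma of a kept point with the same label."""
    kept = []
    for p, y in zip(points, labels):
        if not any(dist(p, q) <= gamma and y == z for q, z in kept):
            kept.append((p, y))
    return kept


def predict(kept, x):
    """Classify x by the label of its nearest kept point (1-NN rule)."""
    return min(kept, key=lambda pair: dist(x, pair[0]))[1]


train_x = [(0.0, 0.0), (0.1, 0.0), (1.0, 1.0), (1.1, 1.0)]
train_y = [0, 0, 1, 1]
net = compress(train_x, train_y, gamma=0.5)
print(len(net), predict(net, (0.9, 0.9)))  # → 2 1
```

The compressed set here has two points instead of four, and prediction still only consults the kept points — a crude analogue of the speed-accuracy tradeoff and sample compression mentioned in the abstract.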

Based on a series of joint works with Lee-Ad Gottlieb, Robert Krauthgamer, and Roi Weiss.

<strong>Biography:</strong> Aryeh Kontorovich received his undergraduate degree in mathematics, with a certificate in applied mathematics, from Princeton University in 2001. He earned his M.Sc. and Ph.D. from Carnegie Mellon University, graduating in 2007. After a postdoctoral fellowship at the Weizmann Institute of Science, he joined the Computer Science department at Ben-Gurion University of the Negev in 2009 as an assistant professor, a position he currently holds. His research interests are mainly in machine learning, with a focus on probability, statistics, automata theory, and metric spaces.