Many problems arising in scientific and engineering applications are naturally formulated as optimization problems, most of which are nonconvex. For nonconvex problems, even obtaining a local minimizer is computationally hard in theory, let alone a global minimizer. In practice, however, simple numerical methods often work surprisingly well, finding high-quality solutions to the specific problems at hand.
In this talk, I will describe our recent effort to bridge this mysterious theory-practice gap in nonconvex optimization. I will highlight a family of nonconvex problems that can be solved to global optimality using simple numerical methods, independent of initialization. This family shares a characteristic global structure: (1) all local minimizers are global, and (2) all saddle points have directions of negative curvature. Problems in this family arise in applications across machine learning, signal processing, scientific imaging, and more. I will focus on two examples we have worked out: learning sparsifying bases for massive data and recovering complex signals from phaseless measurements. In both examples, the benign global structure allows us to derive geometric insights and computational results that were inaccessible to previous methods. In contrast, alternative approaches to nonconvex problems often entail either expensive convex relaxation (e.g., solving large-scale semidefinite programs) or delicate problem-specific initializations.
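The two geometric properties above can be illustrated on a toy problem. The sketch below (my own illustrative example, not one of the problems from the talk) minimizes f(x) = (x^2 - 1)^2, which has global minimizers at x = ±1, no spurious local minimizers, and a saddle-like critical point at x = 0 with negative curvature. Plain gradient descent from a random initialization escapes the bad critical point and reaches a global minimizer:

```python
import numpy as np

def f(x):
    # Nonconvex objective: global minima at x = +1 and x = -1,
    # and a critical point at x = 0 with negative curvature (f''(0) = -4).
    return (x**2 - 1)**2

def grad(x):
    return 4 * x * (x**2 - 1)

rng = np.random.default_rng(0)
x = rng.normal()              # random initialization
for _ in range(1000):
    x -= 0.01 * grad(x)       # plain gradient descent, fixed step size

# x is now close to +1 or -1, and f(x) is close to 0
print(x, f(x))
```

Random initialization puts x at the curvature-escaping critical point x = 0 with probability zero, so gradient descent almost surely converges to one of the global minimizers; this is the one-dimensional analogue of the "benign global structure" described above.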
Completing and enriching this framework is an active research endeavor being undertaken by several research communities. At the end of the talk, I will discuss open problems whose solutions would move the framework forward.
Ju Sun is a postdoctoral scholar at Stanford University, working with Professor Emmanuel Candès. Prior to this, he received his Ph.D. degree from the Department of Electrical Engineering at Columbia University in 2016 (2011 – 2016) and his B.Eng. degree in computer engineering (with a minor in mathematics) from the National University of Singapore in 2008 (2004 – 2008). His research interests lie at the intersection of computer vision, machine learning, numerical optimization, signal/image processing, information theory, and compressive sensing, focused on modeling, harnessing, and computing with low-dimensional structures in massive high-dimensional data, with practical algorithms and correctness guarantees. Recently, he has been particularly interested in explaining the surprising effectiveness of nonconvex optimization heuristics arising from practical problems. He maintains a webpage dedicated to this topic: http://sunju.org/research/nonconvex/ . He won a best student paper award at SPARS'15.
When Are Nonconvex Optimization Problems Not Scary?