BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Date iCal//NONSGML kigkonsult.se iCalcreator 2.20.2//
METHOD:PUBLISH
X-WR-CALNAME;VALUE=TEXT:Electrical Engineering
BEGIN:VTIMEZONE
TZID:America/New_York
BEGIN:STANDARD
DTSTART:20191103T020000
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
TZNAME:EST
END:STANDARD
BEGIN:DAYLIGHT
DTSTART:20200308T020000
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
TZNAME:EDT
END:DAYLIGHT
END:VTIMEZONE
BEGIN:VEVENT
UID:calendar.3771.field_events_date.0@ee.princeton.edu
DTSTAMP:20191016T063300Z
CATEGORIES:Electrical Engineering\, Seminars
CREATED:20190826T170223Z
DESCRIPTION:Abstract:\nStochastic iterative methods lie at the core of larg
e-scale optimization and its modern applications to data science. Though s
uch algorithms are routinely and successfully used in practice on highly i
rregular problems (e.g. deep neural networks)\, few performance guarantees
are available outside of smooth or convex settings. In this talk\, I will
describe a framework for designing and analyzing stochastic methods on a
large class of nonsmooth and nonconvex problems\, with provable efficiency
guarantees. The problem class subsumes such important tasks as phase retr
ieval\, robust PCA\, and minimization of risk measures\, while the methods
include stochastic subgradient\, Gauss-Newton\, and proximal point iterat
ions. The main thread of the proposed framework is appealingly intuitive.
I will show that a wide variety of stochastic methods can be interpreted a
s inexact gradient descent on an implicit smoothing of the problem.\nOptim
al learning rates and novel sample-complexity guarantees (for various sign
al and matrix recovery problems) follow quickly from this viewpoint.\nBio:
\nDmitriy Drusvyatskiy received his PhD from the Operations Research and I
nformation Engineering department at Cornell University in 2013\, followed
by a postdoctoral appointment in the Combinatorics and Optimization depa
rtment at Waterloo\, 2013-2014. He joined the Mathematics department at t
he University of Washington as an Assistant Professor in 2014\, and was p
romote
d to an Associate Professor in 2019.\nDmitriy's research broadly focuses o
n designing and analyzing algorithms for large-scale optimization problems
\, primarily motivated by applications in data science. Dmitriy has receiv
ed a number of awards\, including the Air Force Office of Scientific Resea
rch (AFOSR) Young Investigator Program (YIP) Award\, NSF CAREER\, INFORMS
Optimization Society Young Researcher Prize 2019\, and finalist citations
for the Tucker Prize 2015 and the Young Researcher Best Paper Prize at ICC
OPT 2019. Dmitriy is currently a co-PI of the NSF-funded Transdisciplinary
Research in Principles of Data Science (TRIPODS) institute at the Univers
ity of Washington.
DTSTART;TZID=America/New_York:20191114T163000
DTEND;TZID=America/New_York:20191114T173000
LAST-MODIFIED:20190930T161035Z
LOCATION:Engineering QUAD (B205)
SUMMARY:Convergence Rates of Stochastic Algorithms in Nonsmooth Nonconvex O
ptimization
URL;TYPE=URI:https://ee.princeton.edu/events/convergence-rates-stochastic-a
lgorithms-nonsmooth-nonconvex-optimization
END:VEVENT
END:VCALENDAR