Wednesday, February 26, 2020 - 10:00am to 11:45am
Location: ASA Conference Room 6115, Gates Hillman Centers
Speaker: NAVID AZIZAN, Ph.D. Candidate, http://www.its.caltech.edu/~nazizanr/
Large-Scale Optimization and Nonconvexity: From Deep Learning to Energy Markets
One of the main challenges in designing and analyzing optimization algorithms for large-scale systems is nonconvexity. In this talk, I will focus on an important, and perhaps the most prevalent, nonconvex problem of the modern era: training deep neural networks. I will present recent results showing that, while these problems are nonconvex, they are overparameterized in a way that simplifies the optimization and makes stochastic descent algorithms converge to global minima, which can be interpreted as a “blessing” of dimensionality. I will show, using both theoretical and experimental results, that in these overparameterized settings the optimization algorithms play a critical role in the generalization of deep learning by biasing the model towards special solutions, a phenomenon referred to as implicit regularization. Further, I will present an approach for controlling the form of implicit regularization using the family of stochastic mirror descent (SMD) algorithms, which generalizes the popular stochastic gradient descent (SGD). The results I present include both characterization theorems and an experimental exploration of the implicit regularization of SMD algorithms. Beyond deep learning, I will also briefly discuss some of my other work on large-scale optimization, highlighting the importance of nonconvexity for market design in the smart grid via an exploration of the problem of pricing in markets with nonconvex costs.
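As a rough illustration of the SMD family mentioned above (a minimal sketch, not the speaker's implementation), mirror descent performs the gradient step in a "mirror" space defined by a potential function; with the q-norm potential psi(w) = (1/q)·||w||_q^q, the choice q = 2 recovers plain SGD, while other values of q bias the algorithm toward different interpolating solutions in an overparameterized problem. The toy regression setup, step size, and epoch count below are all illustrative assumptions:

```python
import numpy as np

def smd(X, y, q=2.0, lr=1e-2, epochs=300, seed=0):
    """Stochastic mirror descent with the q-norm potential
    psi(w) = (1/q) * ||w||_q^q.  Setting q=2 makes the mirror map
    the identity, so the update reduces to ordinary SGD."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    v = np.sign(w) * np.abs(w) ** (q - 1)  # mirror variable: grad of psi at w
    for _ in range(epochs):
        for i in rng.permutation(n):
            g = (X[i] @ w - y[i]) * X[i]   # stochastic gradient of squared loss
            v -= lr * g                    # gradient step in the mirror space
            w = np.sign(v) * np.abs(v) ** (1.0 / (q - 1))  # map back to primal
    return w

# Hypothetical overparameterized toy problem: more parameters (20) than
# data points (5), so many weight vectors interpolate the data exactly;
# the potential (via q) controls which interpolant SMD converges to.
rng = np.random.default_rng(1)
X = rng.standard_normal((5, 20))
y = rng.standard_normal(5)
w2 = smd(X, y, q=2.0)  # q=2: SGD, implicitly biased toward the min-l2 interpolant
```

With q = 2 the mirror map is the identity and the loop is exactly SGD; changing q changes only the map between the primal and mirror variables, which is what lets this one family express a range of implicit regularizers.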
Navid Azizan is a fifth-year Ph.D. candidate in Computing and Mathematical Sciences (CMS) at the California Institute of Technology (Caltech), where he is co-advised by Adam Wierman and Babak Hassibi. He was named an Amazon Fellow in Artificial Intelligence in 2017 and a PIMCO Fellow in Data Science in 2018. Additionally, he was a research intern at Google DeepMind in 2019 and received the ITA Graduation-Day Gold Award in 2020. His research on electricity markets received the ACM GreenMetrics Best Student Paper Award in 2016. He was also the first-place winner and a gold medalist at the 2008 National Physics Olympiad in Iran. His research interests broadly lie in mathematical optimization, machine learning, networks, and control. His work has focused on the design and analysis of optimization algorithms for nonconvex and networked problems, with real-world applications in deep learning, energy markets, distributed computation, and social networks. He received the B.Sc. degree in EE from Sharif University of Technology and the M.Sc. degree in ECE from the University of Southern California, in 2013 and 2015, respectively.
Faculty Host: Mor Harchol-Balter
Computer Science Department