Faculty Search Candidate: Ioannis Panageas
480 Dreese Labs
2015 Neil Ave, Columbus, Ohio 43210
Optimization and Multiplicative Weights Update Algorithm under the Lens of Dynamical Systems
In this talk we will describe two of our recent results. The first concerns first-order methods, including gradient descent, block coordinate descent, mirror descent, and variants thereof. Specifically, we will show that these well-known algorithms avoid saddle points for almost all initializations. Our techniques provide a unified theoretical framework for analyzing the asymptotic behavior of a wide variety of classic algorithms in non-convex optimization. The second result concerns the Multiplicative Weights Update (MWU) algorithm for learning in games. We will show that the classic MWU converges to exact Nash equilibria in every congestion game, and that this convergence does not carry over to the exponential MWU variant, for which chaotic behavior may occur.
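To give a flavor of the first result, the following sketch (an illustration of the general phenomenon, not taken from the talk) runs gradient descent on f(x, y) = x^2 - y^2, which has a strict saddle at the origin: for almost every random initialization, the component along the negative-curvature direction grows geometrically, so the iterates escape the saddle.

```python
import numpy as np

# f(x, y) = x**2 - y**2 has a strict saddle at (0, 0).
def grad(p):
    x, y = p
    return np.array([2 * x, -2 * y])

rng = np.random.default_rng(0)
p = rng.normal(scale=1e-3, size=2)  # random start near the saddle
eta = 0.1                           # step size

for _ in range(100):
    p = p - eta * grad(p)

# The x-coordinate contracts toward 0 (factor 0.8 per step), while the
# y-coordinate expands (factor 1.2 per step): the iterates leave the saddle.
print(abs(p[0]), abs(p[1]))
```

Only initializations exactly on the stable manifold (here, the x-axis) fail to escape, and that set has measure zero, which is the intuition behind the "almost all initializations" statement.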
The connecting thread is that both results can be analyzed from a dynamical-systems perspective.
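For the second result, a minimal sketch (again an illustration of the linear MWU dynamics, not the talk's construction): two players choose between two parallel links whose cost equals their load. The pure Nash equilibria split the players across the links, and the linear MWU update drives the mixed strategies toward one of them.

```python
import numpy as np

eps = 0.1                  # MWU learning rate
p = np.array([0.6, 0.4])   # player 1's mixed strategy over the two links
q = np.array([0.4, 0.6])   # player 2's mixed strategy

for _ in range(500):
    # Expected cost of each link: 1 (own load) + prob. the opponent is there.
    c1 = 1.0 + q
    c2 = 1.0 + p
    # Linear MWU: multiply each weight by (1 - eps * cost), then renormalize.
    p = p * (1 - eps * c1); p = p / p.sum()
    q = q * (1 - eps * c2); q = q / q.sum()

# p -> (1, 0) and q -> (0, 1): a pure Nash equilibrium of the congestion game.
print(np.round(p, 3), np.round(q, 3))
```

The exponential variant replaces the factor (1 - eps * cost) with exp(-eps * cost); per the abstract, that seemingly innocuous change can destroy convergence and produce chaotic trajectories.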
Bio: Dr. Ioannis Panageas is a postdoctoral fellow at MIT. He received his PhD in Algorithms, Combinatorics and Optimization from the Georgia Institute of Technology in 2016, under the guidance of Prasad Tetali. His work lies at the intersection of optimization, probability, learning theory, dynamical systems, and algorithms. He received numerous awards and fellowships during his graduate studies, including ARC, ACO, and Onassis fellowships.
Host: Tamal Dey