
Faculty Candidate: Anastasios Kyrillidis

University of Texas at Austin



Anastasios Kyrillidis
480 Dreese Labs
2015 Neil Avenue
Columbus, Ohio 43210

Rethinking algorithms in Data Science: Scaling up optimization using non-convexity, provably

With the quantity of generated data ever-increasing in most research areas, conventional data analytics run into severe computational, storage, and communication bottlenecks. These obstacles often force practitioners to rely on algorithmic heuristics in an attempt to convert data into useful information quickly. It is necessary to rethink algorithmic design and devise smarter, provable methods that flexibly balance the trade-offs among solution accuracy, efficiency, and data interpretability.

In this talk, I will focus on the problem of low-rank matrix inference in large-scale settings. Such problems appear in fundamental applications such as structured inference, recommendation systems, and multi-label classification. I will introduce a novel theoretical framework for analyzing the performance of non-convex first-order methods, which are often used as heuristics in practice. These methods offer computational gains over classic convex approaches, but for most problems their analysis has been lacking. This talk will provide precise theoretical guarantees, answering the long-standing question of why such non-convex techniques behave well in practice for a wide class of problems.
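To make the setting concrete, below is a minimal sketch (in Python/NumPy) of one common non-convex first-order approach to low-rank matrix inference: gradient descent on a factored parameterization X = U V^T for matrix completion. The function name factored_gd, the step size, and the synthetic data are illustrative assumptions, not the specific algorithms or guarantees presented in the talk.

    # Illustrative sketch (not the speaker's specific method): factored
    # gradient descent for low-rank matrix completion. The rank-r estimate
    # is parameterized as X = U @ V.T and plain gradient steps are taken
    # on the squared error over the observed entries.
    import numpy as np

    def factored_gd(M_obs, mask, rank, step=1e-2, iters=2000, seed=0):
        """M_obs: matrix with observed entries; mask: 1 where observed, 0 elsewhere."""
        rng = np.random.default_rng(seed)
        m, n = M_obs.shape
        U = rng.standard_normal((m, rank)) * 0.1
        V = rng.standard_normal((n, rank)) * 0.1
        for _ in range(iters):
            X = U @ V.T
            R = mask * (X - M_obs)      # residual on observed entries only
            U -= step * (R @ V)         # gradient step for U
            V -= step * (R.T @ U)       # gradient step for V
        return U @ V.T

    # Small synthetic example: recover a rank-2 matrix from ~50% of its entries.
    rng = np.random.default_rng(1)
    M = rng.standard_normal((50, 2)) @ rng.standard_normal((2, 40))
    mask = (rng.random(M.shape) < 0.5).astype(float)
    M_hat = factored_gd(mask * M, mask, rank=2)
    print("relative error:", np.linalg.norm(M_hat - M) / np.linalg.norm(M))

The factored parameterization makes the objective non-convex, but it only stores the two thin factors rather than the full matrix, which is the source of the computational and storage gains mentioned above.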

Bio: Anastasios Kyrillidis received his PhD in Electrical and Computer Engineering from the École Polytechnique Fédérale de Lausanne (EPFL) in 2014. He is currently a Simons Fellowship postdoctoral researcher at the University of Texas at Austin. His research interests include convex and non-convex optimization and analysis, large-scale machine learning, and high-dimensional data analysis.


Host: Leon Wang