Focus on Faculty Talk: Brian Kulis

Thursday, October 9, 2014, 4:00 pm
Ohio State - CSE
480 Dreese Labs
2015 Neil Ave
Columbus, OH

Scalable Nonparametric Machine Learning

Nonparametric methods in machine learning do not fix the number of parameters (the model complexity) upfront, but instead allow the data itself to determine the number of parameters, making these methods attractive for many data analysis tasks.  Unfortunately, many nonparametric methods scale poorly with the number of data points, making them difficult to use in large-scale settings.  Much of my recent research is concerned with developing new tools that allow for scalable nonparametric analysis of data.  In this talk, I will focus on a technique called small-variance asymptotics, which can be used to transform rich nonparametric statistical models into scalable combinatorial optimization problems.  The classical (parametric) clustering setting offers the guiding analogy: the EM algorithm for mixtures of Gaussians approaches the k-means algorithm as the variance of the clusters goes to zero.  I will discuss how to apply the same idea to a number of interesting nonparametric problems, including clustering, feature learning, topic modeling, sequential data analysis, and cluster evolution over time.  We ultimately obtain scalable nonparametric algorithms that are often orders of magnitude faster than existing approaches.
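The parametric analogy above can be illustrated numerically.  The following sketch (not from the talk itself; the data and function names are illustrative) shows that the E-step of EM for a Gaussian mixture with shared isotropic variance sigma^2 produces soft cluster memberships, and that as sigma shrinks toward zero these memberships collapse to the hard nearest-center assignments used by k-means:

```python
import numpy as np

def responsibilities(X, centers, sigma):
    """E-step of EM for a Gaussian mixture with equal mixing weights
    and shared isotropic variance sigma^2: posterior probability that
    each point belongs to each cluster."""
    # Squared distances from each point to each center, shape (n, k).
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    # Subtract the row-wise minimum before exponentiating, for
    # numerical stability (the normalization cancels the shift).
    logits = -(d2 - d2.min(axis=1, keepdims=True)) / (2.0 * sigma**2)
    w = np.exp(logits)
    return w / w.sum(axis=1, keepdims=True)

# Toy data: two points near one center, one point near the other.
X = np.array([[0.0, 0.0], [1.0, 1.0], [9.0, 9.0]])
centers = np.array([[0.5, 0.5], [9.0, 9.0]])

soft = responsibilities(X, centers, sigma=2.0)   # fuzzy memberships
hard = responsibilities(X, centers, sigma=1e-3)  # effectively 0/1

# In the small-variance limit, the E-step reduces to the k-means
# assignment step: pick the nearest center.
kmeans_assign = ((X[:, None, :] - centers[None, :, :]) ** 2) \
    .sum(axis=2).argmin(axis=1)

print(hard.argmax(axis=1))  # [0 0 1]
print(kmeans_assign)        # [0 0 1]
```

With a moderate sigma the memberships are genuinely fractional; with a tiny sigma they match the k-means assignments exactly, which is the intuition behind turning a rich probabilistic model into a combinatorial objective.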

Brian Kulis is an assistant professor of computer science at Ohio State University.  His research focuses on machine learning, statistics, computer vision, data mining, and large-scale optimization.  Previously, he was a postdoctoral fellow at UC Berkeley EECS and was also affiliated with the International Computer Science Institute.  He obtained his PhD in computer science from the University of Texas in 2008, and his BA degree in computer science and mathematics from Cornell University in 2003.  For his research, he has won three best student paper awards at top-tier conferences: two at the International Conference on Machine Learning (in 2005 and 2007) and one at the IEEE Conference on Computer Vision and Pattern Recognition (in 2008).  He was also the recipient of an MCD graduate fellowship from the University of Texas (2003-2007) and an Award of Excellence from the College of Natural Sciences at the University of Texas.