TGDA@OSU Guest Speaker: Dr. Ken Clarkson
480 Dreese Labs
2015 Neil Avenue
Columbus, Ohio 43210
Low-rank PSD Approximation in Input-Sparsity Time
A number of matrices that arise in machine learning and data analysis are symmetric positive semidefinite (PSD), including covariance matrices, kernel matrices, Laplacian matrices, random dot product graph models, and others. A common task with such matrices is to approximate them by a low-rank matrix, for efficiency or for statistical inference; spectral clustering, kernel PCA, manifold learning, and Gaussian process regression can all involve this task. Given a square n by n matrix A, a target rank k, and an error parameter epsilon, we show how to find a PSD matrix B of rank k such that the Frobenius norm of A-B is within a 1+epsilon factor of the best possible; our algorithm needs O(nnz(A) + n*poly(k/epsilon)) time to do this, where nnz(A) is the number of nonzero entries of A. We also show how to find such a rank-k PSD matrix B of the form CUC^T, where the O(k/epsilon) columns of C are a subset of the columns of A, with a similar runtime.
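For context, the classical baseline that the talk's algorithm approximates is the truncated eigendecomposition: for a symmetric matrix A, the best rank-k PSD approximation in Frobenius norm keeps the k largest nonnegative eigenvalues. The sketch below (using NumPy; all names are illustrative, not from the talk) computes this exactly, at O(n^3) cost rather than the O(nnz(A) + n*poly(k/epsilon)) time of the talk's algorithm.

```python
import numpy as np

def best_rank_k_psd(A, k):
    """Exact best rank-k PSD approximation of symmetric A in Frobenius norm.

    O(n^3) baseline via full eigendecomposition; the talk's algorithm
    achieves a (1+epsilon)-approximation of this in input-sparsity time.
    """
    w, V = np.linalg.eigh(A)       # eigenvalues in ascending order
    w = np.clip(w, 0.0, None)      # drop negative eigenvalues (project onto PSD cone)
    idx = np.argsort(w)[::-1][:k]  # indices of the k largest eigenvalues
    return (V[:, idx] * w[idx]) @ V[:, idx].T

# Example: a covariance-like PSD matrix
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))
A = X @ X.T                        # 100 x 100 PSD matrix of rank <= 20
B = best_rank_k_psd(A, 5)
print(np.linalg.matrix_rank(B))    # 5
```

The CUC^T form mentioned in the abstract is different: there C is restricted to a subset of A's actual columns (a column-subset / Nystrom-style factorization), which the eigendecomposition baseline does not provide.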
Joint work with David Woodruff.
Bio: Ken Clarkson is a research staff member and manager of the theory group at IBM Research -- Almaden. He received his PhD from Stanford University, advised by Andrew C.-C. Yao. He was a member of the technical staff of AT&T Bell Laboratories, Lucent Bell Labs, and Alcatel-Lucent Bell Labs, and has taught at Stanford University and the University of Pennsylvania. He has been a recipient of Best Paper Awards at the Vehicular Technology Conference and at STOC. His research focuses on geometric and matrix algorithms.
Hosts: Tamal Dey and Yusu Wang