Guest Speaker: Samory Kpotufe
Professor Samory Kpotufe
August 30, 4:00 pm, 480 Dreese Labs
Self-tuning in nonparametric regression
Contemporary statistical procedures are making inroads into a diverse range of applications in the natural sciences and engineering. However, it is difficult to use these procedures "off-the-shelf" because they must be properly tuned to the particular application.
In this talk, we present some "adaptive" regression procedures, i.e., procedures that self-tune, optimally, to the unknown parameters of the problem at hand.
We consider regression on a general metric space X of unknown dimension, where the output Y is given as f(x) + noise. We are interested in adaptivity at any input point x in X: the algorithm must self-tune to the unknown "local" parameters of the problem at x. The most important such parameters are (1) the unknown smoothness of f, and (2) the unknown intrinsic dimension, both defined over a neighborhood of x.
Existing results on adaptivity have typically treated these two problem parameters separately, resulting in methods that solve only part of the self-tuning problem.
Using various regressors as examples, we first develop insight into tuning to unknown dimension. We then present an approach for kernel regression which allows simultaneous adaptivity to smoothness and dimension locally at a point x. This latest approach combines intuition for tuning to dimension with intuition from so-called Lepski's method for tuning to smoothness. The overall approach is likely to generalize to other nonparametric methods.
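The talk does not spell out the speaker's actual procedure, but the Lepski-style idea it builds on can be sketched generically: compute kernel estimates at a range of bandwidths and keep the largest bandwidth whose estimate still agrees, up to the noise level, with every estimate at a smaller bandwidth. The sketch below (with a box kernel, a known noise level `sigma`, and hypothetical helper names `nw_estimate` and `lepski_select`) is an illustration of that principle, not the method presented in the talk.

```python
import numpy as np

def nw_estimate(x0, X, Y, h):
    """Nadaraya-Watson estimate of f(x0) with a box kernel of bandwidth h.
    Returns the local average and the number of sample points in the window."""
    mask = np.abs(X - x0) <= h
    n_h = int(mask.sum())
    if n_h == 0:
        return np.nan, 0
    return float(Y[mask].mean()), n_h

def lepski_select(x0, X, Y, bandwidths, sigma=1.0, c=1.0):
    """Simplified Lepski-type rule: pick the largest bandwidth whose estimate
    agrees, within the noise fluctuation, with all smaller-bandwidth estimates.
    (sigma is assumed known here; c is a tuning constant for illustration.)"""
    hs = sorted(bandwidths, reverse=True)                  # largest first
    ests = [(h,) + nw_estimate(x0, X, Y, h) for h in hs]
    for i, (h, est, n_h) in enumerate(ests):
        if n_h == 0:
            continue
        # Accept h only if it is consistent with every smaller bandwidth:
        # large h lowers variance, but excess bias shows up as disagreement.
        consistent = all(
            abs(est - est_s) <= c * sigma * (1 / np.sqrt(n_h) + 1 / np.sqrt(n_s))
            for (_, est_s, n_s) in ests[i + 1:] if n_s > 0
        )
        if consistent:
            return h, est
    return ests[-1][0], ests[-1][1]                        # smallest h fallback
```

Note that the window count n_h, rather than an assumed ambient dimension, drives the variance term, which is one way intuition about unknown intrinsic dimension enters bandwidth selection.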
Bio: Samory Kpotufe is Assistant Professor in the ORFE department at Princeton University. He works in Machine Learning with an emphasis on high-dimensional problems and nonparametric estimation. Samory completed his PhD in 2010 at the University of California, San Diego (advised by Sanjoy Dasgupta), and spent a few years at the Max Planck Institute for Intelligent Systems (working with Ulrike von Luxburg and Bernhard Schoelkopf), and at TTI-Chicago.