Asymptotics of objective functionals in semi-supervised learning

Speaker: Dejan Slepcev
Institution: Carnegie Mellon
Time: Mon, 10/16/2017 - 4:00pm - 5:10pm
Host: Hongkai Zhao
Location: RH 306

We consider a family of regression problems in a semi-supervised setting. Given real-valued labels on a small subset of the data, the task is to recover the function on the whole data set while taking advantage of the (geometric) structure provided by the large number of unlabeled data points. We represent the geometry of the data set by a random geometric graph. We study objective functions that reward the regularity of the estimator function and impose or reward agreement with the training data. In particular, we consider discrete p-Laplacian and fractional Laplacian regularizations.
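To make the setup concrete, here is a minimal sketch (not from the talk) of the simplest instance of this framework: the p = 2 graph-Laplacian regularization with hard label constraints, on a random geometric graph over points sampled in the unit square. The choice of sample size, connectivity radius, and labeled indices below is hypothetical and purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sample n unlabeled-plus-labeled points in the unit square;
# only a handful of them carry real-valued labels.
n = 300
X = rng.random((n, 2))
labeled = np.array([0, 1, 2])     # hypothetical labeled indices
y = np.array([0.0, 0.5, 1.0])    # their real-valued labels

# Random geometric graph: connect points within distance eps.
eps = 0.2
D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
W = (D2 < eps ** 2).astype(float)
np.fill_diagonal(W, 0.0)

# Unnormalized graph Laplacian L = D - W.
L = np.diag(W.sum(axis=1)) - W

# For p = 2, minimizing the discrete Dirichlet energy
#   sum_{i,j} w_ij (u_i - u_j)^2
# subject to exact agreement with the labels gives a "harmonic
# extension": L u = 0 at every unlabeled node.
unlabeled = np.setdiff1d(np.arange(n), labeled)
u = np.zeros(n)
u[labeled] = y
u[unlabeled] = np.linalg.solve(
    L[np.ix_(unlabeled, unlabeled)],
    -L[np.ix_(unlabeled, labeled)] @ y,
)
```

The scaling question studied in the talk concerns how `eps` may shrink as `n` grows while estimators such as `u` remain well behaved; for fixed labels and too-slowly-decaying nonlocality the estimator can degenerate, which is the interplay the asymptotic analysis makes precise.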
We investigate the asymptotic behavior in the limit where the number of unlabeled points increases while the number of training points remains fixed. We uncover a delicate interplay between the regularizing nature of the functionals considered and the nonlocality inherent to the graph constructions. We rigorously obtain almost optimal ranges on the scaling of the graph connectivity radius for asymptotic consistency to hold. The talk is based on joint works with Matthew Dunlop, Andrew Stuart, and Matthew Thorpe.