Generalization errors of Laplacian regularized least squares regression

Research output: Contribution to journal › Article › peer-review

Abstract

Semi-supervised learning is an emerging computational paradigm in machine learning that aims to make better use of large amounts of inexpensive unlabeled data to improve learning performance. While various methods have been proposed based on different intuitions, the crucial issue of generalization performance is still poorly understood. In this paper, we investigate the convergence properties of Laplacian regularized least squares regression, a semi-supervised learning algorithm based on manifold regularization. Moreover, to the best of our knowledge, this is the first work to present improved error bounds in terms of the numbers of labeled and unlabeled data. The convergence rate depends on the approximation property and the capacity of the reproducing kernel Hilbert space, measured by covering numbers. Since an extra regularizer is introduced, some new techniques are exploited in the analysis.
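The algorithm analyzed here admits a well-known closed-form solution via the representer theorem: the expansion coefficients over all labeled and unlabeled points solve a regularized linear system involving the kernel matrix and the graph Laplacian. The following is a minimal illustrative sketch of that standard Laplacian RLS solution, not the paper's exact experimental setup; the Gaussian kernel width, the fully connected similarity graph, and the parameter names `gamma_A`, `gamma_I` are assumptions for demonstration.

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    # Gaussian (RBF) kernel matrix between rows of A and rows of B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def laprls_fit(X_lab, y_lab, X_unlab, gamma_A=1e-2, gamma_I=1e-2, sigma=1.0):
    # Closed-form Laplacian regularized least squares (sketch).
    # Assumption: the graph weights reuse the same Gaussian kernel,
    # fully connected; a k-NN graph is the more common practical choice.
    X = np.vstack([X_lab, X_unlab])
    l, n = len(X_lab), len(X)
    K = rbf_kernel(X, X, sigma)
    W = rbf_kernel(X, X, sigma)
    L = np.diag(W.sum(axis=1)) - W          # unnormalized graph Laplacian
    J = np.zeros((n, n))
    J[:l, :l] = np.eye(l)                   # selects the labeled rows
    y = np.concatenate([y_lab, np.zeros(n - l)])
    # Representer-theorem coefficients alpha solve:
    #   (J K + gamma_A * l * I + gamma_I * l / n^2 * L K) alpha = y
    A = J @ K + gamma_A * l * np.eye(n) + gamma_I * (l / n ** 2) * (L @ K)
    alpha = np.linalg.solve(A, y)
    return X, alpha, sigma

def laprls_predict(model, X_new):
    # f(x) = sum_i alpha_i k(x, x_i) over all training (labeled + unlabeled) points.
    X, alpha, sigma = model
    return rbf_kernel(X_new, X, sigma) @ alpha
```

With `gamma_I = 0` the system reduces to ordinary kernel ridge regression on the labeled data alone; the Laplacian term penalizes functions that vary rapidly along the data manifold estimated from all points.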

Original language: English
Pages (from-to): 1859-1868
Number of pages: 10
Journal: Science China Mathematics
Volume: 55
Issue number: 9
DOIs
State: Published - Sep 2012

Keywords

  • covering number
  • graph Laplacian
  • learning rate
  • semi-supervised learning

