
Support vector machines regression with l1-regularizer

  • Hongzhi Tong (corresponding author)
  • Di Rong Chen
  • Fenghong Yang

University of International Business and Economics; Central University of Finance and Economics

Research output: Contribution to journal › Article › peer-review

Abstract

The classical support vector machines regression (SVMR) is known as a regularized learning algorithm in reproducing kernel Hilbert spaces (RKHS) with an ε-insensitive loss function and an RKHS-norm regularizer. In this paper, we study a new SVMR algorithm in which the regularization term is proportional to the l1-norm of the coefficients in the kernel ensemble. We provide an error analysis of this algorithm; an explicit learning rate is then derived under some assumptions.
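To make the setup concrete, the following is a minimal sketch (not the paper's algorithm or analysis) of coefficient-based regularization with an ε-insensitive loss: a function f(x) = Σ_j α_j K(x, x_j) is fit by subgradient descent on the empirical ε-insensitive risk plus an l1 penalty on the coefficients α. The Gaussian kernel, the toy sine data, and all parameter values (ε, λ, step size) are illustrative assumptions.

```python
import numpy as np

# Toy data: noisy sine samples (hypothetical, not from the paper)
rng = np.random.default_rng(0)
X = np.sort(rng.uniform(-3, 3, 60))
y = np.sin(X) + 0.1 * rng.standard_normal(60)

def gaussian_kernel(a, b, sigma=0.5):
    # Gaussian kernel matrix K[i, j] = exp(-(a_i - b_j)^2 / (2 sigma^2))
    return np.exp(-(a[:, None] - b[None, :]) ** 2 / (2 * sigma**2))

K = gaussian_kernel(X, X)          # n x n kernel matrix on the sample
eps, lam, lr = 0.05, 0.01, 0.01    # ε-tube width, l1 weight, step size (assumed values)
alpha = np.zeros(len(X))           # coefficients of the kernel ensemble

for _ in range(2000):
    r = K @ alpha - y              # residuals of f(x_i) = Σ_j α_j K(x_i, x_j)
    # Subgradient of the ε-insensitive loss: sign(r) outside the ε-tube, 0 inside
    g_loss = K.T @ (np.sign(r) * (np.abs(r) > eps)) / len(X)
    g_pen = lam * np.sign(alpha)   # subgradient of the l1 penalty λ‖α‖₁
    alpha -= lr * (g_loss + g_pen)

pred = K @ alpha
print("mean |training error|:", np.mean(np.abs(pred - y)))
```

The l1 penalty on the coefficients (rather than the RKHS norm of f, as in classical SVMR) is what encourages sparsity in the kernel ensemble; the error analysis in the paper concerns how such coefficient regularization affects the learning rate.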

Original language: English
Pages (from-to): 1331-1344
Number of pages: 14
Journal: Journal of Approximation Theory
Volume: 164
Issue number: 10
State: Published - 2012

Keywords

  • Coefficient regularization
  • Error decomposition
  • Learning rate
  • Reproducing kernel Hilbert spaces
  • Support vector machines regression
