Abstract
The classical support vector machines regression (SVMR) is a regularized learning algorithm in reproducing kernel Hilbert spaces (RKHS) with an ε-insensitive loss function and an RKHS-norm regularizer. In this paper, we study a new SVMR algorithm in which the regularization term is proportional to the ℓ1-norm of the coefficients in the kernel ensembles. We provide an error analysis of this algorithm, and an explicit learning rate is then derived under some assumptions.
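The objective described in the abstract — an ε-insensitive loss on a kernel expansion plus an ℓ1 penalty on its coefficients — can be sketched numerically. The following is a minimal illustration, not the paper's method: it assumes a Gaussian kernel, uses plain subgradient descent as the solver, and all names (`fit_l1_svr`, the step size, iteration count) are illustrative choices, not taken from the paper.

```python
import numpy as np

def gaussian_kernel(X, Z, sigma=1.0):
    # Gram matrix K[i, j] = exp(-||x_i - z_j||^2 / (2 sigma^2))
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def fit_l1_svr(X, y, eps=0.1, lam=1e-3, lr=0.02, iters=3000, sigma=1.0):
    """Subgradient descent on
       (1/m) * sum_i max(0, |f(x_i) - y_i| - eps) + lam * ||a||_1,
       where f(x) = sum_j a_j K(x, x_j) is the kernel ensemble."""
    m = len(y)
    K = gaussian_kernel(X, X, sigma)
    a = np.zeros(m)
    for _ in range(iters):
        r = K @ a - y
        # Subgradient of the eps-insensitive loss: zero inside the eps-tube.
        g = np.where(np.abs(r) > eps, np.sign(r), 0.0)
        grad = K.T @ g / m + lam * np.sign(a)  # ℓ1 penalty subgradient
        a -= lr * grad
    return a

# Toy usage: fit a noisy sine curve.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, (40, 1))
y = np.sin(np.pi * X[:, 0]) + 0.05 * rng.standard_normal(40)
a = fit_l1_svr(X, y)
pred = gaussian_kernel(X, X) @ a
```

Replacing the RKHS-norm penalty of classical SVMR with this ℓ1 term is what makes the regularizer act directly on the coefficients of the kernel ensemble, which is the setting the paper's error analysis addresses.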
| Original language | English |
|---|---|
| Pages (from-to) | 1331-1344 |
| Number of pages | 14 |
| Journal | Journal of Approximation Theory |
| Volume | 164 |
| Issue number | 10 |
| DOIs | |
| State | Published - 2012 |
Keywords
- Coefficient regularization
- Error decomposition
- Learning rate
- Reproducing kernel Hilbert spaces
- Support vector machines regression