Abstract
Reduced-rank regression (RRR) has been widely used to exploit the dependency among multiple outputs. This paper develops a regularized vector-valued RRR approach, which plays an important role in predicting multiple structured outputs. The estimator of vector-valued RRR is obtained by minimizing the empirical squared reproducing kernel Hilbert space (RKHS) distance between the output feature kernel and all r-dimensional subspaces of the vector-valued RKHS. The algorithm is easily implemented with the kernel trick. We establish the learning rate of the vector-valued RRR estimator under mild assumptions. Moreover, as a reduced-dimensional approximation of the output kernel regression function, the estimator converges in probability to the output regression function when the rank r tends to infinity at an appropriate rate. This implies the consistency of the structured predictor in general settings, especially in the misspecified case where the true regression function is not contained in the hypothesis space. Numerical experiments are provided to illustrate the efficiency of our method.
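The abstract does not spell out the estimator in closed form. As a rough illustration of the general idea only, the sketch below implements a standard reduced-rank kernel ridge regression for vector-valued outputs with a Gaussian kernel, where the rank constraint is imposed by projecting the fitted outputs onto their top-r singular directions. The kernel choice, the regularization parameter, and the projection-based rank reduction are assumptions made for this illustration and are not the authors' exact vector-valued RRR estimator.

```python
# Illustrative sketch only (assumed setup, not the paper's estimator):
# rank-r kernel ridge regression for multi-output Y with a Gaussian kernel.
import numpy as np

def gaussian_kernel(A, B, sigma=1.0):
    # Pairwise Gaussian (RBF) kernel matrix between rows of A and rows of B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def fit_reduced_rank_krr(X, Y, rank, lam=1e-2, sigma=1.0):
    # Full kernel ridge solution: alpha = (K + n*lam*I)^{-1} Y.
    n = X.shape[0]
    K = gaussian_kernel(X, X, sigma)
    alpha = np.linalg.solve(K + n * lam * np.eye(n), Y)
    # Impose the rank constraint by projecting the fitted outputs K @ alpha
    # onto their top-r right singular directions in output space.
    _, _, Vt = np.linalg.svd(K @ alpha, full_matrices=False)
    P = Vt[:rank].T @ Vt[:rank]  # projector onto top-r output directions
    return alpha @ P, sigma

def predict(X_train, model, X_new):
    coef, sigma = model
    return gaussian_kernel(X_new, X_train, sigma) @ coef

# Toy usage: 3-dimensional outputs approximated with a rank-2 predictor.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
Y = np.column_stack([X[:, 0], X[:, 0] + 0.1 * X[:, 1], np.sin(X[:, 2])])
model = fit_reduced_rank_krr(X, Y, rank=2)
Y_hat = predict(X, model, X)
print("training MSE:", np.mean((Y - Y_hat) ** 2))
```

In this simplified construction, letting the rank grow toward the output dimension recovers the full kernel ridge fit, loosely mirroring the abstract's statement that the rank-r estimator approaches the output regression function as r increases appropriately.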
| Original language | English |
|---|---|
| Article number | 101977 |
| Journal | Journal of Complexity |
| Volume | 92 |
| State | Published - Feb 2026 |
Keywords
- Function approximation
- Reduced rank method
- Reproducing kernel Hilbert space
- Statistical learning
- Structured prediction