Regularized reduced-rank regression for structured output prediction

  • Heng Chen
  • Di Rong Chen
  • Kun Cheng*
  • Yang Zhou

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Reduced-rank regression (RRR) has been widely used to strengthen the dependence among multiple outputs. This paper develops a regularized vector-valued RRR approach, which plays an important role in predicting multiple outputs with structure. The vector-valued RRR estimator is obtained by minimizing the empirical squared reproducing kernel Hilbert space (RKHS) distance between the output kernel feature map and all r-dimensional subspaces of a vector-valued RKHS. The algorithm is easily implemented via the kernel trick. We establish the learning rate of the vector-valued RRR estimator under mild assumptions. Moreover, as a reduced-dimensional approximation of the output kernel regression function, the estimator converges in probability to the output regression function when the rank r tends to infinity at an appropriate rate. This implies the consistency of the structured predictor in general settings, in particular in the misspecified case where the true regression function is not contained in the hypothesis space. Numerical experiments illustrate the effectiveness of our method.
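The construction described in the abstract can be illustrated with a minimal sketch of a kernelized reduced-rank regression. This is not the paper's exact estimator (the paper works with an output feature map in a vector-valued RKHS); the sketch below uses the classical RRR recipe with finite-dimensional outputs `Y`, where kernel ridge regression supplies full-rank dual coefficients and the rank-r constraint is imposed by projecting the fitted values onto their top-r right singular subspace. All function names and the regularization parameter `lam` are illustrative assumptions.

```python
import numpy as np

def kernel_rrr_fit(K, Y, rank, lam):
    """Sketch of reduced-rank kernel ridge regression (not the paper's exact estimator).

    K    : (n, n) input Gram matrix on the training sample
    Y    : (n, d) matrix of d-dimensional outputs
    rank : target rank r of the fitted regression
    lam  : ridge regularization parameter
    """
    n = K.shape[0]
    # Full-rank kernel ridge regression dual coefficients: (K + n*lam*I)^{-1} Y
    C = np.linalg.solve(K + n * lam * np.eye(n), Y)
    F = K @ C  # fitted outputs on the training sample
    # Rank-r constraint: project fitted values onto their top-r right singular subspace
    _, _, Vt = np.linalg.svd(F, full_matrices=False)
    P = Vt[:rank].T @ Vt[:rank]  # (d, d) orthogonal projector of rank r
    return C @ P  # reduced-rank dual coefficients

def kernel_rrr_predict(K_test, C_r):
    """K_test: (m, n) cross Gram matrix between test and training inputs."""
    return K_test @ C_r
```

Because the projector `P` has rank r, every prediction lies in an r-dimensional subspace of the output space, which is the finite-dimensional analogue of restricting to r-dimensional subspaces of the vector-valued RKHS.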

Original language: English
Article number: 101977
Journal: Journal of Complexity
Volume: 92
DOIs
State: Published - Feb 2026

Keywords

  • Function approximation
  • Reduced rank method
  • Reproducing kernel Hilbert space
  • Statistical learning
  • Structured prediction
