TY - JOUR
T1 - Unsupervised learning low-rank tensor from incomplete and grossly corrupted data
AU - Meng, Zhijun
AU - Zhou, Yaoming
AU - Zhao, Yongjia
N1 - Publisher Copyright:
© 2018, Springer-Verlag London Ltd., part of Springer Nature.
PY - 2019/12/1
Y1 - 2019/12/1
AB - Low-rank tensor completion and recovery have received considerable attention in the recent literature. Existing algorithms, however, tend to fail when multiway data are simultaneously contaminated by arbitrary outliers and missing values. In this paper, we study the unsupervised tensor learning problem, in which a low-rank tensor is recovered from an incomplete and grossly corrupted multidimensional array. We introduce a unified framework for this problem by replacing the linear projection operator constraint with a simple equation, and further reformulate it as two convex optimization problems through different approximations of the tensor rank. Two globally convergent algorithms, derived from the alternating direction augmented Lagrangian (ADAL) and linearized proximal ADAL methods, respectively, are proposed to solve these problems. Experimental results on synthetic and real-world data validate the effectiveness and superiority of our methods.
KW - Alternating direction augmented Lagrangian (ADAL)
KW - Convex optimization
KW - Low-rank tensor
KW - Tensor recovery
KW - Unsupervised learning
UR - https://www.scopus.com/pages/publications/85057600615
U2 - 10.1007/s00521-018-3899-x
DO - 10.1007/s00521-018-3899-x
M3 - Article
AN - SCOPUS:85057600615
SN - 0941-0643
VL - 31
SP - 8327
EP - 8335
JO - Neural Computing and Applications
JF - Neural Computing and Applications
IS - 12
ER -