TY - JOUR
T1 - Relations among some low-rank subspace recovery models
AU - Zhang, Hongyang
AU - Lin, Zhouchen
AU - Zhang, Chao
AU - Gao, Junbin
PY - 2015
Y1 - 2015
AB - Recovering intrinsic low-dimensional subspaces from data distributed on them is a key preprocessing step in many applications. In recent years, much work has modeled subspace recovery as low-rank minimization problems. We find that some representative models, such as robust principal component analysis (R-PCA), robust low-rank representation (R-LRR), and robust latent low-rank representation (R-LatLRR), are actually deeply connected. More specifically, we discover that once a solution to one of the models is obtained, we can obtain the solutions to the other models in closed-form formulations. Since R-PCA is the simplest, our discovery makes it the center of low-rank subspace recovery models. Our work has two important implications. First, R-PCA has a solid theoretical foundation. Under certain conditions, we could find globally optimal solutions to these low-rank models with overwhelming probability, although these models are nonconvex. Second, we can obtain significantly faster algorithms for these models by solving R-PCA first. The computational cost can be further cut by applying low-complexity randomized algorithms, for example, our novel ℓ2,1 filtering algorithm, to R-PCA. Although a formal proof of our ℓ2,1 filtering algorithm is not yet available, experiments verify the advantages of our algorithm over other state-of-the-art methods based on the alternating direction method.
U2 - 10.1162/NECO_a_00762
DO - 10.1162/NECO_a_00762
M3 - Article
C2 - 26161818
SN - 0899-7667
VL - 27
SP - 1915
EP - 1950
JO - Neural Computation
JF - Neural Computation
IS - 9
ER -