TY - JOUR

T1 - A hybrid algorithm for low-rank approximation of nonnegative matrix factorization

AU - Wang, Peitao

AU - He, Zhaoshui

AU - Xie, Kan

AU - Gao, Junbin

AU - Antolovich, Michael

AU - Tan, Beihai

PY - 2019/10/28

Y1 - 2019/10/28

N2 - Nonnegative matrix factorization (NMF) is a recently developed method for data analysis. So far, most of the known algorithms for NMF are based on alternating nonnegative least squares (ANLS) minimization of the squared Euclidean distance between the original data matrix and its low-rank approximation. In this paper, we first develop a new NMF algorithm, in which a Procrustes rotation and a nonnegative projection are performed alternately. The new algorithm converges very rapidly. Then, we propose a hybrid NMF (HNMF) algorithm that combines the new algorithm with the low-rank approximation based NMF (lraNMF) algorithm. Furthermore, we extend the HNMF algorithm to nonnegative Tucker decomposition (NTD), which leads to a hybrid NTD (HNTD) algorithm. Simulations verify that the HNMF algorithm performs well under various noise conditions, and that HNTD has performance comparable to the low-rank approximation based sequential NTD (lraSNTD) algorithm for sparse representation of tensor objects.

AB - Nonnegative matrix factorization (NMF) is a recently developed method for data analysis. So far, most of the known algorithms for NMF are based on alternating nonnegative least squares (ANLS) minimization of the squared Euclidean distance between the original data matrix and its low-rank approximation. In this paper, we first develop a new NMF algorithm, in which a Procrustes rotation and a nonnegative projection are performed alternately. The new algorithm converges very rapidly. Then, we propose a hybrid NMF (HNMF) algorithm that combines the new algorithm with the low-rank approximation based NMF (lraNMF) algorithm. Furthermore, we extend the HNMF algorithm to nonnegative Tucker decomposition (NTD), which leads to a hybrid NTD (HNTD) algorithm. Simulations verify that the HNMF algorithm performs well under various noise conditions, and that HNTD has performance comparable to the low-rank approximation based sequential NTD (lraSNTD) algorithm for sparse representation of tensor objects.

KW - Alternately updating

KW - Low-rank approximation

KW - Nonnegative matrix factorization

KW - Nonnegative Tucker decomposition

UR - http://www.scopus.com/inward/record.url?scp=85069739255&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85069739255&partnerID=8YFLogxK

U2 - 10.1016/j.neucom.2019.07.059

DO - 10.1016/j.neucom.2019.07.059

M3 - Article

AN - SCOPUS:85069739255

SN - 0925-2312

VL - 364

SP - 129

EP - 137

JO - Neurocomputing

JF - Neurocomputing

ER -