Low rank representation on SPD matrices with log-Euclidean metric

Boyue Wang, Yongli Hu, Junbin Gao, Muhammad Ali, David Tien, Yanfeng Sun, Baocai Yin

Research output: Contribution to journal › Article › peer-review

13 Citations (Scopus)


Symmetric Positive Definite (SPD) matrices, as a kind of effective feature descriptor, have been widely used in pattern recognition and computer vision tasks. The Affine-Invariant Metric (AIM) is a popular way to measure the distance between SPD matrices, but it imposes a high computational burden in practice. Compared with AIM, the Log-Euclidean metric embeds the SPD manifold, via the matrix logarithm, into a Euclidean space in which only classical Euclidean computation is involved. The advantage of using this metric for the non-linear SPD-matrix representation of data has been recognized in domains such as compressed sensing; however, it has received little attention in data clustering. In this paper, we propose a novel Low Rank Representation (LRR) model on the space of SPD matrices with the Log-Euclidean metric (LogELRR), which enables us to handle non-linear data through linear manipulation. To further explore the intrinsic geometric distance between SPD matrices, we embed them into a Reproducing Kernel Hilbert Space (RKHS) to form a family of kernels on SPD matrices based on the Log-Euclidean metric, and construct a novel kernelized LogELRR method. Clustering results on a wide range of datasets, including object images, facial images, 3D objects, texture images and medical images, show that our proposed methods outperform conventional clustering methods.
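The Log-Euclidean metric mentioned above reduces distance computation on the SPD manifold to ordinary Euclidean operations on matrix logarithms, d(X, Y) = ||log X − log Y||_F. A minimal NumPy sketch of this idea follows; the function names and the Gaussian kernel form k(X, Y) = exp(−d²/2σ²) are illustrative assumptions for one common way to build a Log-Euclidean kernel, not the paper's exact construction:

```python
import numpy as np

def spd_logm(X):
    # Matrix logarithm of an SPD matrix via eigendecomposition:
    # X = V diag(w) V^T  =>  log(X) = V diag(log w) V^T
    w, V = np.linalg.eigh(X)
    return (V * np.log(w)) @ V.T

def log_euclidean_distance(X, Y):
    # d(X, Y) = ||log(X) - log(Y)||_F  (Frobenius norm in log-space)
    return np.linalg.norm(spd_logm(X) - spd_logm(Y), 'fro')

def log_euclidean_gaussian_kernel(X, Y, sigma=1.0):
    # Hypothetical Gaussian RBF kernel on SPD matrices under the
    # Log-Euclidean metric (one standard choice in the literature).
    d = log_euclidean_distance(X, Y)
    return np.exp(-d ** 2 / (2.0 * sigma ** 2))

# Example: two diagonal SPD matrices, where the distance is just the
# Euclidean distance between the logs of the diagonal entries.
X = np.array([[2.0, 0.0], [0.0, 3.0]])
Y = np.eye(2)
d = log_euclidean_distance(X, Y)  # sqrt(log(2)^2 + log(3)^2)
```

Because log-space is a flat Euclidean space, means, low-rank factorizations, and other linear operations can be carried out there directly, which is what makes the LogELRR formulation tractable.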

Original language: English
Pages (from-to): 623-634
Number of pages: 12
Journal: Pattern Recognition
Early online date: Jul 2017
Publication status: Published - Apr 2018


