Title: Provable Subspace Clustering: When LRR meets SSC
Abstract: Sparse Subspace Clustering (SSC) and Low-Rank Representation (LRR) are both considered state-of-the-art methods for subspace clustering. The two methods are fundamentally similar in that both are convex optimizations exploiting the intuition of Self-Expressiveness. The main difference is that SSC minimizes the vector l1 norm of the representation matrix to induce sparsity, while LRR minimizes the nuclear norm (aka trace norm) to promote a low-rank structure. Because the representation matrix is often simultaneously sparse and low-rank, we propose a new algorithm, termed Low-Rank Sparse Subspace Clustering (LRSSC), that combines SSC and LRR, and we develop theoretical guarantees for when the algorithm succeeds. The results reveal interesting insights into the strengths and weaknesses of SSC and LRR and demonstrate how LRSSC can combine the advantages of both methods, preserving the Self-Expressiveness Property and Graph Connectivity at the same time.
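The contrast the abstract draws can be illustrated numerically: both penalties are applied to a self-expressive representation matrix C satisfying X ≈ XC. The sketch below is a minimal illustration, not the paper's algorithm; the trade-off weight `lam` and the least-squares choice of C are assumptions for demonstration only.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 8))  # 5-dim ambient space, 8 data points (columns)

# One self-expressive solution: C minimizing ||X - XC||_F, so X ≈ XC.
C = np.linalg.lstsq(X, X, rcond=None)[0]

l1 = np.abs(C).sum()  # entrywise l1 norm: the sparsity penalty SSC minimizes
nuclear = np.linalg.svd(C, compute_uv=False).sum()  # nuclear norm: LRR's low-rank penalty

lam = 0.5  # hypothetical trade-off parameter between the two penalties
lrssc_penalty = nuclear + lam * l1  # LRSSC-style combined penalty (illustrative)
```

The combined penalty interpolates between the two extremes: a large weight on the l1 term recovers SSC-like behavior, while a large weight on the nuclear norm recovers LRR-like behavior.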
Publication Year: 2013
Publication Date: 2013-12-05
Language: en
Type: article
Access and Citation
Cited By Count: 159