Title: An ℓ<sub>1</sub>-Penalization Of Adaptive Normalized Quasi-Newton Algorithm For Sparsity-Aware Generalized Eigenvector Estimation
Abstract: The goal of this paper is to establish a widely applicable method for exploiting sparsity in generalized eigenvector estimation. We propose an ℓ<sub>1</sub>-penalized extension of the adaptive normalized quasi-Newton algorithm (Nguyen and Yamada, 2013). To enhance sparsity in the estimate of the generalized eigenvector, the proposed adaptive algorithm maximizes a certain non-convex criterion with an ℓ<sub>1</sub> penalty. A convergence analysis is also given for the proposed algorithm with decaying weight. Numerical experiments show that the proposed algorithm improves subspace tracking performance when the covariance matrix pencil has a sparse principal generalized eigenvector, and that it is effective for recent sparsity-aware eigenvector analyses, e.g., sparse PCA.
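The abstract does not give the paper's update equations, so the following is only a loose illustrative sketch of the general idea it describes: promoting sparsity in a generalized eigenvector estimate by combining a normalized iteration for the pencil (A, B) with ℓ<sub>1</sub> soft-thresholding. The power-iteration-style update, the threshold rule, and all names below are assumptions for illustration, not the proposed adaptive normalized quasi-Newton algorithm itself.

```python
import numpy as np

def soft_threshold(x, t):
    """Elementwise soft-thresholding, the proximal operator of the l1 norm."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def sparse_gev_sketch(A, B, lam=0.05, iters=200):
    """Illustrative sparse principal generalized eigenvector estimate
    for the pencil (A, B): a generalized power iteration interleaved
    with soft-thresholding and B-normalization. Hypothetical, not the
    paper's algorithm."""
    n = A.shape[0]
    w = np.ones(n) / np.sqrt(n)          # deterministic initial guess
    B_inv = np.linalg.inv(B)             # fine for small dense B
    for _ in range(iters):
        w = B_inv @ (A @ w)              # power step for B^{-1} A
        w = soft_threshold(w, lam * np.max(np.abs(w)))  # sparsify
        w = w / np.sqrt(w @ B @ w)       # enforce w^T B w = 1
    return w
```

With a diagonal pencil whose dominant generalized eigenvector is the first standard basis vector, the iteration drives the remaining coordinates to exactly zero while keeping the estimate B-normalized.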
Publication Year: 2018
Publication Date: 2018-06-01
Language: en
Type: article
Indexed In: ['crossref']
Access and Citation
Cited By Count: 2