Learning Sparse PCA with Stabilized ADMM Method on Stiefel Manifold


Sparse principal component analysis (SPCA) produces principal components with sparse loadings, which is important for handling data with many irrelevant features and critical for interpreting the results. To deal with the orthogonality constraints, most previous approaches address SPCA with multiple components via techniques such as deflation or convex relaxation. However, the deflation technique often yields suboptimal solutions due to poor approximations, and convex relaxations are typically expensive to solve. In this paper, we propose to address SPCA over the Stiefel manifold directly, and develop a stabilized Alternating Direction Method of Multipliers (SADMM) to handle the nonconvex orthogonality constraints in SPCA. Compared to traditional ADMM, the proposed method converges well over a wide range of parameters. We further propose a two-stage method that accounts for the importance of components when selecting the most relevant features. We also provide a theoretical convergence analysis of SADMM. Empirical studies on both synthetic and real-world data sets show that the proposed algorithms achieve better performance than existing state-of-the-art methods.
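To illustrate the kind of splitting the abstract describes, the sketch below applies a generic linearized ADMM to SPCA with an orthogonality (Stiefel) constraint: the loadings X are split from a sparse copy Y, the X-update projects onto the Stiefel manifold via the polar factor, and the Y-update is soft-thresholding. This is a minimal illustration under assumed parameter names (`lam`, `rho`), not the paper's SADMM algorithm.

```python
import numpy as np

def sparse_pca_admm(A, k, lam=0.1, rho=1.0, iters=200, seed=0):
    """Linearized-ADMM sketch (NOT the paper's SADMM) for
        min_{X in St(d,k), Y} -0.5*tr(X^T A X) + lam*||Y||_1  s.t. X = Y,
    where A is a d x d covariance matrix and St(d,k) = {X : X^T X = I_k}.
    """
    d = A.shape[0]
    rng = np.random.default_rng(seed)
    # Random orthonormal starting point via QR.
    X, _ = np.linalg.qr(rng.standard_normal((d, k)))
    Y = X.copy()
    U = np.zeros((d, k))  # scaled dual variable for the constraint X = Y
    for _ in range(iters):
        # X-update: linearize -0.5*tr(X^T A X) at the current X, then
        # project onto the Stiefel manifold via the polar factor (SVD).
        M = A @ X + rho * (Y - U)
        P, _, Qt = np.linalg.svd(M, full_matrices=False)
        X = P @ Qt  # nearest matrix with orthonormal columns to M
        # Y-update: soft-thresholding, the proximal operator of lam*||.||_1.
        V = X + U
        Y = np.sign(V) * np.maximum(np.abs(V) - lam / rho, 0.0)
        # Dual update (gradient ascent on the scaled dual).
        U += X - Y
    return X, Y
```

By construction the returned X always has exactly orthonormal columns, while Y carries the sparsity; the deflation-free treatment of all k components at once is what distinguishes this manifold formulation from one-component-at-a-time approaches.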

IEEE Transactions on Knowledge and Data Engineering
Jiezhang Cao
Ph.D. student
