Links for fulltext (may require subscription):
- Publisher Website (DOI): 10.1016/j.jmva.2007.06.007
- Scopus: eid_2-s2.0-43049086717
- Web of Science: WOS:000256804400001
Article: Sparse principal component analysis via regularized low rank matrix approximation
Field | Value |
---|---|
Title | Sparse principal component analysis via regularized low rank matrix approximation |
Authors | Shen, Haipeng; Huang, Jianhua Z. |
Keywords | Dimension reduction; High-dimension-low-sample-size; Regularization; Thresholding; Singular value decomposition |
Issue Date | 2008 |
Citation | Journal of Multivariate Analysis, 2008, v. 99, n. 6, p. 1015-1034 |
Abstract | Principal component analysis (PCA) is a widely used tool for data analysis and dimension reduction in applications throughout science and engineering. However, the principal components (PCs) can sometimes be difficult to interpret, because they are linear combinations of all the original variables. To facilitate interpretation, sparse PCA produces modified PCs with sparse loadings, i.e. loadings with very few non-zero elements. In this paper, we propose a new sparse PCA method, namely sparse PCA via regularized SVD (sPCA-rSVD). We use the connection of PCA with singular value decomposition (SVD) of the data matrix and extract the PCs through solving a low rank matrix approximation problem. Regularization penalties are introduced to the corresponding minimization problem to promote sparsity in PC loadings. An efficient iterative algorithm is proposed for computation. Two tuning parameter selection methods are discussed. Some theoretical results are established to justify the use of sPCA-rSVD when only the data covariance matrix is available. In addition, we give a modified definition of variance explained by the sparse PCs. The sPCA-rSVD provides a uniform treatment of both classical multivariate data and high-dimension-low-sample-size (HDLSS) data. Further understanding of sPCA-rSVD and some existing alternatives is gained through simulation studies and real data examples, which suggests that sPCA-rSVD provides competitive results. © 2007 Elsevier Inc. All rights reserved. |
Persistent Identifier | http://hdl.handle.net/10722/219568 |
ISSN | 0047-259X (2023 Impact Factor: 1.4; 2023 SCImago Journal Rankings: 0.837) |
ISI Accession Number ID | WOS:000256804400001 |
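The abstract above describes sPCA-rSVD as extracting sparse loadings by solving a regularized rank-one (low rank) approximation of the data matrix with an alternating, thresholding-based iterative algorithm. The snippet below is a minimal illustrative sketch of that idea, not the authors' implementation: it assumes an L1 (soft-thresholding) penalty, and the function name `spca_rsvd_rank1`, the penalty level `lam`, and the stopping rule are hypothetical choices made for the example.

```python
import numpy as np

def spca_rsvd_rank1(X, lam, n_iter=100, tol=1e-8):
    """Illustrative rank-one sparse PCA via regularized SVD.

    Alternates between (a) soft-thresholding the loading vector v and
    (b) renormalizing the left vector u, following the low rank
    approximation idea described in the abstract. `lam` is the
    soft-thresholding level (a hypothetical tuning knob; the paper
    discusses principled ways to choose it).
    """
    # Initialize u, v from the leading singular triplet of X.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    u, v = U[:, 0], s[0] * Vt[0, :]

    for _ in range(n_iter):
        v_old = v.copy()
        # Update the loading vector by soft-thresholding X^T u (L1 penalty).
        z = X.T @ u
        v = np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)
        if np.allclose(v, 0.0):
            break  # penalty too large: every loading was shrunk to zero
        # Update u as the normalized image of v under X.
        Xv = X @ v
        u = Xv / np.linalg.norm(Xv)
        if np.linalg.norm(v - v_old) < tol * max(np.linalg.norm(v_old), 1.0):
            break

    # Normalize the loading vector so it plays the role of a sparse PC direction.
    norm_v = np.linalg.norm(v)
    loading = v / norm_v if norm_v > 0 else v
    return u, loading
```

Subsequent sparse components can be obtained in the same spirit by deflating the fitted rank-one term from X and repeating, though the details of deflation and tuning parameter selection are developed in the paper itself.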
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Shen, Haipeng | - |
dc.contributor.author | Huang, Jianhua Z. | - |
dc.date.accessioned | 2015-09-23T02:57:25Z | - |
dc.date.available | 2015-09-23T02:57:25Z | - |
dc.date.issued | 2008 | - |
dc.identifier.citation | Journal of Multivariate Analysis, 2008, v. 99, n. 6, p. 1015-1034 | - |
dc.identifier.issn | 0047-259X | - |
dc.identifier.uri | http://hdl.handle.net/10722/219568 | - |
dc.description.abstract | Principal component analysis (PCA) is a widely used tool for data analysis and dimension reduction in applications throughout science and engineering. However, the principal components (PCs) can sometimes be difficult to interpret, because they are linear combinations of all the original variables. To facilitate interpretation, sparse PCA produces modified PCs with sparse loadings, i.e. loadings with very few non-zero elements. In this paper, we propose a new sparse PCA method, namely sparse PCA via regularized SVD (sPCA-rSVD). We use the connection of PCA with singular value decomposition (SVD) of the data matrix and extract the PCs through solving a low rank matrix approximation problem. Regularization penalties are introduced to the corresponding minimization problem to promote sparsity in PC loadings. An efficient iterative algorithm is proposed for computation. Two tuning parameter selection methods are discussed. Some theoretical results are established to justify the use of sPCA-rSVD when only the data covariance matrix is available. In addition, we give a modified definition of variance explained by the sparse PCs. The sPCA-rSVD provides a uniform treatment of both classical multivariate data and high-dimension-low-sample-size (HDLSS) data. Further understanding of sPCA-rSVD and some existing alternatives is gained through simulation studies and real data examples, which suggests that sPCA-rSVD provides competitive results. © 2007 Elsevier Inc. All rights reserved. | - |
dc.language | eng | - |
dc.relation.ispartof | Journal of Multivariate Analysis | - |
dc.subject | Dimension reduction | - |
dc.subject | High-dimension-low-sample-size | - |
dc.subject | Regularization | - |
dc.subject | Thresholding | - |
dc.subject | Singular value decomposition | - |
dc.title | Sparse principal component analysis via regularized low rank matrix approximation | - |
dc.type | Article | - |
dc.description.nature | link_to_OA_fulltext | - |
dc.identifier.doi | 10.1016/j.jmva.2007.06.007 | - |
dc.identifier.scopus | eid_2-s2.0-43049086717 | - |
dc.identifier.volume | 99 | - |
dc.identifier.issue | 6 | - |
dc.identifier.spage | 1015 | - |
dc.identifier.epage | 1034 | - |
dc.identifier.eissn | 1095-7243 | - |
dc.identifier.isi | WOS:000256804400001 | - |
dc.identifier.issnl | 0047-259X | - |
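As a follow-up to the sketch above, here is a toy usage example on synthetic data whose leading direction involves only the first three variables; the data, the penalty level `lam=0.5`, and the helper `spca_rsvd_rank1` are illustrative assumptions, not results or code from the paper.

```python
import numpy as np

# Synthetic data: a sparse leading direction confined to the first three variables.
rng = np.random.default_rng(0)
n, p = 50, 10
v_true = np.zeros(p)
v_true[:3] = 1.0
X = rng.normal(size=(n, 1)) @ v_true[None, :] + 0.1 * rng.normal(size=(n, p))

# Apply the illustrative rank-one sketch defined earlier (hypothetical helper).
u, loading = spca_rsvd_rank1(X, lam=0.5)
print("indices of non-zero loadings:", np.nonzero(loading)[0])
```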