File Download
Links for fulltext (May Require Subscription)
- Publisher Website (DOI): 10.1137/17M1140480
- Scopus: eid_2-s2.0-85053602057
- Web of Science: WOS:000453716400008
Article: Computing low-rank approximations of large-scale matrices with the Tensor Network randomized SVD
Field | Value
---|---
Title | Computing low-rank approximations of large-scale matrices with the Tensor Network randomized SVD
Authors | Batselier, K; Yu, W; Daniel, L; Wong, N
Keywords | Curse of dimensionality; Low-rank tensor approximation; Matrix factorization; Matrix product operator; Randomized algorithm
Issue Date | 2018
Publisher | Society for Industrial and Applied Mathematics. The Journal's web site is located at http://www.siam.org/journals/simax.php
Citation | SIAM Journal on Matrix Analysis and Applications, 2018, v. 39 n. 3, p. 1221-1244
Abstract | We propose a new algorithm for the computation of a singular value decomposition (SVD) low-rank approximation of a matrix in the matrix product operator (MPO) format, also called the tensor train matrix format. Our tensor network randomized SVD (TNrSVD) algorithm is an MPO implementation of the randomized SVD algorithm that is able to compute dominant singular values and their corresponding singular vectors. In contrast to the state-of-the-art tensor-based alternating least squares SVD (ALS-SVD) and modified alternating least squares SVD (MALS-SVD) matrix approximation methods, TNrSVD can be up to 13 times faster while achieving better accuracy. In addition, our TNrSVD algorithm also produces accurate approximations in particular cases where both ALS-SVD and MALS-SVD fail to converge. We also propose a new algorithm for the fast conversion of a sparse matrix into its corresponding MPO form, which is up to 509 times faster than the standard tensor train SVD method while achieving machine precision accuracy. The efficiency and accuracy of both algorithms are demonstrated in numerical experiments.
Persistent Identifier | http://hdl.handle.net/10722/261769
ISSN | 0895-4798 (2023 Impact Factor: 1.5; 2023 SCImago Journal Rankings: 1.042)
ISI Accession Number | WOS:000453716400008
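The abstract describes an MPO implementation of the randomized SVD. For context, the sketch below shows the basic dense randomized SVD (the Halko-Martinsson-Tropp scheme that TNrSVD adapts to the tensor-network setting) in plain NumPy; it is an illustrative assumption of the underlying technique, not the paper's MPO algorithm, and the function name and oversampling default are the author's own choices here.

```python
import numpy as np

def randomized_svd(A, rank, oversample=10, rng=None):
    """Approximate the top-`rank` singular triplets of A via a random sketch."""
    rng = np.random.default_rng(rng)
    m, n = A.shape
    # Gaussian test matrix: A @ Omega captures the dominant column space of A.
    Omega = rng.standard_normal((n, rank + oversample))
    Q, _ = np.linalg.qr(A @ Omega)           # orthonormal basis for the range
    B = Q.T @ A                              # small (rank+oversample) x n matrix
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    U = Q @ Ub                               # lift back to the original space
    return U[:, :rank], s[:rank], Vt[:rank, :]

# Usage: a matrix of exact rank 5 is recovered to near machine precision.
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 5)) @ rng.standard_normal((5, 150))
U, s, Vt = randomized_svd(A, rank=5)
err = np.linalg.norm(A - U @ np.diag(s) @ Vt) / np.linalg.norm(A)
```

The key cost saving is that the full SVD is taken only of the small sketched matrix `B`; TNrSVD carries the same idea out with all operands stored as matrix product operators so the large matrix is never formed densely.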
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Batselier, K | - |
dc.contributor.author | Yu, W | - |
dc.contributor.author | Daniel, L | - |
dc.contributor.author | Wong, N | - |
dc.date.accessioned | 2018-09-28T04:47:34Z | - |
dc.date.available | 2018-09-28T04:47:34Z | - |
dc.date.issued | 2018 | - |
dc.identifier.citation | SIAM Journal on Matrix Analysis and Applications, 2018, v. 39 n. 3, p. 1221-1244 | - |
dc.identifier.issn | 0895-4798 | - |
dc.identifier.uri | http://hdl.handle.net/10722/261769 | - |
dc.description.abstract | We propose a new algorithm for the computation of a singular value decomposition (SVD) low-rank approximation of a matrix in the matrix product operator (MPO) format, also called the tensor train matrix format. Our tensor network randomized SVD (TNrSVD) algorithm is an MPO implementation of the randomized SVD algorithm that is able to compute dominant singular values and their corresponding singular vectors. In contrast to the state-of-the-art tensor-based alternating least squares SVD (ALS-SVD) and modified alternating least squares SVD (MALS-SVD) matrix approximation methods, TNrSVD can be up to 13 times faster while achieving better accuracy. In addition, our TNrSVD algorithm also produces accurate approximations in particular cases where both ALS-SVD and MALS-SVD fail to converge. We also propose a new algorithm for the fast conversion of a sparse matrix into its corresponding MPO form, which is up to 509 times faster than the standard tensor train SVD method while achieving machine precision accuracy. The efficiency and accuracy of both algorithms are demonstrated in numerical experiments. | - |
dc.language | eng | - |
dc.publisher | Society for Industrial and Applied Mathematics. The Journal's web site is located at http://www.siam.org/journals/simax.php | - |
dc.relation.ispartof | SIAM Journal on Matrix Analysis and Applications | - |
dc.rights | © 2018 Society for Industrial and Applied Mathematics. First Published in SIAM Journal on Matrix Analysis and Applications in volume 39, issue 3, published by the Society for Industrial and Applied Mathematics (SIAM). | - |
dc.subject | Curse of dimensionality | - |
dc.subject | Low-rank tensor approximation | - |
dc.subject | Matrix factorization | - |
dc.subject | Matrix product operator | - |
dc.subject | Randomized algorithm | - |
dc.title | Computing low-rank approximations of large-scale matrices with the Tensor Network randomized SVD | - |
dc.type | Article | - |
dc.identifier.email | Batselier, K: kbatseli@HKUCC-COM.hku.hk | - |
dc.identifier.email | Wong, N: nwong@eee.hku.hk | - |
dc.identifier.authority | Wong, N=rp00190 | - |
dc.description.nature | published_or_final_version | - |
dc.identifier.doi | 10.1137/17M1140480 | - |
dc.identifier.scopus | eid_2-s2.0-85053602057 | - |
dc.identifier.hkuros | 292467 | - |
dc.identifier.volume | 39 | - |
dc.identifier.issue | 3 | - |
dc.identifier.spage | 1221 | - |
dc.identifier.epage | 1244 | - |
dc.identifier.isi | WOS:000453716400008 | - |
dc.publisher.place | United States | - |
dc.identifier.issnl | 0895-4798 | - |