Article: Revisiting trace norm minimization for tensor Tucker completion: A direct multilinear rank learning approach
| Title | Revisiting trace norm minimization for tensor Tucker completion: A direct multilinear rank learning approach |
|---|---|
| Authors | Tong, Xueke; Zhu, Hancheng; Cheng, Lei; Wu, Yik Chung |
| Keywords | Multilinear rank; Tensor decomposition; Trace norm minimization; Tucker model |
| Issue Date | 1-Feb-2025 |
| Publisher | Elsevier |
| Citation | Pattern Recognition, 2025, v. 158 |
| Abstract | To efficiently express tensor data in the Tucker format, a critical task is to minimize the multilinear rank so that the model is not over-flexible and prone to overfitting. Due to the lack of rank minimization tools for tensors, existing works connect Tucker multilinear rank minimization to trace norm minimization of matrices unfolded from the tensor data. While these formulations try to exploit the common aim of identifying the low-dimensional structure of the tensor and matrix, this paper reveals that existing trace norm-based formulations in Tucker completion are inefficient in multilinear rank minimization. We further propose a new interpretation of the Tucker format such that trace norm minimization is applied to the factor matrices of the equivalent representation, rather than to matrices unfolded from the tensor data. Based on the newly established problem formulation, a fixed point iteration algorithm is proposed, and its convergence is proved. Numerical results show that the proposed algorithm exhibits significantly improved performance in multilinear rank learning and, consequently, in tensor signal recovery accuracy, compared to existing trace norm-based Tucker completion methods. |
| Persistent Identifier | http://hdl.handle.net/10722/361966 |
| ISSN | 0031-3203 |
| 2023 Impact Factor | 7.5 |
| 2023 SCImago Journal Rankings | 2.732 |
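For context, the line of work the abstract critiques applies trace norm (nuclear norm) regularization to the mode-n unfoldings of the tensor, typically via singular value thresholding with the observed entries re-imposed at each step. The following is an illustrative sketch only of that classical unfolding-based scheme (the function names, the fixed threshold `tau`, and the simple averaging across modes are assumptions for exposition, not the algorithm proposed in this paper):

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding: move axis `mode` to the front, then flatten the rest."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def fold(M, mode, shape):
    """Inverse of unfold: reshape and move the mode axis back into place."""
    full = [shape[mode]] + [s for i, s in enumerate(shape) if i != mode]
    return np.moveaxis(M.reshape(full), 0, mode)

def svt(M, tau):
    """Singular value thresholding: proximal operator of tau * trace norm."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def unfolding_trace_norm_completion(T_obs, mask, tau=0.5, n_iters=50):
    """Trace norm completion on mode-n unfoldings (illustrative sketch)."""
    X = T_obs.copy()
    N = T_obs.ndim
    for _ in range(n_iters):
        # Shrink singular values of every mode-n unfolding, then average
        M = sum(fold(svt(unfold(X, n), tau), n, X.shape) for n in range(N)) / N
        # Keep the observed entries fixed at their measured values
        X = np.where(mask, T_obs, M)
    return X
```

The paper's point is that such unfolding-based formulations minimize the multilinear rank only indirectly; its proposed alternative instead applies trace norm minimization to the factor matrices of an equivalent Tucker representation and solves the resulting problem with a fixed point iteration.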
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | Tong, Xueke | - |
| dc.contributor.author | Zhu, Hancheng | - |
| dc.contributor.author | Cheng, Lei | - |
| dc.contributor.author | Wu, Yik Chung | - |
| dc.date.accessioned | 2025-09-18T00:35:52Z | - |
| dc.date.available | 2025-09-18T00:35:52Z | - |
| dc.date.issued | 2025-02-01 | - |
| dc.identifier.citation | Pattern Recognition, 2025, v. 158 | - |
| dc.identifier.issn | 0031-3203 | - |
| dc.identifier.uri | http://hdl.handle.net/10722/361966 | - |
| dc.description.abstract | <p>To efficiently express tensor data in the Tucker format, a critical task is to minimize the multilinear rank so that the model is not over-flexible and prone to overfitting. Due to the lack of rank minimization tools for tensors, existing works connect Tucker multilinear rank minimization to trace norm minimization of matrices unfolded from the tensor data. While these formulations try to exploit the common aim of identifying the low-dimensional structure of the tensor and matrix, this paper reveals that existing trace norm-based formulations in Tucker completion are inefficient in multilinear rank minimization. We further propose a new interpretation of the Tucker format such that trace norm minimization is applied to the factor matrices of the equivalent representation, rather than to matrices unfolded from the tensor data. Based on the newly established problem formulation, a fixed point iteration algorithm is proposed, and its convergence is proved. Numerical results show that the proposed algorithm exhibits significantly improved performance in multilinear rank learning and, consequently, in tensor signal recovery accuracy, compared to existing trace norm-based Tucker completion methods.</p> | - |
| dc.language | eng | - |
| dc.publisher | Elsevier | - |
| dc.relation.ispartof | Pattern Recognition | - |
| dc.rights | This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License. | - |
| dc.subject | Multilinear rank | - |
| dc.subject | Tensor decomposition | - |
| dc.subject | Trace norm minimization | - |
| dc.subject | Tucker model | - |
| dc.title | Revisiting trace norm minimization for tensor Tucker completion: A direct multilinear rank learning approach | - |
| dc.type | Article | - |
| dc.identifier.doi | 10.1016/j.patcog.2024.110995 | - |
| dc.identifier.scopus | eid_2-s2.0-85204570536 | - |
| dc.identifier.volume | 158 | - |
| dc.identifier.eissn | 1873-5142 | - |
| dc.identifier.issnl | 0031-3203 | - |
