Links for fulltext
(May Require Subscription)
- Publisher Website: 10.1109/TSP.2022.3164200
- Scopus: eid_2-s2.0-85127504292
- WOS: WOS:000788983000008
Article: Towards Flexible Sparsity-Aware Modeling: Automatic Tensor Rank Learning Using the Generalized Hyperbolic Prior
Title | Towards Flexible Sparsity-Aware Modeling: Automatic Tensor Rank Learning Using the Generalized Hyperbolic Prior |
---|---|
Authors | Cheng, Lei; Chen, Zhongtao; Shi, Qingjiang; Wu, Yik-Chung; Theodoridis, Sergios |
Keywords | Automatic tensor rank learning; Bayesian learning; generalized hyperbolic distribution; tensor CPD; variational inference |
Issue Date | 1-Apr-2022 |
Publisher | Institute of Electrical and Electronics Engineers |
Citation | IEEE Transactions on Signal Processing, 2022, v. 70, p. 1834-1849 |
Abstract | Tensor rank learning for canonical polyadic decomposition (CPD) has long been deemed an essential yet challenging problem. In particular, since the tensor rank controls the complexity of the CPD model, its inaccurate learning would cause overfitting to noise or underfitting to the signal sources, and even destroy the interpretability of the model parameters. However, the optimal determination of a tensor rank is known to be a non-deterministic polynomial-time hard (NP-hard) task. Rather than exhaustively searching for the best tensor rank via trial-and-error experiments, Bayesian inference under the Gaussian-gamma prior was introduced in the context of probabilistic CPD modeling, and it was shown to be an effective strategy for automatic tensor rank determination. This triggered flourishing research on other structured tensor CPDs with automatic tensor rank learning. On the other side of the coin, these research works also reveal that the Gaussian-gamma model does not perform well for high-rank tensors and/or low signal-to-noise ratios (SNRs). To overcome these drawbacks, in this paper, we introduce a more advanced generalized hyperbolic (GH) prior to the probabilistic CPD model, which not only includes the Gaussian-gamma model as a special case, but is also more flexible in adapting to different levels of sparsity. Based on this novel probabilistic model, an algorithm is developed under the framework of variational inference, where each update is obtained in closed form. Extensive numerical results, using synthetic data and real-world datasets, demonstrate the significantly improved performance of the proposed method in learning both low and high tensor ranks, even in low-SNR cases. |
Persistent Identifier | http://hdl.handle.net/10722/339296 |
ISSN | 1053-587X; 2023 Impact Factor: 4.6; 2023 SCImago Journal Rankings: 2.520 |
ISI Accession Number ID | WOS:000788983000008 |
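For context on the model class the abstract describes: the CPD expresses a tensor as a sum of R rank-one terms, and R (the tensor rank) is exactly the quantity the paper learns automatically. A minimal illustration of the CPD construction (not the authors' code; dimensions and names are arbitrary choices for the sketch):

```python
import numpy as np

rng = np.random.default_rng(1)
I, J, K, R = 4, 5, 6, 3  # tensor dimensions and CP rank (illustrative values)

# Factor matrices of a rank-R canonical polyadic decomposition (CPD).
A = rng.standard_normal((I, R))
B = rng.standard_normal((J, R))
C = rng.standard_normal((K, R))

# CPD: X[i, j, k] = sum_r A[i, r] * B[j, r] * C[k, r]
X = np.einsum('ir,jr,kr->ijk', A, B, C)
print(X.shape)  # (4, 5, 6)
```

Automatic rank learning amounts to starting with an over-parameterized R and letting a sparsity-inducing prior on the factor columns prune the superfluous rank-one terms.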
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Cheng, Lei | - |
dc.contributor.author | Chen, Zhongtao | - |
dc.contributor.author | Shi, Qingjiang | - |
dc.contributor.author | Wu, Yik-Chung | - |
dc.contributor.author | Theodoridis, Sergios | - |
dc.date.accessioned | 2024-03-11T10:35:30Z | - |
dc.date.available | 2024-03-11T10:35:30Z | - |
dc.date.issued | 2022-04-01 | - |
dc.identifier.citation | IEEE Transactions on Signal Processing, 2022, v. 70, p. 1834-1849 | - |
dc.identifier.issn | 1053-587X | - |
dc.identifier.uri | http://hdl.handle.net/10722/339296 | - |
dc.description.abstract | Tensor rank learning for canonical polyadic decomposition (CPD) has long been deemed an essential yet challenging problem. In particular, since the tensor rank controls the complexity of the CPD model, its inaccurate learning would cause overfitting to noise or underfitting to the signal sources, and even destroy the interpretability of the model parameters. However, the optimal determination of a tensor rank is known to be a non-deterministic polynomial-time hard (NP-hard) task. Rather than exhaustively searching for the best tensor rank via trial-and-error experiments, Bayesian inference under the Gaussian-gamma prior was introduced in the context of probabilistic CPD modeling, and it was shown to be an effective strategy for automatic tensor rank determination. This triggered flourishing research on other structured tensor CPDs with automatic tensor rank learning. On the other side of the coin, these research works also reveal that the Gaussian-gamma model does not perform well for high-rank tensors and/or low signal-to-noise ratios (SNRs). To overcome these drawbacks, in this paper, we introduce a more advanced generalized hyperbolic (GH) prior to the probabilistic CPD model, which not only includes the Gaussian-gamma model as a special case, but is also more flexible in adapting to different levels of sparsity. Based on this novel probabilistic model, an algorithm is developed under the framework of variational inference, where each update is obtained in closed form. Extensive numerical results, using synthetic data and real-world datasets, demonstrate the significantly improved performance of the proposed method in learning both low and high tensor ranks, even in low-SNR cases. | - |
dc.language | eng | - |
dc.publisher | Institute of Electrical and Electronics Engineers | - |
dc.relation.ispartof | IEEE Transactions on Signal Processing | - |
dc.rights | This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License. | - |
dc.subject | Automatic tensor rank learning | - |
dc.subject | Bayesian learning | - |
dc.subject | generalized hyperbolic distribution | - |
dc.subject | tensor CPD | - |
dc.subject | variational inference | - |
dc.title | Towards Flexible Sparsity-Aware Modeling: Automatic Tensor Rank Learning Using the Generalized Hyperbolic Prior | - |
dc.type | Article | - |
dc.description.nature | published_or_final_version | - |
dc.identifier.doi | 10.1109/TSP.2022.3164200 | - |
dc.identifier.scopus | eid_2-s2.0-85127504292 | - |
dc.identifier.volume | 70 | - |
dc.identifier.spage | 1834 | - |
dc.identifier.epage | 1849 | - |
dc.identifier.eissn | 1941-0476 | - |
dc.identifier.isi | WOS:000788983000008 | - |
dc.identifier.issnl | 1053-587X | - |
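The abstract's key modeling ingredient, the generalized hyperbolic (GH) prior, is standardly constructed as a Gaussian scale mixture: a factor entry x satisfies x | z ~ N(0, z) with the variance z drawn from a generalized inverse Gaussian (GIG) mixing density, and the Gaussian-gamma model arises as the special case where the mixing density reduces to a gamma. A minimal sampling sketch of this hierarchy (illustrative only, with arbitrary parameter values; not the paper's algorithm, which uses variational inference rather than sampling):

```python
# GH prior as a Gaussian scale mixture: z ~ GIG, then x | z ~ N(0, z).
import numpy as np
from scipy.stats import geninvgauss

rng = np.random.default_rng(0)

def sample_gh(n, p=-0.5, b=1.0, scale=1.0, rng=rng):
    """Draw n samples from a symmetric GH distribution via its scale mixture."""
    # GIG mixing density over the per-entry variances (scipy parameterization).
    z = geninvgauss.rvs(p, b, scale=scale, size=n, random_state=rng)
    # Conditionally Gaussian entries; marginally GH-distributed.
    return rng.normal(0.0, np.sqrt(z))

x = sample_gh(100_000)
print(x.mean(), x.std())
```

Varying the GIG parameters moves the marginal between light- and heavy-tailed regimes, which is the flexibility the abstract attributes to the GH prior for adapting to different sparsity levels.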