Article: Multilinear multitask learning by transformed tensor singular value decomposition

Title: Multilinear multitask learning by transformed tensor singular value decomposition
Authors: Zhang, Xiongjun; Wu, Jin; Ng, Michael K
Issue Date: 24-Jun-2023
Publisher: Elsevier Ltd.
Citation: Machine Learning with Applications, 2023, v. 13
Abstract

In this paper, we study the problem of multilinear multitask learning (MLMTL), in which all tasks are stacked into a third-order tensor. In contrast to conventional multitask learning, MLMTL can better explore the inherent correlations among multiple tasks by exploiting multilinear low-rank structure. Existing approaches to MLMTL are mainly based on the sum of singular values for approximating the low-rank matrices obtained by matricizing the third-order tensor; however, these methods are suboptimal for Tucker rank approximation. To elucidate the intrinsic correlations among multiple tasks, we present a new approach that uses a transformed tensor nuclear norm (TTNN) constraint in the objective function. The main advantage of the proposed approach is that it can acquire a low transformed multi-rank structure in a transformed tensor by applying suitable unitary transformations, which helps determine the principal components that group multiple tasks and describe their intrinsic correlations more precisely. Furthermore, we establish an excess risk bound for the minimizer of the proposed TTNN approach. Experimental results on synthetic problems and real-world images show that the mean-square errors of the proposed method are lower than those of existing methods for different numbers of tasks and training samples in MLMTL.
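The transformed tensor nuclear norm described in the abstract can be sketched as follows, assuming the common transformed t-SVD construction: apply a unitary transform along the third mode of the tensor, take the SVD of each frontal slice of the transformed tensor, and sum all singular values. The default choice of the unitary DFT here is an illustrative assumption, not necessarily the transform used in the paper.

```python
import numpy as np

def transformed_tensor_nuclear_norm(T, transform=None):
    """Sum of the singular values of the frontal slices of T after a
    unitary transform along mode 3. T has shape (n1, n2, n3)."""
    n1, n2, n3 = T.shape
    if transform is None:
        # Default: unitary DFT along the third mode (an assumption).
        T_hat = np.fft.fft(T, axis=2) / np.sqrt(n3)
    else:
        # `transform` is an n3 x n3 unitary matrix applied along mode 3.
        T_hat = np.einsum('ijk,lk->ijl', T, transform)
    # Nuclear norm (sum of singular values) of each frontal slice.
    return sum(np.linalg.norm(T_hat[:, :, k], 'nuc') for k in range(n3))
```

With `n3 = 1` the transform is trivial, so the TTNN reduces to the matrix nuclear norm of the single slice; minimizing this quantity encourages a low transformed multi-rank, which is the structural prior the paper places on the stacked tasks.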


Persistent Identifier: http://hdl.handle.net/10722/331061
ISSN: 2666-8270

 

DC Field: Value
dc.contributor.author: Zhang, Xiongjun
dc.contributor.author: Wu, Jin
dc.contributor.author: Ng, Michael K
dc.date.accessioned: 2023-09-21T06:52:26Z
dc.date.available: 2023-09-21T06:52:26Z
dc.date.issued: 2023-06-24
dc.identifier.citation: Machine Learning with Applications, 2023, v. 13
dc.identifier.issn: 2666-8270
dc.identifier.uri: http://hdl.handle.net/10722/331061
dc.description.abstract: In this paper, we study the problem of multilinear multitask learning (MLMTL), in which all tasks are stacked into a third-order tensor. In contrast to conventional multitask learning, MLMTL can better explore the inherent correlations among multiple tasks by exploiting multilinear low-rank structure. Existing approaches to MLMTL are mainly based on the sum of singular values for approximating the low-rank matrices obtained by matricizing the third-order tensor; however, these methods are suboptimal for Tucker rank approximation. To elucidate the intrinsic correlations among multiple tasks, we present a new approach that uses a transformed tensor nuclear norm (TTNN) constraint in the objective function. The main advantage of the proposed approach is that it can acquire a low transformed multi-rank structure in a transformed tensor by applying suitable unitary transformations, which helps determine the principal components that group multiple tasks and describe their intrinsic correlations more precisely. Furthermore, we establish an excess risk bound for the minimizer of the proposed TTNN approach. Experimental results on synthetic problems and real-world images show that the mean-square errors of the proposed method are lower than those of existing methods for different numbers of tasks and training samples in MLMTL.
dc.language: eng
dc.publisher: Elsevier Ltd.
dc.relation.ispartof: Machine Learning with Applications
dc.title: Multilinear multitask learning by transformed tensor singular value decomposition
dc.type: Article
dc.identifier.doi: 10.1016/j.mlwa.2023.100479
dc.identifier.volume: 13
dc.identifier.issnl: 2666-8270
