Conference Paper: Fine-grained Generalization Analysis of Vector-valued Learning

File Download
There are no files associated with this item.

Citations:
- Scopus: 0

Appears in Collections: Conference Paper
Field | Value
---|---
Title | Fine-grained Generalization Analysis of Vector-valued Learning
Authors | Wu, Liang; Ledent, Antoine; Lei, Yunwen; Kloft, Marius
Issue Date | 2021
Citation | 35th AAAI Conference on Artificial Intelligence, AAAI 2021, 2021, v. 12A, p. 10338-10346
Abstract | Many fundamental machine learning tasks can be formulated as a problem of learning with vector-valued functions, where we learn multiple scalar-valued functions together. Although there is some generalization analysis on different specific algorithms under the empirical risk minimization principle, a unifying analysis of vector-valued learning under a regularization framework is still lacking. In this paper, we initiate the generalization analysis of regularized vector-valued learning algorithms by presenting bounds with a mild dependency on the output dimension and a fast rate on the sample size. Our discussions relax the existing assumptions on the restrictive constraint of hypothesis spaces, smoothness of loss functions and low-noise condition. To understand the interaction between optimization and learning, we further use our results to derive the first generalization bounds for stochastic gradient descent with vector-valued functions. We apply our general results to multi-class classification and multi-label classification, which yield the first bounds with a logarithmic dependency on the output dimension for extreme multi-label classification with the Frobenius regularization. As a byproduct, we derive a Rademacher complexity bound for loss function classes defined in terms of a general strongly convex function.
Persistent Identifier | http://hdl.handle.net/10722/329722
DC Field | Value | Language
---|---|---
dc.contributor.author | Wu, Liang | - |
dc.contributor.author | Ledent, Antoine | - |
dc.contributor.author | Lei, Yunwen | - |
dc.contributor.author | Kloft, Marius | - |
dc.date.accessioned | 2023-08-09T03:34:52Z | - |
dc.date.available | 2023-08-09T03:34:52Z | - |
dc.date.issued | 2021 | - |
dc.identifier.citation | 35th AAAI Conference on Artificial Intelligence, AAAI 2021, 2021, v. 12A, p. 10338-10346 | - |
dc.identifier.uri | http://hdl.handle.net/10722/329722 | - |
dc.description.abstract | Many fundamental machine learning tasks can be formulated as a problem of learning with vector-valued functions, where we learn multiple scalar-valued functions together. Although there is some generalization analysis on different specific algorithms under the empirical risk minimization principle, a unifying analysis of vector-valued learning under a regularization framework is still lacking. In this paper, we initiate the generalization analysis of regularized vector-valued learning algorithms by presenting bounds with a mild dependency on the output dimension and a fast rate on the sample size. Our discussions relax the existing assumptions on the restrictive constraint of hypothesis spaces, smoothness of loss functions and low-noise condition. To understand the interaction between optimization and learning, we further use our results to derive the first generalization bounds for stochastic gradient descent with vector-valued functions. We apply our general results to multi-class classification and multi-label classification, which yield the first bounds with a logarithmic dependency on the output dimension for extreme multi-label classification with the Frobenius regularization. As a byproduct, we derive a Rademacher complexity bound for loss function classes defined in terms of a general strongly convex function. | - |
dc.language | eng | - |
dc.relation.ispartof | 35th AAAI Conference on Artificial Intelligence, AAAI 2021 | - |
dc.title | Fine-grained Generalization Analysis of Vector-valued Learning | - |
dc.type | Conference_Paper | - |
dc.description.nature | link_to_subscribed_fulltext | - |
dc.identifier.scopus | eid_2-s2.0-85108779458 | - |
dc.identifier.volume | 12A | - |
dc.identifier.spage | 10338 | - |
dc.identifier.epage | 10346 | - |