Boosted kernel ridge regression: Optimal learning rates and early stopping
Field | Value
---|---
Title | Boosted kernel ridge regression: Optimal learning rates and early stopping
Authors | Lin, Shao Bo; Lei, Yunwen; Zhou, Ding Xuan
Keywords | Boosting; Integral operator; Kernel ridge regression; Learning theory
Issue Date | 2019
Citation | Journal of Machine Learning Research, 2019, v. 20
Abstract | In this paper, we introduce a learning algorithm, boosted kernel ridge regression (BKRR), that combines L2-Boosting with the kernel ridge regression (KRR). We analyze the learning performance of this algorithm in the framework of learning theory. We show that BKRR provides a new bias-variance trade-off via tuning the number of boosting iterations, which is different from KRR via adjusting the regularization parameter. A (semi-)exponential bias-variance trade-off is derived for BKRR, exhibiting a stable relationship between the generalization error and the number of iterations. Furthermore, an adaptive stopping rule is proposed, with which BKRR achieves the optimal learning rate without saturation.
Persistent Identifier | http://hdl.handle.net/10722/329845
ISSN | 1532-4435 (2023 Impact Factor: 4.3; 2023 SCImago Journal Rankings: 2.796)
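The abstract above describes BKRR as L2-Boosting run with a KRR base learner. As a rough aid to the reader, the following is a minimal sketch of that recursion; the regularization normalization λn, the coefficient notation a^(t), and the fixed iteration count T are illustrative assumptions, not taken from the paper.

```latex
% L2-Boosting with a KRR base learner (sketch; normalisation is an assumption).
% Data: (x_1, y_1), ..., (x_n, y_n); kernel matrix K with K_{ij} = K(x_i, x_j).
\begin{align*}
  f^{(0)} &= 0, \\
  g^{(t)} &= \arg\min_{g \in \mathcal{H}_K}
             \frac{1}{n} \sum_{i=1}^{n}
             \bigl( y_i - f^{(t-1)}(x_i) - g(x_i) \bigr)^2
             + \lambda \lVert g \rVert_K^2, \\
  f^{(t)} &= f^{(t-1)} + g^{(t)}, \qquad t = 1, \dots, T.
\end{align*}
% Equivalently, for the kernel expansion coefficients a^{(t)}
% with f^{(t)}(x) = \sum_i a_i^{(t)} K(x, x_i):
\begin{equation*}
  a^{(t)} = a^{(t-1)} + (K + \lambda n I)^{-1} \bigl( y - K a^{(t-1)} \bigr).
\end{equation*}
```

The number of boosting iterations T plays the role that the regularization parameter plays in plain KRR, which is the bias-variance trade-off the abstract refers to.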
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Lin, Shao Bo | - |
dc.contributor.author | Lei, Yunwen | - |
dc.contributor.author | Zhou, Ding Xuan | - |
dc.date.accessioned | 2023-08-09T03:35:45Z | - |
dc.date.available | 2023-08-09T03:35:45Z | - |
dc.date.issued | 2019 | - |
dc.identifier.citation | Journal of Machine Learning Research, 2019, v. 20 | - |
dc.identifier.issn | 1532-4435 | - |
dc.identifier.uri | http://hdl.handle.net/10722/329845 | - |
dc.description.abstract | In this paper, we introduce a learning algorithm, boosted kernel ridge regression (BKRR), that combines L2-Boosting with the kernel ridge regression (KRR). We analyze the learning performance of this algorithm in the framework of learning theory. We show that BKRR provides a new bias-variance trade-off via tuning the number of boosting iterations, which is different from KRR via adjusting the regularization parameter. A (semi-)exponential bias-variance trade-off is derived for BKRR, exhibiting a stable relationship between the generalization error and the number of iterations. Furthermore, an adaptive stopping rule is proposed, with which BKRR achieves the optimal learning rate without saturation. | - |
dc.language | eng | - |
dc.relation.ispartof | Journal of Machine Learning Research | - |
dc.subject | Boosting | - |
dc.subject | Integral operator | - |
dc.subject | Kernel ridge regression | - |
dc.subject | Learning theory | - |
dc.title | Boosted kernel ridge regression: Optimal learning rates and early stopping | - |
dc.type | Article | - |
dc.description.nature | link_to_subscribed_fulltext | - |
dc.identifier.scopus | eid_2-s2.0-85072647947 | - |
dc.identifier.volume | 20 | - |
dc.identifier.eissn | 1533-7928 | - |
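For a concrete, deliberately simplified picture of the procedure and of early stopping, the sketch below implements L2-Boosting with a closed-form Gaussian-kernel KRR base learner and a plain hold-out patience rule. This is not the authors' code: the kernel choice and the parameters `lam`, `bandwidth`, `max_iter`, and `patience` are illustrative assumptions, and the hold-out rule is only a stand-in for the paper's adaptive, data-driven stopping rule, which is what achieves the optimal learning rate without saturation.

```python
# Illustrative sketch of boosted kernel ridge regression (BKRR): L2-Boosting
# with a KRR base learner and a simple hold-out early-stopping rule.
# NOT the authors' implementation; kernel, hyperparameters, and stopping
# criterion are assumptions made for demonstration only.
import numpy as np

def gaussian_kernel(A, B, bandwidth=0.5):
    """Gaussian (RBF) kernel matrix between the rows of A and the rows of B."""
    sq = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2.0 * A @ B.T
    return np.exp(-sq / (2.0 * bandwidth**2))

def boosted_krr(X, y, X_val, y_val, lam=1e-2, bandwidth=0.5, max_iter=50, patience=3):
    """L2-Boosting with a KRR base learner: at each step, fit KRR to the current
    residuals and add the fit to the ensemble; stop once the hold-out error has
    not improved for `patience` consecutive steps (stand-in for the paper's rule)."""
    n = X.shape[0]
    K = gaussian_kernel(X, X, bandwidth)           # training Gram matrix
    K_val = gaussian_kernel(X_val, X, bandwidth)   # validation-vs-training kernel
    coef = np.zeros(n)                             # accumulated expansion coefficients a^(t)
    best_coef, best_err, since_best = coef.copy(), np.inf, 0
    for t in range(max_iter):
        residual = y - K @ coef                    # residuals of the current ensemble
        coef = coef + np.linalg.solve(K + n * lam * np.eye(n), residual)
        err = np.mean((y_val - K_val @ coef) ** 2) # hold-out error after t + 1 boosting steps
        if err < best_err:
            best_coef, best_err, since_best = coef.copy(), err, 0
        else:
            since_best += 1
            if since_best >= patience:             # early stopping
                break
    return best_coef

# Toy usage on synthetic data, purely for illustration.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 1))
y = np.sin(3.0 * X[:, 0]) + 0.1 * rng.standard_normal(200)
X_val = rng.uniform(-1.0, 1.0, size=(50, 1))
y_val = np.sin(3.0 * X_val[:, 0]) + 0.1 * rng.standard_normal(50)
coef = boosted_krr(X, y, X_val, y_val)
pred_val = gaussian_kernel(X_val, X) @ coef        # predictions of the boosted estimator
```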