Boosted Kernel Ridge Regression: Optimal Learning Rates and Early Stopping
Field | Value |
---|---|
Title | Boosted Kernel Ridge Regression: Optimal Learning Rates and Early Stopping |
Authors | Lin, Shao-Bo; Lei, Yunwen; Zhou, Ding-Xuan |
Issue Date | 1-Feb-2019 |
Publisher | Journal of Machine Learning Research |
Citation | Journal of Machine Learning Research, 2019, v. 20, n. 46 |
Abstract | In this paper, we introduce a learning algorithm, boosted kernel ridge regression (BKRR), that combines L2-Boosting with kernel ridge regression (KRR). We analyze the learning performance of this algorithm in the framework of learning theory. We show that BKRR provides a new bias-variance trade-off via tuning the number of boosting iterations, which differs from the trade-off that KRR obtains by adjusting the regularization parameter. A (semi-)exponential bias-variance trade-off is derived for BKRR, exhibiting a stable relationship between the generalization error and the number of iterations. Furthermore, an adaptive stopping rule is proposed, with which BKRR achieves the optimal learning rate without saturation. (See the illustrative sketch after this table.) |
Persistent Identifier | http://hdl.handle.net/10722/354528 |
ISSN | 1532-4435 (2023 Impact Factor: 4.3; 2023 SCImago Journal Rankings: 2.796) |
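The abstract describes BKRR only at a high level. Below is a minimal, hypothetical Python sketch of the idea as it reads there: kernel ridge regression is refit to the current residuals at each boosting step, and the number of boosting iterations is chosen by early stopping. The Gaussian kernel, the hold-out stopping criterion, and all names and parameter values (`gaussian_kernel`, `bkrr_fit`, `lam`, `bandwidth`) are assumptions for illustration, not taken from the paper; in particular, the hold-out rule here is only a stand-in for the paper's adaptive stopping rule.

```python
import numpy as np

def gaussian_kernel(A, B, bandwidth=1.0):
    """Gaussian (RBF) kernel matrix between the rows of A and the rows of B."""
    sq_dist = (np.sum(A**2, axis=1)[:, None]
               + np.sum(B**2, axis=1)[None, :]
               - 2.0 * A @ B.T)
    return np.exp(-sq_dist / (2.0 * bandwidth**2))

def bkrr_fit(X, y, X_val, y_val, lam=0.1, max_iter=200, bandwidth=1.0):
    """L2-Boosting with a KRR base learner, stopped early on hold-out error.

    Returns coefficients alpha so that f(x) = k(x, X) @ alpha.
    """
    n = len(y)
    K = gaussian_kernel(X, X, bandwidth)          # kernel matrix on training inputs
    K_val = gaussian_kernel(X_val, X, bandwidth)  # kernel between validation and training inputs
    A = K + n * lam * np.eye(n)                   # KRR system matrix (K + n*lambda*I)
    alpha = np.zeros(n)
    residual = y.astype(float).copy()
    best_alpha, best_err = alpha.copy(), np.inf
    for _ in range(max_iter):
        alpha = alpha + np.linalg.solve(A, residual)   # KRR fit to the current residuals
        residual = y - K @ alpha                       # residuals of the boosted predictor
        err = np.mean((y_val - K_val @ alpha) ** 2)    # hold-out mean squared error
        if err < best_err:
            best_alpha, best_err = alpha.copy(), err
        else:
            break   # hold-out error stopped improving: early stopping
    return best_alpha

def bkrr_predict(X_train, alpha, X_new, bandwidth=1.0):
    """Evaluate the boosted estimator at new inputs."""
    return gaussian_kernel(X_new, X_train, bandwidth) @ alpha
```

In this sketch, running more boosting iterations lowers bias but raises variance, so the hold-out rule fixes the iteration count; the paper instead analyzes an adaptive, data-driven stopping rule and proves that BKRR with it attains the optimal learning rate without saturation.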
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Lin, Shao-Bo | - |
dc.contributor.author | Lei, Yunwen | - |
dc.contributor.author | Zhou, Ding-Xuan | - |
dc.date.accessioned | 2025-02-12T00:35:17Z | - |
dc.date.available | 2025-02-12T00:35:17Z | - |
dc.date.issued | 2019-02-01 | - |
dc.identifier.citation | Journal of Machine Learning Research, 2019, v. 20, n. 46 | - |
dc.identifier.issn | 1532-4435 | - |
dc.identifier.uri | http://hdl.handle.net/10722/354528 | - |
dc.description.abstract | In this paper, we introduce a learning algorithm, boosted kernel ridge regression (BKRR), that combines L2-Boosting with kernel ridge regression (KRR). We analyze the learning performance of this algorithm in the framework of learning theory. We show that BKRR provides a new bias-variance trade-off via tuning the number of boosting iterations, which differs from the trade-off that KRR obtains by adjusting the regularization parameter. A (semi-)exponential bias-variance trade-off is derived for BKRR, exhibiting a stable relationship between the generalization error and the number of iterations. Furthermore, an adaptive stopping rule is proposed, with which BKRR achieves the optimal learning rate without saturation. | -
dc.language | eng | - |
dc.publisher | Journal of Machine Learning Research | - |
dc.relation.ispartof | Journal of Machine Learning Research | - |
dc.title | Boosted Kernel Ridge Regression: Optimal Learning Rates and Early Stopping | - |
dc.type | Article | - |
dc.identifier.volume | 20 | - |
dc.identifier.issue | 46 | - |
dc.identifier.eissn | 1533-7928 | - |
dc.identifier.issnl | 1532-4435 | - |