Links for fulltext (may require subscription):
- Publisher Website (DOI): 10.1016/j.neucom.2013.03.033
- Scopus: eid_2-s2.0-84881553680
- WOS: WOS:000323851800031
Article: Universal learning using free multivariate splines
Title | Universal learning using free multivariate splines |
---|---|
Authors | Lei, Yunwen; Ding, Lixin; Wu, Weili |
Keywords | Complexity regularization; Free multivariate spline; Learning theory; Rademacher complexity |
Issue Date | 2013 |
Citation | Neurocomputing, 2013, v. 119, p. 253-263 How to Cite? |
Abstract | This paper discusses the problem of universal learning using free multivariate splines of order 1. Universal means that the learning algorithm involves no a priori assumption on the regularity of the target function. We characterize the complexity of the space of free multivariate splines by the notion of Rademacher complexity, from which a penalized empirical risk is constructed as an estimate of the expected risk of the candidate model. Our Rademacher complexity bounds are tight within a logarithmic factor. It is shown that the prediction rule minimizing the penalized empirical risk achieves a favorable balance between the approximation and estimation errors. By resorting to powerful techniques in approximation theory to bound the approximation error, we also derive bounds on the generalization error in terms of the sample size, for a large class of loss functions. © 2013 Elsevier B.V. |
Persistent Identifier | http://hdl.handle.net/10722/329282 |
ISSN | 0925-2312 (2023 Impact Factor: 5.5; 2023 SCImago Journal Rankings: 1.815) |
ISI Accession Number ID | WOS:000323851800031 |
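The abstract's central tool is a data-dependent complexity penalty: the empirical Rademacher complexity of the candidate function class, used to penalize the empirical risk. The quantity can be illustrated with a minimal Monte Carlo sketch for a finite hypothesis class evaluated on a fixed sample (an illustrative example only, not the paper's estimator; the function name and setup are assumptions):

```python
import numpy as np

def empirical_rademacher(predictions, n_draws=1000, seed=0):
    """Monte Carlo estimate of the empirical Rademacher complexity
    E_sigma[ sup_h (1/n) sum_i sigma_i * h(x_i) ] for a finite class.

    predictions: array of shape (n_hypotheses, n_samples), the values
                 h(x_i) of each hypothesis h on the fixed sample.
    """
    rng = np.random.default_rng(seed)
    _, n = predictions.shape
    total = 0.0
    for _ in range(n_draws):
        sigma = rng.choice([-1.0, 1.0], size=n)   # i.i.d. Rademacher signs
        total += np.max(predictions @ sigma) / n  # sup over the class
    return total / n_draws
```

A richer class correlates better with random signs and thus gets a larger penalty; for example, the trivial class containing only the zero function has complexity exactly 0, while the class {+1, -1} (constant functions) has strictly positive complexity. In the penalized-ERM scheme the abstract describes, such a complexity term is added to the empirical risk of each candidate model before minimizing.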
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Lei, Yunwen | - |
dc.contributor.author | Ding, Lixin | - |
dc.contributor.author | Wu, Weili | - |
dc.date.accessioned | 2023-08-09T03:31:41Z | - |
dc.date.available | 2023-08-09T03:31:41Z | - |
dc.date.issued | 2013 | - |
dc.identifier.citation | Neurocomputing, 2013, v. 119, p. 253-263 | - |
dc.identifier.issn | 0925-2312 | - |
dc.identifier.uri | http://hdl.handle.net/10722/329282 | - |
dc.description.abstract | This paper discusses the problem of universal learning using free multivariate splines of order 1. Universal means that the learning algorithm involves no a priori assumption on the regularity of the target function. We characterize the complexity of the space of free multivariate splines by the notion of Rademacher complexity, from which a penalized empirical risk is constructed as an estimate of the expected risk of the candidate model. Our Rademacher complexity bounds are tight within a logarithmic factor. It is shown that the prediction rule minimizing the penalized empirical risk achieves a favorable balance between the approximation and estimation errors. By resorting to powerful techniques in approximation theory to bound the approximation error, we also derive bounds on the generalization error in terms of the sample size, for a large class of loss functions. © 2013 Elsevier B.V. | - |
dc.language | eng | - |
dc.relation.ispartof | Neurocomputing | - |
dc.subject | Complexity regularization | - |
dc.subject | Free multivariate spline | - |
dc.subject | Learning theory | - |
dc.subject | Rademacher complexity | - |
dc.title | Universal learning using free multivariate splines | - |
dc.type | Article | - |
dc.description.nature | link_to_subscribed_fulltext | - |
dc.identifier.doi | 10.1016/j.neucom.2013.03.033 | - |
dc.identifier.scopus | eid_2-s2.0-84881553680 | - |
dc.identifier.volume | 119 | - |
dc.identifier.spage | 253 | - |
dc.identifier.epage | 263 | - |
dc.identifier.eissn | 1872-8286 | - |
dc.identifier.isi | WOS:000323851800031 | - |