Article: Generalization ability of fractional polynomial models

Title: Generalization ability of fractional polynomial models
Authors: Lei, Yunwen; Ding, Lixin; Ding, Yiming
Keywords: Approximation theory; Fractional polynomial; Learning algorithm; Learning theory; Model selection
Issue Date: 2014
Citation: Neural Networks, 2014, v. 49, p. 59-73
Abstract: In this paper, the problem of learning the functional dependency between input and output variables from scattered data using fractional polynomial models (FPM) is investigated. The estimation error bounds are obtained by calculating the pseudo-dimension of FPM, which is shown to be equal to that of sparse polynomial models (SPM). A linear decay of the approximation error is obtained for a class of target functions which are dense in the space of continuous functions. We derive a structural risk analogous to the Schwartz Criterion and demonstrate theoretically that the model minimizing this structural risk can achieve a favorable balance between estimation and approximation errors. An empirical model selection comparison is also performed to justify the usage of this structural risk in selecting the optimal complexity index from the data. We show that the construction of FPM can be efficiently addressed by the variable projection method. Furthermore, our empirical study implies that FPM could attain better generalization performance when compared with SPM and cubic splines. © 2013 Elsevier Ltd.
Persistent Identifier: http://hdl.handle.net/10722/329821
ISSN: 0893-6080
2023 Impact Factor: 6.0
2023 SCImago Journal Rankings: 2.605
ISI Accession Number ID: WOS:000331130000008
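The abstract above notes that the construction of FPM can be handled efficiently by the variable projection method: for fixed fractional powers the model is linear in its coefficients, so the coefficients can be eliminated by linear least squares and the remaining optimization runs over the powers alone. The sketch below illustrates that idea for a model of the form y ≈ β0 + β1·x^p1 + β2·x^p2. The synthetic data, the choice of two powers, the power bounds, and the use of L-BFGS-B are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of fitting a fractional polynomial model (FPM) by variable
# projection. Design choices (two powers, synthetic data, L-BFGS-B over the
# powers) are illustrative assumptions, not the authors' implementation.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Synthetic scattered data on (0, 1]: y = 2*x^0.5 - x^1.7 + noise
x = rng.uniform(0.05, 1.0, size=200)
y = 2.0 * x**0.5 - x**1.7 + 0.05 * rng.standard_normal(x.size)

def design(x, powers):
    """Design matrix: a constant column plus one column x**p per power."""
    return np.column_stack([np.ones_like(x)] + [x**p for p in powers])

def projected_residual(powers, x, y):
    """Variable projection: for fixed powers, the linear coefficients are
    eliminated by least squares, leaving a residual that depends on the
    powers alone."""
    A = design(x, powers)
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    r = y - A @ coef
    return 0.5 * np.dot(r, r)

# Optimize the two fractional powers; positive x keeps x**p well defined.
res = minimize(projected_residual, x0=np.array([1.0, 2.0]),
               args=(x, y), method="L-BFGS-B",
               bounds=[(-2.0, 3.0)] * 2)

best_powers = res.x
best_coef, *_ = np.linalg.lstsq(design(x, best_powers), y, rcond=None)
print("estimated powers:", best_powers)
print("estimated coefficients (intercept first):", best_coef)
```

In the paper's setting, the number of fractional terms (the complexity index) would itself be chosen from the data by minimizing the derived structural risk rather than being fixed in advance as it is in this sketch.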

 

DC Field | Value | Language
dc.contributor.author | Lei, Yunwen | -
dc.contributor.author | Ding, Lixin | -
dc.contributor.author | Ding, Yiming | -
dc.date.accessioned | 2023-08-09T03:35:34Z | -
dc.date.available | 2023-08-09T03:35:34Z | -
dc.date.issued | 2014 | -
dc.identifier.citation | Neural Networks, 2014, v. 49, p. 59-73 | -
dc.identifier.issn | 0893-6080 | -
dc.identifier.uri | http://hdl.handle.net/10722/329821 | -
dc.description.abstract | In this paper, the problem of learning the functional dependency between input and output variables from scattered data using fractional polynomial models (FPM) is investigated. The estimation error bounds are obtained by calculating the pseudo-dimension of FPM, which is shown to be equal to that of sparse polynomial models (SPM). A linear decay of the approximation error is obtained for a class of target functions which are dense in the space of continuous functions. We derive a structural risk analogous to the Schwartz Criterion and demonstrate theoretically that the model minimizing this structural risk can achieve a favorable balance between estimation and approximation errors. An empirical model selection comparison is also performed to justify the usage of this structural risk in selecting the optimal complexity index from the data. We show that the construction of FPM can be efficiently addressed by the variable projection method. Furthermore, our empirical study implies that FPM could attain better generalization performance when compared with SPM and cubic splines. © 2013 Elsevier Ltd. | -
dc.language | eng | -
dc.relation.ispartof | Neural Networks | -
dc.subject | Approximation theory | -
dc.subject | Fractional polynomial | -
dc.subject | Learning algorithm | -
dc.subject | Learning theory | -
dc.subject | Model selection | -
dc.title | Generalization ability of fractional polynomial models | -
dc.type | Article | -
dc.description.nature | link_to_subscribed_fulltext | -
dc.identifier.doi | 10.1016/j.neunet.2013.09.009 | -
dc.identifier.pmid | 24140985 | -
dc.identifier.scopus | eid_2-s2.0-84886078527 | -
dc.identifier.volume | 49 | -
dc.identifier.spage | 59 | -
dc.identifier.epage | 73 | -
dc.identifier.eissn | 1879-2782 | -
dc.identifier.isi | WOS:000331130000008 | -
