Article: On least squares estimation for stable nonlinear AR processes

Title: On least squares estimation for stable nonlinear AR processes
Authors: Yao, JF
Keywords: Law of the iterated logarithm; Least squares estimation; Model selection; Multilayer perceptron; Nonlinear AR process
Issue Date: 2000
Publisher: Springer Verlag
Citation: Annals of the Institute of Statistical Mathematics, 2000, v. 52 n. 2, p. 316-331
Abstract: Following a Markov chain approach, this paper establishes asymptotic properties of the least squares estimator in nonlinear autoregressive (NAR) models. Based on conditions ensuring the stability of the model and allowing the use of a strong law of large numbers for a wide class of functions, our approach improves some known results on strong consistency and asymptotic normality of the estimator. The exact convergence rate is established by a law of the iterated logarithm. Based on this law and a generalized Akaike's information criterion, we build a strongly consistent procedure for selection of NAR models. Detailed results are given for familiar nonlinear AR models such as exponential AR models, threshold models, and multilayer feedforward perceptrons.
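The exponential AR (EXPAR) family mentioned in the abstract can be illustrated with a minimal sketch. This is not the paper's procedure, only a toy instance: in the model x_t = (a + b·exp(-x_{t-1}²))·x_{t-1} + ε_t, if the exponential decay rate is fixed (here at 1), the model is linear in (a, b) and the least squares estimator solves a 2×2 system of normal equations. The parameter values and function names below are illustrative assumptions.

```python
import math
import random

def simulate(a, b, n, seed=0):
    # Simulate a stable EXPAR(1) path:
    #   x_t = (a + b * exp(-x_{t-1}^2)) * x_{t-1} + eps_t,  eps_t ~ N(0, 1)
    # Stability holds here because |a| < 1, so the coefficient is
    # contractive for large |x| (the exponential term vanishes).
    rng = random.Random(seed)
    x = [0.0]
    for _ in range(n):
        prev = x[-1]
        x.append((a + b * math.exp(-prev * prev)) * prev + rng.gauss(0.0, 1.0))
    return x

def lse_expar(x):
    # Least squares with regressors z1 = x_{t-1}, z2 = x_{t-1}*exp(-x_{t-1}^2);
    # accumulate the normal equations and solve the 2x2 system directly.
    s11 = s12 = s22 = r1 = r2 = 0.0
    for prev, cur in zip(x, x[1:]):
        z1 = prev
        z2 = prev * math.exp(-prev * prev)
        s11 += z1 * z1
        s12 += z1 * z2
        s22 += z2 * z2
        r1 += z1 * cur
        r2 += z2 * cur
    det = s11 * s22 - s12 * s12
    a_hat = (s22 * r1 - s12 * r2) / det
    b_hat = (s11 * r2 - s12 * r1) / det
    return a_hat, b_hat

x = simulate(a=0.4, b=0.8, n=5000)
a_hat, b_hat = lse_expar(x)
print(a_hat, b_hat)  # estimates should be close to (0.4, 0.8) for large n
```

Strong consistency of this estimator under stability conditions (and its exact almost-sure convergence rate via a law of the iterated logarithm) is what the article establishes in a general NAR setting.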
Persistent Identifier: http://hdl.handle.net/10722/132632
ISSN: 0020-3157
2015 Impact Factor: 0.768
2015 SCImago Journal Rankings: 0.931

DC Field | Value | Language
dc.contributor.author | Yao, JF | en_HK
dc.date.accessioned | 2011-03-28T09:27:07Z | -
dc.date.available | 2011-03-28T09:27:07Z | -
dc.date.issued | 2000 | en_HK
dc.identifier.citation | Annals Of The Institute Of Statistical Mathematics, 2000, v. 52 n. 2, p. 316-331 | en_HK
dc.identifier.issn | 0020-3157 | en_HK
dc.identifier.uri | http://hdl.handle.net/10722/132632 | -
dc.description.abstract | Following a Markov chain approach, this paper establishes asymptotic properties of the least squares estimator in nonlinear autoregressive (NAR) models. Based on conditions ensuring the stability of the model and allowing the use of a strong law of large numbers for a wide class of functions, our approach improves some known results on strong consistency and asymptotic normality of the estimator. The exact convergence rate is established by a law of the iterated logarithm. Based on this law and a generalized Akaike's information criterion, we build a strongly consistent procedure for selection of NAR models. Detailed results are given for familiar nonlinear AR models such as exponential AR models, threshold models, and multilayer feedforward perceptrons. | en_HK
dc.language | eng | en_US
dc.publisher | Springer Verlag | en_US
dc.relation.ispartof | Annals of the Institute of Statistical Mathematics | en_HK
dc.subject | Law of the iterated logarithm | en_HK
dc.subject | Least squares estimation | en_HK
dc.subject | Model selection | en_HK
dc.subject | Multilayer perceptron | en_HK
dc.subject | Nonlinear AR process | en_HK
dc.title | On least squares estimation for stable nonlinear AR processes | en_HK
dc.type | Article | en_HK
dc.identifier.email | Yao, JF: jeffyao@hku.hk | en_HK
dc.identifier.authority | Yao, JF=rp01473 | en_HK
dc.description.nature | link_to_subscribed_fulltext | en_US
dc.identifier.scopus | eid_2-s2.0-6744265300 | en_HK
dc.relation.references | http://www.scopus.com/mlt/select.url?eid=2-s2.0-6744265300&selection=ref&src=s&origin=recordpage | en_HK
dc.identifier.volume | 52 | en_HK
dc.identifier.issue | 2 | en_HK
dc.identifier.spage | 316 | en_HK
dc.identifier.epage | 331 | en_HK
dc.publisher.place | Germany | en_HK
dc.identifier.scopusauthorid | Yao, JF=7403503451 | en_HK
