Links for fulltext (may require subscription):
- Publisher Website (DOI): 10.1016/j.csda.2016.07.012
- Scopus: eid_2-s2.0-84983470998
- Web of Science: WOS:000385604500010
Article: Asymptotically Optimal Differenced Estimators of Error Variance in Nonparametric Regression
Title | Asymptotically Optimal Differenced Estimators of Error Variance in Nonparametric Regression |
---|---|
Authors | Wang, W; Yu, P |
Keywords | Bias correction; Difference order; Error estimation; Kernel estimation; Optimal difference sequence; Quadratic form; Taylor expansion |
Issue Date | 2017 |
Publisher | Elsevier BV. The Journal's web site is located at http://www.elsevier.com/locate/csda |
Citation | Computational Statistics & Data Analysis, 2017, v. 105, p. 125-143 |
Abstract | The existing differenced estimators of error variance in nonparametric regression are interpreted as kernel estimators, and some requirements for a "good" estimator of error variance are specified. A new differenced method is then proposed that estimates the errors as the intercepts in a sequence of simple linear regressions and constructs a variance estimator based on the estimated errors. The new estimator satisfies the requirements for a "good" estimator and achieves the asymptotically optimal mean square error. A feasible difference order is also derived, which makes the estimator more applicable. To improve the finite-sample performance, two bias-corrected versions are further proposed. All three estimators are equivalent to some local polynomial estimators and thus can be interpreted as kernel estimators. To determine which of the three estimators should be used in practice, a rule of thumb is provided by an analysis of the mean square error, which solves an open problem in error variance estimation: which difference sequence to use in finite samples. Simulation studies and a real data application corroborate the theoretical results and illustrate the advantages of the new method compared with the existing methods. |
Persistent Identifier | http://hdl.handle.net/10722/229650 |
ISSN | 0167-9473 (2023 Impact Factor: 1.5; 2023 SCImago Journal Rankings: 1.008) |
ISI Accession Number ID | WOS:000385604500010 |
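For background on the differenced estimators the abstract refers to: the simplest member of this family is the classical first-order difference-based estimator (Rice, 1984), which estimates the error variance from squared successive differences of the responses, since differencing removes a smooth regression trend. This is a minimal illustrative sketch of that classical baseline, not the paper's new intercept-based method.

```python
import numpy as np

def rice_variance_estimator(y):
    """First-order differenced estimator of error variance (Rice, 1984):
    sum of squared successive differences divided by 2(n-1).
    Differencing cancels a smooth trend, so what remains is mostly noise."""
    y = np.asarray(y, dtype=float)
    d = np.diff(y)  # successive differences y[i+1] - y[i]
    return np.sum(d ** 2) / (2 * (len(y) - 1))

# Smooth trend plus Gaussian noise with variance 0.3**2 = 0.09.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 500)
y = np.sin(2 * np.pi * x) + rng.normal(0.0, 0.3, size=x.size)
print(rice_variance_estimator(y))  # close to the true variance 0.09
```

Higher-order differenced estimators replace `np.diff(y)` with a weighted difference sequence; choosing that sequence well is exactly the question the article addresses.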
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Wang, W | - |
dc.contributor.author | Yu, P | - |
dc.date.accessioned | 2016-08-23T14:12:25Z | - |
dc.date.available | 2016-08-23T14:12:25Z | - |
dc.date.issued | 2017 | - |
dc.identifier.citation | Computational Statistics & Data Analysis, 2017, v. 105, p. 125-143 | - |
dc.identifier.issn | 0167-9473 | - |
dc.identifier.uri | http://hdl.handle.net/10722/229650 | - |
dc.description.abstract | The existing differenced estimators of error variance in nonparametric regression are interpreted as kernel estimators, and some requirements for a "good" estimator of error variance are specified. A new differenced method is then proposed that estimates the errors as the intercepts in a sequence of simple linear regressions and constructs a variance estimator based on the estimated errors. The new estimator satisfies the requirements for a "good" estimator and achieves the asymptotically optimal mean square error. A feasible difference order is also derived, which makes the estimator more applicable. To improve the finite-sample performance, two bias-corrected versions are further proposed. All three estimators are equivalent to some local polynomial estimators and thus can be interpreted as kernel estimators. To determine which of the three estimators should be used in practice, a rule of thumb is provided by an analysis of the mean square error, which solves an open problem in error variance estimation: which difference sequence to use in finite samples. Simulation studies and a real data application corroborate the theoretical results and illustrate the advantages of the new method compared with the existing methods. | - |
dc.language | eng | - |
dc.publisher | Elsevier BV. The Journal's web site is located at http://www.elsevier.com/locate/csda | - |
dc.relation.ispartof | Computational Statistics & Data Analysis | - |
dc.rights | Posting accepted manuscript (postprint): © <year>. This manuscript version is made available under the CC-BY-NC-ND 4.0 license http://creativecommons.org/licenses/by-nc-nd/4.0/ | - |
dc.subject | Bias correction | - |
dc.subject | Difference order | - |
dc.subject | Error estimation | - |
dc.subject | Kernel estimation | - |
dc.subject | Optimal difference sequence | - |
dc.subject | Quadratic form | - |
dc.subject | Taylor expansion | - |
dc.title | Asymptotically Optimal Differenced Estimators of Error Variance in Nonparametric Regression | - |
dc.type | Article | - |
dc.identifier.email | Yu, P: pingyu@hku.hk | - |
dc.identifier.authority | Yu, P=rp01941 | - |
dc.identifier.doi | 10.1016/j.csda.2016.07.012 | - |
dc.identifier.scopus | eid_2-s2.0-84983470998 | - |
dc.identifier.hkuros | 262450 | - |
dc.identifier.volume | 105 | - |
dc.identifier.spage | 125 | - |
dc.identifier.epage | 143 | - |
dc.identifier.isi | WOS:000385604500010 | - |
dc.publisher.place | Netherlands | - |
dc.identifier.issnl | 0167-9473 | - |