Better Bootstrap Confidence Intervals for Regression Curve Estimation
- 1 January 1995
- Research article
- Published by Taylor & Francis in Statistics
- Vol. 26 (4), 287-306
- https://doi.org/10.1080/02331889508802498
Abstract
Bootstrap methods in curve estimation have been introduced for smoothing parameter selection and for the construction of confidence intervals. Most papers on confidence intervals use explicit bias estimation or the technique of "undersmoothing" to deal with bias, and coverage accuracy has so far only been considered for curve estimates with a constant variance function. In this paper we show that explicit bias estimation, even under a heteroscedastic variance structure, leads to better coverage accuracy than undersmoothing. Bootstrapping with this bias correction via the so-called wild bootstrap yields improved coverage accuracy.
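For illustration, the sketch below shows one standard construction of the kind of interval discussed in the abstract: a wild-bootstrap pointwise confidence interval for a Nadaraya-Watson estimate, with bias handled explicitly through an oversmoothed pilot fit (in the style of Härdle and Marron's construction). This is an assumed, minimal example rather than the authors' procedure; the Gaussian kernel, the bandwidths h and g, and the number of replications B are illustrative choices, not values from the paper.

```python
# Minimal sketch (not the authors' code): wild-bootstrap pointwise confidence
# intervals for a Nadaraya-Watson regression estimate, with the bias of the
# h-bandwidth estimate captured via an oversmoothed pilot fit at bandwidth g.
import numpy as np

def nw(x0, x, y, bw):
    """Nadaraya-Watson estimate at the points x0 using a Gaussian kernel."""
    w = np.exp(-0.5 * ((x0[:, None] - x[None, :]) / bw) ** 2)
    return (w @ y) / w.sum(axis=1)

def wild_bootstrap_ci(x, y, x0, h=0.05, g=0.15, B=500, alpha=0.05, seed=None):
    rng = np.random.default_rng(seed)
    m_h = nw(x0, x, y, h)            # curve estimate at bandwidth h
    m_g_x = nw(x, x, y, g)           # oversmoothed pilot fit at the data points
    m_g_x0 = nw(x0, x, y, g)         # pilot fit at the evaluation points
    resid = y - m_g_x                # residuals, heteroscedasticity preserved
    # Mammen's two-point distribution: mean 0, second and third moments equal 1.
    a, b = (1 - np.sqrt(5)) / 2, (1 + np.sqrt(5)) / 2
    p = (np.sqrt(5) + 1) / (2 * np.sqrt(5))
    diffs = np.empty((B, x0.size))
    for i in range(B):
        v = np.where(rng.random(x.size) < p, a, b)
        y_star = m_g_x + resid * v   # wild-bootstrap sample around the pilot fit
        # m_h*(x) - m_g(x) mimics m_h(x) - m(x), bias included, since g > h.
        diffs[i] = nw(x0, x, y_star, h) - m_g_x0
    lo, hi = np.quantile(diffs, [alpha / 2, 1 - alpha / 2], axis=0)
    return m_h - hi, m_h - lo        # pointwise (1 - alpha) confidence band

# Toy usage with a heteroscedastic error structure.
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 1, 200))
y = np.sin(2 * np.pi * x) + (0.1 + 0.3 * x) * rng.normal(size=200)
lower, upper = wild_bootstrap_ci(x, y, np.linspace(0.1, 0.9, 50), seed=1)
```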