The jackknife and bootstrap are the most popular data-resampling methods used in statistical analysis. The resampling methods replace theoretical derivations required in applying traditional methods (such as substitution and linearization) in statistical analysis by repeatedly resampling the original data and making inferences from the resamples. Because of the availability of inexpensive and fast computing, these computer-intensive methods have caught on very rapidly in recent years and are particularly appreciated by applied statisticians.

The primary aims of this book are (1) to provide a systematic introduction to the theory of the jackknife, the bootstrap, and other resampling methods developed in the last twenty years; (2) to provide a guide for applied statisticians: practitioners often use (or misuse) the resampling methods in situations where no theoretical confirmation has been made; and (3) to stimulate the use of the jackknife and bootstrap and further developments of the resampling methods.

The theoretical properties of the jackknife and bootstrap methods are studied in this book in an asymptotic framework. Theorems are illustrated by examples. Finite sample properties of the jackknife and bootstrap are mostly investigated by examples and/or empirical simulation studies. In addition to the theory for the jackknife and bootstrap methods in problems with independent and identically distributed (i.i.d.) data, we try to cover, as much as we can, the applications of the jackknife and bootstrap in various complicated non-i.i.d. data problems.
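To make the resampling idea concrete, the following minimal sketch (not taken from the book) estimates the variance of a sample mean from i.i.d. data with the delete-1 jackknife and with the bootstrap; the simulated data, sample size, and number of bootstrap resamples are illustrative assumptions only.

    # Minimal sketch: jackknife and bootstrap variance estimates for a sample mean.
    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.exponential(scale=2.0, size=50)  # hypothetical i.i.d. sample
    n = len(x)
    theta_hat = x.mean()  # statistic of interest: the sample mean

    # Delete-1 jackknife: recompute the statistic with each observation left out.
    jack = np.array([np.delete(x, i).mean() for i in range(n)])
    jack_var = (n - 1) / n * np.sum((jack - jack.mean()) ** 2)

    # Bootstrap: recompute the statistic on B resamples drawn with replacement.
    B = 2000
    boot = np.array([rng.choice(x, size=n, replace=True).mean() for _ in range(B)])
    boot_var = boot.var(ddof=1)

    print(f"jackknife variance estimate: {jack_var:.4f}")
    print(f"bootstrap variance estimate: {boot_var:.4f}")
    print(f"plug-in variance estimate:   {x.var(ddof=1) / n:.4f}")

For the sample mean all three estimates agree closely; the point of the resampling methods is that the same recipe applies to statistics for which no simple theoretical variance formula is available.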
Contents
1. Introduction: 1.1 Statistics and Their Sampling Distributions; 1.2 The Traditional Approach; 1.3 The Jackknife; 1.4 The Bootstrap; 1.5 Extensions to Complex Problems; 1.6 Scope of Our Studies.
2. Theory for the Jackknife: 2.1 Variance Estimation for Functions of Means; 2.2 Variance Estimation for Functionals; 2.3 The Delete-d Jackknife; 2.4 Other Applications; 2.5 Conclusions and Discussions.
3. Theory for the Bootstrap: 3.1 Techniques in Proving Consistency; 3.2 Consistency: Some Major Results; 3.3 Accuracy and Asymptotic Comparisons; 3.4 Fixed Sample Performance; 3.5 Smoothed Bootstrap; 3.6 Nonregular Cases; 3.7 Conclusions and Discussions.
4. Bootstrap Confidence Sets and Hypothesis Tests: 4.1 Bootstrap Confidence Sets; 4.2 Asymptotic Theory; 4.3 The Iterative Bootstrap and Other Methods; 4.4 Empirical Comparisons; 4.5 Bootstrap Hypothesis Tests; 4.6 Conclusions and Discussions.
5. Computational Methods: 5.1 The Delete-1 Jackknife; 5.2 The Delete-d Jackknife; 5.3 Analytic Approaches for the Bootstrap; 5.4 Simulation Approaches for the Bootstrap; 5.5 Conclusions and Discussions.
6. Applications to Sample Surveys: 6.1 Sampling Designs and Estimates; 6.2 Resampling Methods; 6.3 Comparisons by Simulation; 6.4 Asymptotic Results; 6.5 Resampling Under Imputation; 6.6 Conclusions and Discussions.
7. Applications to Linear Models: 7.1 Linear Models and Regression Estimates; 7.2 Variance and Bias Estimation; 7.3 Inference and Prediction Using the Bootstrap; 7.4 Model Selection; 7.5 Asymptotic Theory; 7.6 Conclusions and Discussions.
8. Applications to Nonlinear, Nonparametric, and Multivariate Models: 8.1 Nonlinear Regression; 8.2 Generalized Linear Models; 8.3 Cox's Regression Models; 8.4 Kernel Density Estimation; 8.5 Nonparametric Regression; 8.6 Multivariate Analysis; 8.7 Conclusions and Discussions.
9. Applications to Time Series and Other Dependent Data: 9.1 m-Dependent Data; 9.2 Markov Chains; 9.3 Autoregressive Time Series; 9.4 Other Time Series; 9.5 Stationary Processes; 9.6 Conclusions and Discussions.
10. Bayesian Bootstrap and Random Weighting: 10.1 Bayesian Bootstrap; 10.2 Random Weighting; 10.3 Random Weighting for Functional and Linear Models; 10.4 Empirical Results for Random Weighting; 10.5 Conclusions and Discussions.
Appendix A. Asymptotic Results: A.1 Modes of Convergence; A.2 Convergence of Transformations; A.4 The Borel-Cantelli Lemma; A.5 The Law of Large Numbers; A.6 The Law of the Iterated Logarithm; A.7 Uniform Integrability; A.8 The Central Limit Theorem; A.9 The Berry-Esséen Theorem; A.10 Edgeworth Expansions; A.11 Cornish-Fisher Expansions.
Appendix B. Notation.
References.
Author Index.
Springer Book Archives

Product details

ISBN: 9781461269038
Published: 2012-10-04
Publisher: Springer-Verlag New York Inc.
Height: 235 mm
Width: 155 mm
Audience level: Research, P, 06
Language: English
Format: Paperback