Resampling Algorithm


A Resampling Algorithm is an estimation algorithm that approximates the sampling distribution of a statistic by repeatedly drawing new samples from the observed data sample (for example, by bootstrap, jackknife, or permutation resampling).
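
A minimal sketch of this idea is shown below, assuming NumPy; the observed sample x, the sample size, and the function name resample_statistic are invented for illustration and are not taken from any cited work. It recomputes a statistic on many with-replacement resamples of the data to approximate that statistic's sampling distribution.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical observed sample; any 1-D numeric data would do.
x = rng.normal(loc=5.0, scale=2.0, size=50)

def resample_statistic(data, statistic, n_resamples=2000):
    """Approximate the sampling distribution of `statistic` by recomputing it
    on many with-replacement resamples of the observed data."""
    n = len(data)
    return np.array([
        statistic(rng.choice(data, size=n, replace=True))
        for _ in range(n_resamples)
    ])

boot_means = resample_statistic(x, np.mean)
print("estimated standard error of the sample mean:", boot_means.std(ddof=1))
```

The spread of the resampled statistics serves as an estimate of the statistic's standard error, which is the basic use case the references below develop in detail.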



References

2003

  • (Howell, 2003) ⇒ David C. Howell. (2003). http://www.uvm.edu/~dhowell/StatPages/Resampling/
  • (Lahiri, 2003) ⇒ Soumendra N. Lahiri. (2003). “Resampling Methods for Dependent Data.” Springer.
    • Book overview: This book gives a detailed account of bootstrap methods and their properties for dependent data, covering a wide range of topics such as block bootstrap methods, bootstrap methods in the frequency domain, resampling methods for long range dependent data, and resampling methods for spatial data. The first five chapters of the book treat the theory and applications of block bootstrap methods at the level of a graduate text. The rest of the book is written as a research monograph, with frequent references to the literature, but mostly at a level accessible to graduate students familiar with basic concepts in statistics. Supplemental background material is added in the discussion of such important issues as second order properties of bootstrap methods, bootstrap under long range dependence, and bootstrap for extremes and heavy tailed dependent data. Further, illustrative numerical examples are given all through the book and issues involving application of the methodology are discussed. The book fills a gap in the literature covering research on resampling methods for dependent data that has witnessed vigorous growth over the last two decades but remains scattered in various statistics and econometrics journals. It can be used as a graduate level text for a special topics course on resampling methods for dependent data and also as a research monograph for statisticians and econometricians who want to learn more about the topic and want to apply the methods in their own research. S.N. Lahiri is a professor of Statistics at the Iowa State University, is a Fellow of the Institute of Mathematical Statistics and a Fellow of the American Statistical Association.
    • Keywords: random variables, variogram, M-estimator, sample mean, Smooth Function, random vectors, autoregressive process, block size, long-range dependence, stationary process, spectral density, sampling distribution, periodogram, conditional distribution, converges in distribution, Lebesgue measure, covariance matrix, probability measures, asymptotic variance, Gaussian process
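
As a rough illustration of the block bootstrap methods described in the overview above, the sketch below implements a moving block bootstrap on a hypothetical AR(1) series; the series, the block size, and the function name moving_block_bootstrap are assumptions made for the example, not material from the book. Resampling whole blocks of consecutive observations (rather than individual points) preserves short-range dependence within each block.

```python
import numpy as np

rng = np.random.default_rng(1)

def moving_block_bootstrap(series, block_size, n_resamples=1000, statistic=np.mean):
    """Moving block bootstrap: resample overlapping blocks of consecutive
    observations, so dependence within each block is preserved."""
    n = len(series)
    n_blocks = int(np.ceil(n / block_size))
    # All overlapping blocks of length `block_size`.
    blocks = np.array([series[i:i + block_size] for i in range(n - block_size + 1)])
    stats = []
    for _ in range(n_resamples):
        chosen = blocks[rng.integers(0, len(blocks), size=n_blocks)]
        resample = chosen.reshape(-1)[:n]  # concatenate the blocks, trim to original length
        stats.append(statistic(resample))
    return np.array(stats)

# Hypothetical AR(1) series standing in for dependent data.
e = rng.normal(size=200)
y = np.empty(200)
y[0] = e[0]
for t in range(1, 200):
    y[t] = 0.6 * y[t - 1] + e[t]

boot = moving_block_bootstrap(y, block_size=10)
print("block-bootstrap standard error of the mean:", boot.std(ddof=1))
```

The choice of block size matters in practice (too small breaks the dependence structure, too large leaves few distinct blocks), which is one of the issues the book treats in depth.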

1982

  • (Efron, 1982) ⇒ Bradley Efron. (1982). “The Jackknife, the Bootstrap and Other Resampling Plans.” CBMS-NSF Regional Conference Series in Applied Mathematics.

1979

  • (Efron, 1979) ⇒ Bradley Efron. (1979). “Bootstrap Methods: Another Look at the Jackknife.” In: The Annals of Statistics, 7(1). http://www.jstor.org/stable/2958830
    • http://books.google.com/books?id=-uJ_auimaYkC&pg=PA569
    • ABSTRACT: We discuss the following problem: given a random sample X = (X1, X2, ..., Xn) from an unknown probability distribution F, estimate the sampling distribution of some prespecified random variable R(X, F), on the basis of the observed data x. (Standard jackknife theory gives an approximate mean and variance in the case R(X, F) = θ(F̂) - θ(F), θ some parameter of interest.) A general method, called the "bootstrap," is introduced, and shown to work satisfactorily on a variety of estimation problems. The jackknife is shown to be a linear approximation method for the bootstrap. The exposition proceeds by a series of examples: variance of the sample median, error rates in a linear discriminant analysis, ratio estimation, estimating regression parameters, etc.
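
As a hedged sketch of the abstract's first example (the variance of the sample median), the following compares a bootstrap estimate with the standard jackknife estimate; the sample x, its distribution, and the number of resamples B are invented for illustration and are not from the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical observed sample; the setup only assumes an i.i.d. sample
# from an unknown distribution F.
x = rng.exponential(scale=3.0, size=30)
n = len(x)

# Bootstrap: resample x with replacement and recompute the median each time.
B = 5000
boot_medians = np.array([
    np.median(rng.choice(x, size=n, replace=True)) for _ in range(B)
])
print("bootstrap variance of the sample median:", boot_medians.var(ddof=1))

# Jackknife: recompute the median with each observation left out in turn,
# then apply the usual (n - 1)/n scaling.
jack_medians = np.array([np.median(np.delete(x, i)) for i in range(n)])
jack_var = (n - 1) / n * np.sum((jack_medians - jack_medians.mean()) ** 2)
print("jackknife variance of the sample median:", jack_var)
```

The jackknife is known to behave poorly for non-smooth statistics such as the median, which is part of the motivation for the bootstrap as the more general resampling plan.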