Monte Carlo and Bootstrap Methods

8 videos • by Fourth Z

This is Playlist 7 in the Nonparametric Statistics for the Behavioral, Social and Medical Sciences series. The general idea of Monte Carlo and bootstrap methods is to use a computer to simulate building the sampling distribution of a statistic. Sampling distributions form the foundation of statistical inference, so these simulation methods lead to a new way of thinking about making inferences. We can study the properties of statistical estimators under various distributional forms, we can see where exact methods would lead even when full enumeration is too large to be feasible, and we can generate populations from samples rather than the other way around. All of this means that we can consider inference in the very specific context of our study data without relying on the specific form of a mathematical model. It is a whole new world of inference. (A brief R sketch of these ideas follows the outcomes list below.)

The anticipated learning outcomes for this playlist are:

1. You can describe what makes a statistical method a Monte Carlo method.
2. You can use the Monte Carlo method to evaluate the properties of a statistic or a statistical test.
3. You can use the Monte Carlo method to compare the local relative efficiency of statistical methods.
4. You can use the Monte Carlo method to estimate the p-value of an exact test.
5. You can use select R packages to conduct inference using permutations and Monte Carlo methods.
6. You can describe how the bootstrap is a special type of Monte Carlo method.
7. You can explain the general method and purpose of bootstrapping.
8. You can distinguish between parametric and nonparametric bootstrap methods.
9. You can use parametric bootstrapping to conduct an inferential test.
10. You can use nonparametric bootstrapping to conduct an inferential test.
11. You can describe how Monte Carlo methods offer a new and flexible approach to inference.
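
To make the core idea concrete, here is a minimal sketch in base R. The playlist does not name the specific packages it uses, so only base functions appear, and all object names and the example distribution are illustrative rather than taken from the videos. The sketch builds the sampling distribution of the sample median two ways: by Monte Carlo simulation from an assumed population, and by nonparametric bootstrap resampling of a single observed sample.

```r
set.seed(2024)

n    <- 25      # sample size
reps <- 10000   # number of simulated samples / resamples

## Monte Carlo: draw repeated samples from an assumed population (here, exponential)
## and build the sampling distribution of the median empirically.
mc_medians <- replicate(reps, median(rexp(n, rate = 1)))

## Nonparametric bootstrap: treat one observed sample as the "population" and
## resample it with replacement to approximate the same sampling distribution.
observed     <- rexp(n, rate = 1)   # stands in for real study data
boot_medians <- replicate(reps, median(sample(observed, size = n, replace = TRUE)))

## Compare the two simulated sampling distributions.
summary(mc_medians)
summary(boot_medians)

## A simple percentile bootstrap confidence interval for the population median.
quantile(boot_medians, probs = c(0.025, 0.975))
```

Outcome 4, estimating the p-value of an exact test, can be sketched in the same spirit: rather than enumerating every permutation of group labels, a Monte Carlo permutation test draws a large number of random relabelings and estimates the exact p-value as the proportion of permuted statistics at least as extreme as the observed one. Again, this is an illustrative base-R sketch with made-up data, not code from the playlist.

```r
## Hypothetical two-group example: difference in means under random relabeling.
set.seed(2024)
group_a <- c(12.1, 14.3, 11.8, 15.0, 13.6, 12.9)
group_b <- c(10.4, 11.2,  9.8, 12.5, 10.9, 11.7)
pooled  <- c(group_a, group_b)
n_a     <- length(group_a)

obs_diff <- mean(group_a) - mean(group_b)

perm_diff <- replicate(10000, {
  shuffled <- sample(pooled)   # random relabeling of all observations
  mean(shuffled[1:n_a]) - mean(shuffled[-(1:n_a)])
})

## Monte Carlo estimate of the two-sided exact p-value
## (a common refinement adds 1 to both numerator and denominator).
mean(abs(perm_diff) >= abs(obs_diff))
```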