Principles of Statistics (D)
Part IB Statistics is essential.
The Likelihood Principle
- Basic inferential principles. Likelihood and score functions, Fisher information, Cramér–Rao lower bound, review of multivariate normal distribution. Maximum likelihood estimators and their asymptotic properties: stochastic convergence concepts, consistency, efficiency, asymptotic normality. Wald, score and likelihood ratio tests, confidence sets, Wilks’ theorem, profile likelihood. Examples. [8]
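As a small illustration of the maximum likelihood and Cramér–Rao material above, the following sketch (not part of the schedule; all parameter choices are illustrative) fits an Exponential rate by maximum likelihood and checks by simulation that the estimator's variance approaches the Cramér–Rao bound:

```python
import random

# Illustrative sketch: MLE for the rate of an Exponential(rate) sample.
# The MLE is rate_hat = 1 / (sample mean); the Fisher information per
# observation is I(rate) = 1 / rate**2, so the Cramér–Rao lower bound
# for n i.i.d. observations is rate**2 / n. We compare the Monte Carlo
# variance of the MLE against that bound.

random.seed(0)
rate, n, reps = 2.0, 500, 2000

estimates = []
for _ in range(reps):
    sample = [random.expovariate(rate) for _ in range(n)]
    estimates.append(n / sum(sample))  # MLE: reciprocal of sample mean

mean_hat = sum(estimates) / reps
var_hat = sum((e - mean_hat) ** 2 for e in estimates) / reps
crlb = rate ** 2 / n  # Cramér–Rao lower bound

print(mean_hat, var_hat, crlb)
```

The simulated variance sits close to the bound, consistent with the asymptotic efficiency of the MLE.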
Bayesian Inference
- Prior and posterior distributions. Conjugate families, improper priors, predictive distributions. Asymptotic theory for posterior distributions. Point estimation, credible regions, hypothesis testing and Bayes factors. [3]
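The conjugate-family idea above can be shown in a few lines; this sketch (data values are illustrative) performs the standard Beta–Binomial update:

```python
# Illustrative sketch: conjugate Beta prior for a Binomial success
# probability. With prior Beta(a, b) and k successes in n trials, the
# posterior is Beta(a + k, b + n - k); its mean is a simple point estimate.

a, b = 1.0, 1.0   # Beta(1, 1), i.e. uniform, prior
n, k = 20, 14     # observed: 14 successes in 20 trials

post_a, post_b = a + k, b + n - k       # conjugate posterior update
post_mean = post_a / (post_a + post_b)  # posterior mean of p

print(post_a, post_b, post_mean)
```

The posterior mean pulls the raw proportion 14/20 slightly toward the prior mean 1/2, the usual shrinkage effect of a proper prior.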
Decision Theory
- Basic elements of a decision problem, including loss and risk functions. Decision rules, admissibility, minimax and Bayes rules. Finite decision problems, risk set. Stein estimator. [4]
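The Stein estimator mentioned above can be seen at work by simulation; this sketch (dimension and true mean are illustrative choices) compares the squared-error risk of the positive-part James–Stein estimator against the naive estimator X for a multivariate normal mean:

```python
import random

# Illustrative sketch: risk of the positive-part James-Stein estimator
# versus the naive estimator X for the mean of a p-variate N(theta, I),
# p >= 3. The naive risk is p; shrinking toward 0 dominates it.

random.seed(0)
p, reps = 10, 5000
theta = [0.5] * p  # true mean vector (illustrative)

risk_naive = risk_js = 0.0
for _ in range(reps):
    x = [t + random.gauss(0.0, 1.0) for t in theta]
    norm2 = sum(xi * xi for xi in x)
    shrink = max(0.0, 1.0 - (p - 2) / norm2)  # positive-part shrinkage
    js = [shrink * xi for xi in x]
    risk_naive += sum((xi - t) ** 2 for xi, t in zip(x, theta))
    risk_js += sum((ji - t) ** 2 for ji, t in zip(js, theta))

risk_naive /= reps
risk_js /= reps
print(risk_naive, risk_js)  # naive risk near p = 10; JS risk strictly smaller
```

The simulation reproduces the inadmissibility of the naive estimator under squared-error loss for p >= 3.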
Multivariate Analysis
- Correlation coefficient and distribution of its sample version in a bivariate normal population. Partial correlation coefficients. Classification problems, linear discriminant analysis. Principal component analysis. [5]
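The sample and partial correlation coefficients above follow standard formulas; this sketch (the data-generating setup is illustrative) shows that two variables driven by a common factor are highly correlated marginally, but nearly uncorrelated after partialling out that factor:

```python
import math
import random

# Illustrative sketch: sample correlation and the first-order partial
# correlation r_{12.3} = (r12 - r13*r23) / sqrt((1 - r13^2)(1 - r23^2)),
# the correlation of variables 1 and 2 with the linear effect of
# variable 3 removed.

def corr(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

random.seed(0)
z = [random.gauss(0, 1) for _ in range(1000)]      # common factor
x = [zi + 0.3 * random.gauss(0, 1) for zi in z]    # variable 1
y = [zi + 0.3 * random.gauss(0, 1) for zi in z]    # variable 2

r12, r13, r23 = corr(x, y), corr(x, z), corr(y, z)
partial = (r12 - r13 * r23) / math.sqrt((1 - r13 ** 2) * (1 - r23 ** 2))
print(r12, partial)  # r12 is large; the partial correlation given z is near 0
```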
Nonparametric Inference and Monte Carlo Techniques
- Glivenko–Cantelli theorem, Kolmogorov–Smirnov tests and confidence bands. Bootstrap methods: jackknife, roots (pivots), parametric and nonparametric bootstrap. Monte Carlo simulation and the Gibbs sampler. [4]
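The nonparametric bootstrap above is easy to sketch; the following example (sample size and resample count are illustrative) estimates the standard error of the sample median by resampling with replacement:

```python
import random
import statistics

# Illustrative sketch: nonparametric bootstrap standard error of the
# sample median. Resample the data with replacement B times, recompute
# the median each time, and take the standard deviation across resamples.

random.seed(0)
data = [random.expovariate(1.0) for _ in range(200)]
B = 2000

boot_medians = [
    statistics.median(random.choices(data, k=len(data))) for _ in range(B)
]
se_boot = statistics.stdev(boot_medians)

print(statistics.median(data), se_boot)
```

For this Exponential(1) sample the bootstrap standard error is close to the asymptotic value 1/(2 f(m) sqrt(n)) with m the true median, illustrating why the bootstrap is a practical substitute for such delta-method calculations.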
Appropriate books
- G. Casella and R.L. Berger Statistical Inference. Duxbury (2001)
- A.W. van der Vaart Asymptotic Statistics. CUP (1998)
- T.W. Anderson An Introduction to Multivariate Statistical Analysis. Wiley (2003)
- E.L. Lehmann and G. Casella Theory of Point Estimation. Springer (1998)
Associated GitHub Page
https://jaircambridge.github.io/Principles-of-Statistics-D/