This course aims to provide students with a knowledge of modern Bayesian statistical inference, an understanding of the theory and application of stochastic simulation methods including MCMC, and experience of implementing the Bayesian approach in practical situations.
The course will review subjective and frequentist probability and the role of likelihood as a basis for inference, and will give a comparative treatment of Bayesian and frequentist approaches. The key concepts of practical Bayesian statistics will be covered, including: likelihood formulation; the incorporation of prior knowledge or ignorance in the prior; and the interpretation of the posterior distribution as the totality of knowledge and its use in prediction. A range of stochastic simulation methods for investigating posterior distributions will be considered, including rejection sampling and Markov chain methods such as the Metropolis-Hastings algorithm and the Gibbs sampler. The use of stochastic methods for inference in partially observed processes will be discussed, and students will gain experience of implementing these methods in computer laboratory sessions. The course will further consider the use of computational methods, especially simulation, in probability and statistics.
- Statistical programming. This will include an introduction to the use of R (and/or, potentially, other languages and packages) for probabilistic and statistical calculations, including the use of built-in simulation capabilities, iterative procedures, solution of equations and maximisation of functions.
- Philosophy of Bayesian inference. This will include treatment of subjective and frequentist probability; the role of likelihood as a basis for inference; comparative treatment of Bayesian and frequentist approaches.
- Implementing the Bayesian approach. This will include: the formulation of likelihood for a range of statistical models and sampling designs; the incorporation of prior knowledge through prior density selection; conjugacy; the use of non-informative and non-subjective priors (including the Jeffreys prior); the interpretation of the posterior distribution as the totality of knowledge; predictive distributions.
- Theory of stochastic processes. Markov chains; classification of states; irreducibility, aperiodicity and related properties; stationary distributions; generalised and detailed balance; convergence.
- Markov-chain and other stochastic methods for investigating target distributions. Ideas covered will include: simple simulation methods using transformations, distribution-function inversion and acceptance-rejection sampling; construction of MCMC methods using standard recipes such as the Metropolis (and Metropolis-Hastings) algorithm and the Gibbs sampler; implementation of methods using the R computing package; investigation of properties through simulation.
- Application of MCMC methods in Bayesian inference. Ideas covered will include: formulation of samplers for inferential problems in, e.g., pattern recognition, signal classification and population dynamics; implementation of methods using R; application to problems involving missing data; informative methods of summarising posterior densities.
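As a taste of the statistical-programming topic above, the following minimal R sketch shows built-in simulation, solution of an equation with `uniroot`, and maximisation of a function with `optimize`. The particular functions and numbers are illustrative choices, not course material.

```r
# Built-in simulation: 1000 draws from a normal distribution
set.seed(1)
x <- rnorm(1000, mean = 2, sd = 1)
m <- mean(x)                            # Monte Carlo estimate of the mean

# Solution of an equation: find x with f(x) = 0 on an interval
f <- function(x) x^2 - 2
root <- uniroot(f, c(0, 2))$root        # approximately sqrt(2)

# Maximisation of a function of one variable
g <- function(x) -(x - 3)^2
opt <- optimize(g, c(0, 10), maximum = TRUE)$maximum  # approximately 3
```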
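Conjugacy, as listed under implementing the Bayesian approach, can be illustrated by the standard Beta-Binomial update, where the Beta prior and binomial likelihood give a Beta posterior in closed form. The prior pseudo-counts and data below are an assumed toy example.

```r
# Beta(a, b) prior for a binomial success probability theta
a <- 2; b <- 2                 # prior pseudo-counts (assumed)
y <- 7; n <- 10                # observed successes out of n trials (assumed)

# Conjugate update: posterior is Beta(a + y, b + n - y)
a_post <- a + y
b_post <- b + n - y
post_mean <- a_post / (a_post + b_post)          # posterior mean
ci <- qbeta(c(0.025, 0.975), a_post, b_post)     # 95% credible interval
```

Because the posterior is available exactly here, such models are a useful check on simulation-based answers later in the course.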
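The stochastic-process theory above (stationary distributions and detailed balance) can be made concrete with a small transition matrix. This sketch, on an assumed two-state chain, finds the stationary distribution as the left eigenvector of the transition matrix for eigenvalue 1 and checks the detailed-balance identity pi_i P_ij = pi_j P_ji.

```r
# Assumed two-state transition matrix (rows sum to 1)
P <- matrix(c(0.9, 0.1,
              0.2, 0.8), nrow = 2, byrow = TRUE)

# Stationary distribution solves pi %*% P = pi:
# take the eigenvector of t(P) for eigenvalue 1 and normalise it
e <- eigen(t(P))
pi_stat <- Re(e$vectors[, 1])
pi_stat <- pi_stat / sum(pi_stat)      # here (2/3, 1/3)

# Detailed balance: pi_i P[i, j] equals pi_j P[j, i] for a reversible chain
db_gap <- abs(pi_stat[1] * P[1, 2] - pi_stat[2] * P[2, 1])
```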
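The MCMC recipes above can be previewed with a minimal random-walk Metropolis sampler, followed by the kind of posterior summaries discussed in the final topic. The target (a standard normal), the proposal scale and the burn-in length are all assumptions for illustration, not a prescription from the course.

```r
set.seed(42)
log_target <- function(x) dnorm(x, log = TRUE)   # log density (up to a constant)

n_iter <- 10000
x <- numeric(n_iter)
x[1] <- 0
for (t in 2:n_iter) {
  prop <- x[t - 1] + rnorm(1, sd = 1)            # symmetric random-walk proposal
  log_alpha <- log_target(prop) - log_target(x[t - 1])
  if (log(runif(1)) < log_alpha) {
    x[t] <- prop                                 # accept the proposal
  } else {
    x[t] <- x[t - 1]                             # reject: repeat current state
  }
}

# Summarise the posterior sample after discarding burn-in
post <- x[-(1:1000)]
post_mean <- mean(post)
cred <- quantile(post, c(0.025, 0.975))          # 95% equal-tailed interval
```

For a symmetric proposal the Hastings correction cancels, which is why only the target density appears in the acceptance ratio.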
W. M. Bolstad, Introduction to Bayesian Statistics (Wiley, 2007).
A. Gelman, J. B. Carlin, H. S. Stern, D. B. Dunson, A. Vehtari & D. B. Rubin, Bayesian Data Analysis (3rd Edition) (Chapman & Hall, 2014).
G. Casella & R. L. Berger, Statistical Inference (2nd Edition) (Duxbury, 2002).
S. M. Ross, A Course in Simulation (Macmillan, 1990).
The course will be assessed through a 2-hour written exam (70%), a mid-term test and an assessed project (15% each).
SCQF Level: 9.
Help: If you have any problems or questions regarding the course, you are encouraged to contact the lecturer.
VISION: Further information and course materials are available on VISION.