Main aims of Bayesian statistics. Parametric inference, predictive inference. Observation processes, exchangeability. Univariate parametric models. Monte Carlo approximations. The normal model. Posterior approximation with the Gibbs sampler. The multivariate normal model. Group comparison and hierarchical modeling. Linear regression. Nonconjugate priors and Metropolis-Hastings algorithms. Linear and generalized linear mixed effects models. Methods for ordinal data. Bayesian networks (time permitting).
Peter D. Hoff, A First Course in Bayesian Statistical Methods, Springer, 2009.
KNOWLEDGE: Deep understanding of Bayesian inference techniques; use of parametric models that define parameters as random variables.
EXPERTISE: Students will be trained to solve complex problems involving the treatment of missing data, the selection of a model among competitors, and the derivation of complex conditional distributions.
ACHIEVED ABILITIES AT THE END OF THE COURSE: Students will be able to develop and implement a Bayesian statistical model involving simple and hierarchical relations.
Preparatory courses: Statistical inference, Probability and mathematics for statistics
Lectures and exercise sessions.
Some knowledge of the R software is required
Type of Assessment
There will be a written examination (2/3 of the final mark); homework assignments will also be graded (1/3 of the final mark).
Law of total probability and Bayes rule. Bayesian inference: notation and the main differences between Bayesian and frequentist inference. Observation processes and different hypotheses of dependence. Gaussian processes and mixture processes; exchangeability.
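As a minimal illustration of the law of total probability and Bayes rule, the sketch below computes a posterior probability for a classic diagnostic-test setting. It is written in Python for compactness (the course itself uses R), and all numerical values are hypothetical, chosen only to make the calculation concrete.

```python
# Hypothetical diagnostic-test example; all probabilities are illustrative.
prior = 0.01   # P(disease): assumed prior prevalence
sens = 0.95    # P(+ | disease): assumed test sensitivity
spec = 0.90    # P(- | no disease): assumed test specificity

# Law of total probability: P(+) = P(+|D)P(D) + P(+|not D)P(not D)
p_pos = sens * prior + (1 - spec) * (1 - prior)

# Bayes rule: P(D | +) = P(+|D) P(D) / P(+)
posterior = sens * prior / p_pos
print(round(posterior, 4))  # small despite high sensitivity, since the prior is low
```

The point of the example is the one emphasized in lectures: the posterior combines the likelihood with the prior, so a positive result from an accurate test can still yield a low posterior probability when the prior prevalence is small.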
Binomial model, HPD regions, predictive inference, conjugate families. Noninformative priors, Jeffreys priors. Monte Carlo methods: accept-reject (AR) and importance sampling. MCMC. Multivariate normal analysis. Regression methods and variable selection; g-priors. Hierarchical models and linear hierarchical models. Bayesian generalized linear models. Bayesian networks: definitions, usage, and learning (time permitting).
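The flavor of the Monte Carlo topics above can be conveyed with the conjugate Binomial model: under a Beta(1, 1) prior the posterior is Beta in closed form, so a Monte Carlo estimate of the posterior mean can be checked exactly. The sketch below is in Python (the course uses R); the data values and sample size are illustrative assumptions.

```python
import random

random.seed(1)
y, n = 12, 20                # assumed data: 12 successes in 20 trials
a, b = 1 + y, 1 + (n - y)    # Beta posterior parameters under a Beta(1, 1) prior

# Monte Carlo approximation of E[theta | y]: average S posterior draws
S = 100_000
draws = [random.betavariate(a, b) for _ in range(S)]
mc_mean = sum(draws) / S

exact_mean = a / (a + b)     # closed-form posterior mean: 13/22
print(round(mc_mean, 3), round(exact_mean, 3))
```

The same draws also approximate posterior quantiles and HPD regions, which is why the course introduces Monte Carlo methods before moving to MCMC for models without closed-form posteriors.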