Module description
Syllabus
- Review of basic notions of probability.
- Learning of probability distributions: maximum likelihood and Bayesian learning of Gaussian distributions, conjugate priors, Gaussian mixtures, expectation-maximization approach.
- Learning of input-output relations: linear regression, evidence approximation for optimizing hyperparameters, Gaussian processes.
- Linear classification, Gaussian process classification, Laplace approximation, link to Support Vector Machines, sparsity.
- Graphical models.
- Approximate inference: variational methods, expectation-propagation, sampling methods.
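As an illustrative sketch of the first two syllabus topics (not part of the module materials), maximum-likelihood fitting of a one-dimensional Gaussian and the corresponding conjugate Bayesian update for its mean might look like the following; the function names and the toy data are assumptions for the example only:

```python
import random

def gaussian_ml_fit(xs):
    """Maximum-likelihood estimates of a 1-D Gaussian's mean and variance."""
    n = len(xs)
    mu = sum(xs) / n
    # The ML variance estimator divides by n, not n - 1.
    var = sum((x - mu) ** 2 for x in xs) / n
    return mu, var

def posterior_mean_known_var(xs, var, mu0, var0):
    """Conjugate Bayesian update for the mean of a Gaussian with known
    variance `var`, under a Gaussian prior N(mu0, var0) on the mean.
    The posterior over the mean is again Gaussian (conjugacy)."""
    n = len(xs)
    xbar = sum(xs) / n
    post_var = 1.0 / (1.0 / var0 + n / var)
    post_mu = post_var * (mu0 / var0 + n * xbar / var)
    return post_mu, post_var

random.seed(0)
data = [random.gauss(2.0, 1.0) for _ in range(500)]
mu_ml, var_ml = gaussian_ml_fit(data)
mu_post, post_var = posterior_mean_known_var(data, 1.0, 0.0, 10.0)
print(mu_ml, var_ml)
print(mu_post, post_var)
```

With a broad prior (`var0 = 10`) and 500 observations, the posterior mean is dominated by the data and lies very close to the ML estimate, while the posterior variance shrinks roughly as `var / n`.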
Prerequisites
A good working knowledge of Multivariate Calculus, Probability and Statistics II, and Linear Algebra, or equivalent.
Assessment details
2-hour written examination
Semester 1 only students will be set an alternative assessment in lieu of in-person exams in January.
Full year students will complete the standard assessment.
Educational aims & objectives
The aim of the module is to introduce key statistical techniques for learning from data, mostly within the framework of Bayesian statistics. The module will cover linear models for regression and classification as well as more advanced approaches including kernel methods, graphical models and approximate inference.
Teaching pattern
Two hours of lectures and one hour of tutorial per week throughout the term
Any problems encountered will be discussed in the tutorial sessions. You are strongly encouraged to regularly hand in your solution attempts to the tutor for feedback.
Suggested reading list
An indicative reading list is available via the Leganto system; search by module code to find the list.