SC4/SM8 Advanced Topics in Statistical Machine Learning
Term: Hilary Term 2019, Jan 14 - Mar 8
Lectures: weeks 1-6,8: Tue 3pm & Thu 4pm, LG.01; week 7: Thu 4pm & Fri 4pm, LG.01 (note the time change in week 7)
Part C / OMMS Classes: Class Sign-Up (via WebLearn)
Set 1: Tue 9am (3,5,7,TT1), LG.05; class tutor: Jean-Francois Ton; TA: Charline Le Lan
Set 2: Tue 10.30am (3,5,7,TT1), LG.05; class tutor: Jean-Francois Ton; TA: Charline Le Lan
Set 3: Tue 9am (3,5,7,TT1), LG.04; class tutor: Tomas Vaskevicius; TA: Tomas Vaskevicius
Part C Problem Sheet Deadlines: Fri noon, weeks 2,4,6 (Sheets 1-3, no hand-in for Sheet 4)
MSc Classes: Fri 10am (3,5,7,TT1), LG.01
Part C Revision Classes: Tue 2pm, TT week 3, SC4 2017 exam
Tue 2pm, TT week 5, SC4 2018 exam
Part C Consultation Sessions (with class tutors): Mon 10am, TT week 3, LG.05
Mon 10am, TT week 5, LG.05


Course Materials

The course materials will appear here before the course starts. They consist of notes, slides, and Jupyter notebooks. Notes may not be exhaustive and should be used in conjunction with the slides. All materials may be updated during the course and are thus best read on screen. Please email me any typos or corrections.

Lecture Notes


Problem Sheets


Aims and Objectives

Machine learning is widely used across many scientific and engineering disciplines to find patterns in large datasets and to devise complex models and prediction tools. This course introduces several widely used machine learning techniques and describes their underpinning statistical principles and properties. The course studies both unsupervised and supervised learning, and several advanced topics are covered in detail, including some state-of-the-art machine learning techniques. The course will also cover computational considerations of machine learning algorithms and how they scale to large datasets.


Prerequisites

A8 Probability and A9 Statistics.
Some material from this year's syllabus of SB2.2 Statistical Machine Learning (PCA and the basics of clustering, covered mainly in the first three lectures of SB2.2, also running in HT2019) will be used, but SB2.2 is not a prerequisite and background notes will be provided.


Synopsis

Review of unsupervised and supervised learning.
Duality in convex optimization and support vector machines.
Kernel methods and reproducing kernel Hilbert spaces. Representer theorem. Representation of probabilities in RKHS.
Kernel PCA. Spectral clustering. Manifold regularization.
Probabilistic and Bayesian machine learning: latent variable models, variational free energy, EM algorithm, mixtures, probabilistic PCA.
Laplace Approximation. Variational Bayes, Latent Dirichlet Allocation.
Collaborative filtering models, probabilistic matrix factorization.
Gaussian processes for regression and classification. Bayesian optimization.

Textbooks and Background Reading

Background Review Aids:




Knowledge of Python is not required for this course, but some illustrative examples in lectures may be done in Python. Students interested in further Python training are referred to the free University IT online courses.