# Part III - Modern Statistical Methods

## Lectured by R. D. Shah, Michaelmas 2017

These notes are not endorsed by the lecturer, and I have modified them (often significantly) after lectures. They are nowhere near accurate representations of what was actually lectured; in particular, all errors are almost surely mine.


# Contents

- 1 Introduction
- 2 Classical statistics
- 3 Kernel machines
- 3.1 Ridge regression
- 3.2 v-fold cross-validation
- 3.3 The kernel trick
- 3.4 Making predictions
- 3.5 Other kernel machines
- 3.6 Large-scale kernel machines
- 4 The Lasso and beyond
- 4.1 The Lasso estimator
- 4.2 Basic concentration inequalities
- 4.3 Convex analysis and optimization theory
- 4.4 Properties of Lasso solutions
- 4.5 Variable selection
- 4.6 Computation of Lasso solutions
- 4.7 Extensions of the Lasso
- 5 Graphical modelling
- 5.1 Conditional independence graphs
- 5.2 Gaussian graphical models
- 5.3 The graphical Lasso
- 5.4 Structural equation modelling
- 5.5 Causal structure learning
- 5.6 The PC algorithm
- 6 High-dimensional inference