Computational Learning Theory: 2015-2016
Lecturer |
Degrees | Schedule B2 (CS&P) — Computer Science and Philosophy; Schedule B2 — Computer Science; Schedule B2 — Mathematics and Computer Science
Term | Michaelmas Term 2015 (16 lectures)
Overview
Machine learning studies automatic methods for identifying patterns in complex data and for making accurate predictions based on past observations. From predicting which movies a customer will like to assigning credit ratings, systems that learn are becoming increasingly widespread and effective. Computational learning theory aims to develop rigorous mathematical foundations for machine learning, in order to provide guarantees about the behaviour of learning algorithms, to identify common methods underlying effective learning procedures, and to understand the inherent difficulty of learning problems. To address such issues we will bring together notions from probability theory, optimisation, online algorithms, and game theory.
Learning outcomes
On completing this course, students should:
- understand key models of supervised and unsupervised learning and be able to formulate specific learning problems in these models;
- understand a variety of learning algorithms and recognise when they are applicable.
Prerequisites
Students should have experience of reading and writing mathematical proofs. Familiarity with calculus, probability theory, and linear algebra (to the level of the undergraduate Computer Science degree) is essential.
Synopsis
- Introduction, PAC model, sample complexity for finite sets of hypotheses [3 lectures]
- The growth function, VC dimension, Rademacher complexity, lower bounds [3 lectures]
- Support vector machines, margin theory [3 lectures]
- Kernels [1 lecture]
- Online learning: Perceptron and Winnow algorithms [2 lectures] (see the Perceptron sketch after this list)
- Online convex optimisation [1 lecture]
- Weak learning, adaptive boosting, margin bounds [2 lectures]
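As a taster for the online learning lectures, the following is a minimal sketch of the Perceptron algorithm named in the list above; Python, the function name, and the toy data are illustrative assumptions rather than course material.

```python
import numpy as np

def perceptron(examples, max_passes=100):
    """Online Perceptron for labelled examples (x, y) with y in {-1, +1}.

    Repeatedly passes over the data, updating the weight vector only
    when the current hypothesis misclassifies an example.
    """
    w = np.zeros(len(examples[0][0]))
    for _ in range(max_passes):
        mistakes = 0
        for x, y in examples:
            if y * np.dot(w, x) <= 0:   # mistake: wrong sign (or no prediction yet)
                w += y * x              # standard additive Perceptron update
                mistakes += 1
        if mistakes == 0:               # a full pass with no mistakes: consistent hypothesis found
            return w
    return w

# Hypothetical toy data: two classes separable by a line through the origin.
data = [(np.array([2.0, 1.0]), +1),
        (np.array([1.0, 2.5]), +1),
        (np.array([-1.5, -1.0]), -1),
        (np.array([-2.0, -0.5]), -1)]
w = perceptron(data)
print(w, [int(np.sign(np.dot(w, x))) for x, _ in data])
```

A standard analysis (Novikoff's theorem) bounds the number of updates on linearly separable data by (R/γ)², where R bounds the norms of the examples and γ is the margin of a separating hyperplane.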
Syllabus
Learning consistent hypotheses. PAC learning: sample complexity, VC dimension, Rademacher complexity. Support vector machines, kernels, margin theory. Online learning: Perceptron, Winnow, online convex optimisation. Weak learning and the AdaBoost algorithm.
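As a concrete instance of the sample-complexity material above, a standard bound for finite hypothesis classes in the realisable PAC setting (stated here for orientation; the exact formulation used in lectures may differ) is the following: if a learner outputs any hypothesis $h \in H$ consistent with $m$ i.i.d. examples, then

$$ m \;\ge\; \frac{1}{\varepsilon}\left(\ln|H| + \ln\frac{1}{\delta}\right) $$

examples suffice to guarantee, with probability at least $1 - \delta$, that the true error of $h$ is at most $\varepsilon$.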
Reading list
Primary Text
Secondary Texts
- Michael Kearns and Umesh Vazirani. An Introduction to Computational Learning Theory, MIT Press, 1994.