TTIC 31120: Computational and Statistical Learning Theory

This is the web page for the Fall 2016 course "Computational and Statistical Learning Theory", taught at TTIC and also open to all University of Chicago students.

Tuesdays and Thursdays 10:30-11:50AM in TTIC 530
Instructor: Nati Srebro.
TA: Jialei Wang, Email: jialei@uchicago.edu
Office hours: Fridays 1:40-3:00PM, TTIC Library

Course Description

The purpose of this course is to gain a deeper understanding of machine learning by formalizing learning mathematically, studying both statistical and computational aspects of learning, and understanding how these two aspects are inseparable. The course is intended both for students who use machine learning methods and would like to understand them better so as to use them more effectively, and for students interested in the mathematical aspects of learning or who intend to rigorously study or develop learning algorithms.

We will discuss classic results and recent advances in statistical learning theory (mostly under the agnostic PAC model), touch on computational learning theory, and also explore the relationship with stochastic optimization and online regret analysis. Our emphasis will be on concept development and on obtaining a rigorous quantitative understanding of machine learning. We will also study techniques for analyzing and proving performance guarantees for learning methods.
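
As a taste of the style of guarantee we will develop: for binary classification with the 0/1 loss, writing L_D(h) for the population error of a hypothesis h and L_S(h) for its error on an i.i.d. sample S of size m (the notation used in lecture may differ), the classic VC uniform convergence bound states that for any hypothesis class H of finite VC dimension, with probability at least 1 - \delta over the draw of S,

\forall h \in H: \quad L_D(h) \;\le\; L_S(h) + O\!\left(\sqrt{\frac{\mathrm{VCdim}(H) + \log(1/\delta)}{m}}\right),

so empirical risk minimization over H attains excess error \epsilon from m = O\big((\mathrm{VCdim}(H) + \log(1/\delta))/\epsilon^2\big) samples.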

Prerequisites

Familiarity with Convex Optimization and Computational Complexity, as well as a background in Statistics, can be helpful but is not required.

Specific Topics:

We will try to touch on the following topics:

Requirements and Grading:

Recommended Texts

We will not be following a specific text, but some of the material is covered in:

Schedule and Lectures

Week of | Tuesday | Thursday
September 27th | What is Learning? | PAC Learning and VC Theory I
October 3rd | PAC Learning and VC Theory II | MDL and PAC-Bayes
October 10th | Computational Complexity of Learning | Proper vs. Improper Learning
October 17th | Agnostic Learning | Boosting and Compression Schemes
October 24th | Real-Valued Loss | Scale-Sensitive Classes
October 31st | SVMs, L1 Regularization, Boosting | Regularized Learning, Stability
November 7th | Online Learning | FTRL, OGD
November 14th | Online Dual Averaging, Mirror Descent | Online to Batch, Stochastic Optimization
November 21st | Optimistic Rate, KNN | No Class
November 28th | Neural Networks, Course Summary | No Class

Assignments:

Problem Set 1 (due October 10th)
Problem Set 2 (due October 24th)
Problem Set 3 (due November 7th)
Problem Set 4 (due December 2nd)

Sample Final Exam

Sample final exam