TTIC 31120: Computational and Statistical Learning Theory

This is the web page for the Spring 2015 course "Computational and Statistical Learning Theory", taught at TTIC and open to all University of Chicago students.

MWF 10:30-11:50am in TTIC 530 (some classes will be canceled; please see the schedule at the bottom of the page).
Instructor: Nati Srebro.
TA: Behnam Neyshabur.

Course Description

The purpose of this course is to gain a deeper understanding of machine learning by formalizing learning mathematically, studying both statistical and computational aspects of learning, and understanding how these two aspects are inseparable. The course is intended both for students who use machine learning methods and would like to understand them better in order to apply them more effectively, and for students interested in the mathematical aspects of learning or who intend to rigorously study or develop learning algorithms.

We will discuss classic results and recent advances in statistical learning theory (mostly under the agnostic PAC model), touch on computational learning theory, and also explore the relationship with stochastic optimization and online regret analysis. Our emphasis will be on concept development and on obtaining a rigorous quantitative understanding of machine learning. We will also study techniques for analyzing and proving performance guarantees for learning methods.
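
As a taste of the kind of guarantee we will prove, below is a minimal sketch of the standard agnostic PAC / VC generalization bound for binary classification, stated informally with constants omitted; here m, δ, and d denote the sample size, confidence parameter, and VC dimension, and are notation chosen for this illustration rather than fixed course notation.

```latex
% Sketch of a representative guarantee (informal; constants omitted).
% With probability at least 1 - \delta over an i.i.d. sample S of size m,
% empirical risk minimization over a hypothesis class \mathcal{H} of
% VC dimension d satisfies:
\[
  L_{\mathcal{D}}\big(\hat{h}_S\big)
  \;\le\;
  \min_{h \in \mathcal{H}} L_{\mathcal{D}}(h)
  \;+\;
  O\!\left(\sqrt{\frac{d + \log(1/\delta)}{m}}\right),
\]
% where L_{\mathcal{D}} is the population risk under the 0-1 loss and
% \hat{h}_S is the empirical risk minimizer on the sample S.
```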

Prerequisites

Familiarity with convex optimization and computational complexity, as well as a background in statistics, can be helpful but is not required.

Specific Topics

We will try to touch on:

Requirements and Grading

Recommended Texts

We will not be following a specific text, but some of the material is covered in:

Schedule and Lectures

Week of    | Monday                                    | Wednesday                                        | Friday
-----------|-------------------------------------------|--------------------------------------------------|------------------------------------------
March 30th | What is Learning?*                        | PAC Learning and VC Theory I                     | No class
April 6th  | PAC Learning and VC Theory II             | MDL and PAC-Bayes                                | No class
April 13th | No class                                  | Computational Complexity of Learning             | Proper vs. Improper Learning
April 20th | No class                                  | Hardness of Improper Learning; Agnostic Learning | Boosting; Compression Schemes
April 27th | No class                                  | Real-Valued Loss                                 | Scale-Sensitive Classes
May 4th    | No class                                  | SVMs, L1 Regularization                          | No class
May 11th   | No class                                  | L1 Margin, Stability                             | Online Learning
May 18th   | Online Gradient Descent                   | No class                                         | Strong Convexity; Online Dual Averaging
May 25th   | No class                                  | Mirror Descent; Online to Batch                  | No class
June 1st   | Stochastic Optimization; Neural Networks | Course Summary; Nearest Neighbor Classification  | No class
June 8th   | Final Exam: Tuesday, June 9, 10:30am      |                                                  |

*The shaded slides are not required material.

Assignments

Problem Set 1 (due April 15th)
Problem Set 2 (due May 1st)
Problem Set 3+4 (due June 5th)