Topics in Artificial Intelligence (Learning Theory) - Spring 2008

Course Number: CMSC 35900 (Section 1)
Day/Times: MW 2:40-4:00pm (starting April 7)
Location: TTI-C Conference Room 230
Instructors: Sham Kakade (sham@X) and Ambuj Tewari (tewari@X) [ X = tti-c.org ]

Note: Time and location differ from what is listed in the U of C catalog

Lecture Notes

Lecture      Date       Topics Covered
Lecture  1   April  1   Mistake bound model, Halving algorithm, Linear classifiers and margin
Lecture  2   April  3   Perceptron algorithm, Lower bound for L2-margin, Winnow
Lecture  3   April  7   Winnow (contd.), Online Convex Programming, Online Gradient Descent
Lecture  4   April  9   Exponentiated Gradient Descent, Applications of Online Convex Programming
Lecture  5   April 14   Proof of von Neumann's Minimax Theorem, Weak and Strong Learning, Boosting
Lecture  6   April 16   AdaBoost, L1 Margins and Weak Learning
Lecture  7   April 21   Probabilistic Setup, Loss Functions, Empirical Risk Minimization (ERM)
Lecture  8   April 23   Concentration, ERM, Compression Bounds
Lecture  9   April 28   Compression Bounds (contd.), Rademacher Averages
Lecture 10   April 30   Massart's Finite Class Lemma, Growth Function
Lecture 11   May  5     VC Dimension, Sauer's Lemma
Lecture 12   May  7     VC Dimension of Multi-layer Neural Networks, Range Queries
Lecture 13   May 12     Online to Batch Conversions
Lecture 13a  Supplementary Notes: (Exponentiated) Stochastic Gradient Descent for L1 Constrained Problems
Lecture 14   May 14     Covering Numbers and Rademacher Averages
Lecture 15   May 19     Dudley's Theorem, Pseudodimension, Fat Shattering Dimension, Packing Numbers
Lecture 16   May 21     Fat Shattering Dimension and Covering Numbers
Lecture 17   May 26     Rademacher Composition and Linear Prediction

Homeworks