Course Syllabus

Follow the link for each class to find a detailed description, suggested readings, and class slides.

Class Title Instructor(s)
Class 01 The Course at a Glance TP
Class 02 Statistical Learning Setting LR
Class 03 Regularized Least Squares LR
Class 04 Features and Kernels LR
Class 05 Logistic Regression and Support Vector Machines LR
Class 06 Learning with Stochastic Gradients AR
Class 07 Iterative Regularization via Early Stopping LR
Class 08 Large Scale Learning by Sketching LR
Class 09 Sparsity Based Regularization LR
Class 10 Neural Networks: Introduction, Backpropagation LR
Class 11 Neural Networks: Tips, Tricks and Software AB
Class 12 Generative Adversarial Networks PI
Class 13 Statistical Learning I AR
Class 14 Statistical Learning II AR
Class 15 ERM, Uniform Convergence AR
Class 16 Sample Complexity via Rademacher Averages AR
Class 17 Margin Analysis for Classification AR
Class 18 Local Methods AR
Class 19 Sample Compression, Stability AR
Class 20 Privacy and Information-Theoretic Stability AR
Class 21 Deep Learning Theory: Approximation TP
Class 22 Online Prediction AR
Class 23 Sample Complexity of Neural Networks AR
Class 24 Deep Learning Theory: Optimization TP
Class 25 Deep Learning Theory: Generalization I TP
Class 26 Deep Learning Theory: Generalization II TP
Class 27 Project reports due


Reading List

Notes covering the classes are provided as independent chapters of a book currently in draft form. Additional information is given in the slides associated with each class (where applicable). The books and papers listed below are useful general references, especially from the theoretical viewpoint. A list of additional suggested readings is also provided separately for each class.

Book (draft)

  • L. Rosasco and T. Poggio, Machine Learning: a Regularization Approach, MIT-9.520 Lecture Notes, Manuscript, Dec. 2017 (provided).

Primary References

Resources and links