### Overview

The goal of this class is to introduce you to the theoretical foundations of machine learning, including learning models and generalization bounds. We will cover the following topics:
- The PAC model and generalization bounds
- Uniform convergence bounds, VC dimension
- Online learning, the mistake bound model, Perceptron and Winnow
- Learning with expert advice, multi-armed bandits
- Weak and strong learning, boosting
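As a taste of the online-learning material above, here is a minimal sketch of the Perceptron algorithm; the data and function names are illustrative, not part of the course materials. The algorithm updates its weight vector only when it makes a mistake, and the mistake bound model asks how many such updates can occur.

```python
# Illustrative Perceptron sketch: updates only on mistakes.
# On linearly separable data the loop terminates, and the number of
# mistakes is bounded (the classic Perceptron mistake bound).

def perceptron(examples):
    """examples: list of (x, y) with x a tuple of floats and y in {-1, +1}."""
    w = [0.0] * len(examples[0][0])
    mistakes = 0
    changed = True
    while changed:                       # repeat until a full clean pass
        changed = False
        for x, y in examples:
            score = sum(wi * xi for wi, xi in zip(w, x))
            if y * score <= 0:           # mistake: move w toward y * x
                w = [wi + y * xi for wi, xi in zip(w, x)]
                mistakes += 1
                changed = True
    return w, mistakes

# Toy separable data: the label is the sign of the first coordinate.
data = [((1.0, 0.5), 1), ((2.0, -0.3), 1),
        ((-1.0, 0.2), -1), ((-1.5, -1.0), -1)]
w, m = perceptron(data)
```

After convergence, `w` classifies every training example correctly, and `m` counts the total number of updates made along the way.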

### Prerequisite

A prerequisite for this class is basic knowledge of probability and some previous exposure to machine learning.

### Textbook

There is no textbook for this class. Some suggested references are:
- M. Kearns and U. Vazirani, *An Introduction to Computational Learning Theory*, MIT Press, 1994.
- L. Devroye, L. Györfi, and G. Lugosi, *A Probabilistic Theory of Pattern Recognition*, Springer, 1996.

I may hand out specific readings in class. The readings will be based either on my own notes or on chapters from the upcoming book on learning theory by Shai Ben-David and Shai Shalev-Shwartz.