I. Prediction problems
A taxonomy of prediction tasks
II. Generative models
The generative approach to classification
Gaussian generative models
III. Linear prediction
Perceptron and support vector machines
Multiclass classification and structured output prediction
IV. Combining simple classifiers
Bagging and random forests
V. Representation learning
Linear projections: PCA and SVD
Embeddings and manifold learning
VI. Deep learning
Chris: Mon 4-5p in Pepper Canyon 122
Shradha: Thu 7-8p in Center 212
1. Ability to write simple programs in Python: functions, control structures, string handling, arrays, and dictionaries. (A short self-check sketch follows this list.)
2. Familiarity with basic probability, at the level of CSE 21 or CSE 103.
3. Familiarity with basic linear algebra, at the level of Math 18 or Math 20F.
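As a rough self-check for prerequisite 1, here is a minimal sketch of the kind of Python fluency assumed. The word-counting task and the names in it are purely illustrative, not a course assignment.

    # Illustrative self-check: functions, control flow, strings, and dictionaries.
    def word_counts(text):
        """Return a dictionary mapping each lowercase word to its frequency."""
        counts = {}
        for word in text.lower().split():
            word = word.strip(".,;:!?")   # basic string handling
            if word:                      # control structure: skip empty tokens
                counts[word] = counts.get(word, 0) + 1
        return counts

    sample = "The quick brown fox jumps over the lazy dog. The dog sleeps."
    counts = word_counts(sample)
    # Sort words by decreasing frequency, breaking ties alphabetically.
    for word, n in sorted(counts.items(), key=lambda kv: (-kv[1], kv[0])):
        print(word, n)

If reading and writing code like this takes real effort, plan to brush up on Python before the first programming exercise.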
1. Programming exercises should be done in Python. I recommend using Jupyter notebooks.
2. There is no required text for the course, but here are some useful references. The first is available as an e-book through the library website; the rest are on reserve at Geisel:
Trevor Hastie, Robert Tibshirani, and Jerome Friedman, The elements of statistical learning (2nd edition).
Gilbert Strang, Linear algebra and its applications.
Kevin Murphy, Machine learning: a probabilistic perspective.
Richard Duda, Peter Hart, and David Stork, Pattern classification (2nd edition).
There will be weekly homeworks, to be turned in (typed and in PDF format) on Gradescope. These will be a mix of mathematical exercises and programming projects.
No late homeworks will be accepted; however, the lowest homework score will be dropped.
There will be five in-class quizzes.
Homeworks: 50% (lowest score will be dropped)
Quizzes: 10% each (five quizzes, 50% total)