CSE 291: Bayesian methods
Time:
TuTh 11-12:30 in CSE 2154
Instructor:
Sanjoy Dasgupta
Office hours: Tue 2-4 in CSE 4138
Administrative details
Course requirements: There will be periodic homework assignments as well as a final project.
The following textbooks contain a lot of the material we'll be covering:
- Gelman, Carlin, Stern, Rubin. Bayesian Data Analysis.
- Murphy. Machine Learning: A Probabilistic Perspective.
- Barber. Bayesian Reasoning and Machine Learning.
Lecture schedule, homework assignments, and optional accompanying readings
- Course outline (Jan 8)
- Entropy, exponential families, and maximum likelihood (Jan 10,15,17,22)
- A nice introduction to maximum entropy modeling is:
Berger, Della Pietra, Della Pietra. A maximum entropy approach to natural language processing.
- For entropy and asymptotic equipartition, consult Chapters 2 and 3 of the following fantastic text:
Cover and Thomas. Elements of Information Theory.
- The axiomatic formulation of entropy that I presented is one of many, but my personal favorite. It is from:
Aczél, Forte, Ng. Why Shannon and Hartley entropies are "natural". (Find it in JSTOR, or email me)
- Here's the paper for the species distribution problem we discussed:
Phillips, Dudík, Schapire. A maximum entropy approach to species distribution modeling.
Homework 1, due 1/31.
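As a toy illustration of the duality between maximum entropy and exponential-family maximum likelihood covered in these lectures, here is a small sketch (not from the course materials; the outcome set, target mean, step size, and iteration count are all invented for illustration). It finds the maximum-entropy distribution over a finite set with a given mean by gradient ascent on the single natural parameter of the family p(x) ∝ exp(theta·x):

```python
import math

# Toy problem: the maximum-entropy distribution over outcomes 0..5 with
# mean 3.5 lies in the exponential family p(x) proportional to exp(theta * x).
# We fit theta by gradient ascent on the dual (log-likelihood) objective,
# whose gradient is the gap between the target mean and the model mean.
outcomes = list(range(6))
target_mean = 3.5

theta = 0.0
for _ in range(2000):
    weights = [math.exp(theta * x) for x in outcomes]
    Z = sum(weights)
    model_mean = sum(x * w for x, w in zip(outcomes, weights)) / Z
    theta += 0.1 * (target_mean - model_mean)  # moment-matching update

# Recover the fitted distribution and check the moment constraint.
weights = [math.exp(theta * x) for x in outcomes]
Z = sum(weights)
probs = [w / Z for w in weights]
fitted_mean = sum(x * p for x, p in zip(outcomes, probs))
```

At convergence the fitted mean matches the target, which is exactly the moment-matching characterization of maximum likelihood in an exponential family.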
- Bayesian inference for exponential families (Jan 24,29,31)
Homework 2, due 2/12.
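A minimal sketch of conjugate Bayesian updating in an exponential family, the simplest case covered in these lectures: a Beta prior for a Bernoulli parameter. The hyperparameters and data below are made up for illustration:

```python
# Beta-Bernoulli conjugacy: a Beta(a0, b0) prior on the heads probability,
# updated on a sequence of coin flips, yields a Beta posterior whose
# parameters just add the sufficient statistics (#heads, #tails).
a0, b0 = 1.0, 1.0          # uniform Beta(1,1) prior
data = [1, 0, 1, 1, 0, 1]  # six observed flips (toy data)

heads = sum(data)
tails = len(data) - heads
a_post = a0 + heads        # posterior: Beta(a0 + heads, b0 + tails)
b_post = b0 + tails

posterior_mean = a_post / (a_post + b_post)
```

The same pattern, prior hyperparameters plus sufficient statistics, is what conjugacy means for any exponential family.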
- Gaussian models: conditioning, linear regression, kernel trick, Bayesian model selection, Gaussian processes (Feb 5,7,19)
- A good basic reference on Gaussian processes is the following book, available online:
Rasmussen and Williams. Gaussian processes for Machine Learning.
- For the mathematically inclined:
Adler and Taylor. Random Fields and Geometry.
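A small sketch of Gaussian process regression in the spirit of Chapter 2 of Rasmussen and Williams: condition a GP prior with an RBF kernel on a handful of observations and read off the posterior mean and variance at a test point. The data, lengthscale, and noise level here are toy choices, not from the course:

```python
import numpy as np

def rbf(a, b, ell=1.0):
    """Squared-exponential kernel on 1-d inputs."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

X = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])  # training inputs
y = np.sin(X)                              # training targets (noise-free)
noise = 1e-4                               # small jitter for stability

K = rbf(X, X) + noise * np.eye(len(X))     # Gram matrix of the training set
Xs = np.array([0.5])                       # test input
Ks = rbf(Xs, X)                            # test-vs-train covariances

# Posterior of the GP conditioned on (X, y), via the Gaussian
# conditioning formula: mean = Ks K^{-1} y, cov = Kss - Ks K^{-1} Ks^T.
alpha = np.linalg.solve(K, y)
post_mean = Ks @ alpha
post_var = rbf(Xs, Xs) - Ks @ np.linalg.solve(K, Ks.T)
```

The posterior mean interpolates the training points and approaches sin(0.5) at the test input; the posterior variance shrinks near observed inputs.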
- Markov random fields: the Hammersley-Clifford theorem, Gibbs sampling, and MAP inference (Feb 21,26,28)
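To make the Gibbs sampling step concrete, here is a toy sampler for a small Ising-model MRF (grid size, coupling strength, and sweep count are arbitrary choices for illustration). Each spin is resampled from its conditional given its neighbors, which by the Hammersley-Clifford factorization depends only on the adjacent cliques:

```python
import math
import random

random.seed(0)
n, J = 4, 0.5  # 4x4 grid, ferromagnetic coupling (toy values)
spins = [[random.choice([-1, 1]) for _ in range(n)] for _ in range(n)]

def neighbor_sum(i, j):
    """Sum of the four grid neighbors of site (i, j), with free boundaries."""
    s = 0
    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        ni, nj = i + di, j + dj
        if 0 <= ni < n and 0 <= nj < n:
            s += spins[ni][nj]
    return s

for sweep in range(100):
    for i in range(n):
        for j in range(n):
            # Conditional from the local Markov property:
            # P(s_ij = +1 | neighbors) = 1 / (1 + exp(-2 J * neighbor_sum)).
            p_up = 1.0 / (1.0 + math.exp(-2.0 * J * neighbor_sum(i, j)))
            spins[i][j] = 1 if random.random() < p_up else -1
```

After many sweeps the grid configurations are (approximately) draws from the Ising distribution; the same sweep structure applies to any pairwise MRF.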
- Mixture models and Dirichlet processes (Mar 5,7)
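The Dirichlet process's clustering behavior can be previewed through its Chinese restaurant process representation. Here is a short sketch (the concentration parameter and number of customers are arbitrary): each new customer joins an existing table with probability proportional to its size, or opens a new one with probability proportional to alpha:

```python
import random

random.seed(1)
alpha = 1.0        # concentration parameter (toy value)
table_counts = []  # number of customers at each occupied table

for customer in range(50):
    total = customer  # customers already seated
    r = random.random() * (total + alpha)
    if r < alpha or not table_counts:
        table_counts.append(1)  # open a new table, prob. alpha / (total + alpha)
    else:
        # join table t with probability table_counts[t] / (total + alpha)
        r -= alpha
        for t in range(len(table_counts)):
            if r < table_counts[t]:
                table_counts[t] += 1
                break
            r -= table_counts[t]
```

The induced partition of customers into tables is exactly the partition structure a Dirichlet process mixture places over data points; the number of occupied tables grows roughly like alpha·log(n).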
- Topic models (Mar 12,14)
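As a preview of topic models, here is a toy sketch of the LDA-style generative process: pick a topic for each word position, then draw the word from that topic's distribution. The two topics, the tiny vocabulary, and the per-document topic weight are invented for illustration (a full LDA model would draw the topic mixture from a Dirichlet):

```python
import random

random.seed(2)

# Two hand-made topics over a toy vocabulary (word -> probability).
topics = [
    {"ball": 0.5, "game": 0.4, "vote": 0.1},  # a "sports" topic
    {"vote": 0.6, "law": 0.3, "game": 0.1},   # a "politics" topic
]

def sample(dist):
    """Draw one item from a dict of probabilities summing to 1."""
    r = random.random()
    for item, p in dist.items():
        if r < p:
            return item
        r -= p
    return item  # guard against floating-point rounding

def generate_document(theta, length=10):
    """theta = probability of topic 0 for this document (simplified mixture)."""
    words = []
    for _ in range(length):
        k = 0 if random.random() < theta else 1  # choose a topic per token
        words.append(sample(topics[k]))          # then a word from that topic
    return words

doc = generate_document(theta=0.8)  # a mostly-"sports" document
```

Inference in a topic model runs this story in reverse: given only documents, recover the topics and each document's mixture, e.g. by Gibbs sampling or variational methods.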
Projects are due Monday March 18.