**Syllabus**

1. *Basic statistics* (1 week)

Common distributions, conditional probability, frequentist versus Bayesian reasoning, maximum likelihood, Bayesian updating, conjugate priors.
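
As a taste of this material, conjugate-prior updating admits a very short implementation. The following sketch (in Python for illustration; the course software is Matlab) updates a Beta prior on a coin's heads-probability after observing flips; the function name and dataset are invented for the example:

```python
# Bayesian updating with a conjugate prior: a Beta(a, b) prior on a
# coin's heads-probability remains a Beta distribution after observing
# Bernoulli flips -- only the counts change.
def update_beta(a, b, flips):
    """Return the posterior (a, b) after observing flips (1 = heads, 0 = tails)."""
    heads = sum(flips)
    tails = len(flips) - heads
    return a + heads, b + tails

# Start from a uniform prior Beta(1, 1) and observe 7 heads, 3 tails.
a, b = update_beta(1, 1, [1, 1, 1, 0, 1, 1, 0, 1, 1, 0])
posterior_mean = a / (a + b)  # Beta(8, 4) has mean 8/12 = 2/3
```

Because the posterior stays in the Beta family, further batches of flips can be folded in by calling `update_beta` again on the returned parameters.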

2. *More complex models* (2 weeks)

Product distributions, the multivariate Gaussian, mixture models, EM, principal component analysis, factor analysis, hidden Markov models.
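
To give one concrete example from this unit: principal component analysis reduces to an eigendecomposition of the sample covariance. A minimal sketch (Python with NumPy for illustration; the course itself uses Matlab, and the function name and toy data are invented):

```python
import numpy as np

def pca(X, k):
    """Project the rows of X onto the top-k principal components.

    Returns (scores, components): the k-dimensional projections and the
    d-by-k matrix whose columns are the leading eigenvectors of the
    sample covariance.
    """
    Xc = X - X.mean(axis=0)                 # center the data
    cov = Xc.T @ Xc / (len(X) - 1)          # sample covariance matrix
    vals, vecs = np.linalg.eigh(cov)        # eigenvalues in ascending order
    top = vecs[:, ::-1][:, :k]              # reorder to take the top k
    return Xc @ top, top

# Toy data lying almost on the line y = 2x: one component captures
# nearly all of the variance.
X = np.array([[1.0, 2.0], [2.0, 4.0], [3.0, 6.1], [4.0, 8.0]])
scores, comps = pca(X, 1)
```

Reconstructing the centered data as `scores @ comps.T` shows how little is lost by keeping a single component here.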

3. *Probabilistic networks* (2 weeks)

Conditional independence, semantics of directed and undirected models, equivalence to Markov random fields and Gibbs distributions, converting between models.

4. *Inference* (2 weeks)

Types of queries, basic complexity results, variable elimination, belief propagation, junction tree, variational methods, MCMC methods.

5. *Learning parameters* (2 weeks)

Entropy, axiomatic formulation and AEP, exponential families, maximum entropy principle, information projection, iterative proportional fitting, alternating minimization.

6. *Learning structure* (1 week)

Chow-Liu, basic complexity results, structural EM.
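
The Chow-Liu algorithm itself is short: estimate pairwise mutual information from data, then take a maximum-weight spanning tree. The sketch below (Python for illustration rather than the course's Matlab; function names and the toy dataset are invented) uses Kruskal's algorithm with a union-find structure:

```python
import math
from itertools import combinations

def mutual_information(x, y):
    """Empirical mutual information (in nats) between two discrete samples."""
    n = len(x)
    pxy, px, py = {}, {}, {}
    for a, b in zip(x, y):
        pxy[(a, b)] = pxy.get((a, b), 0) + 1
        px[a] = px.get(a, 0) + 1
        py[b] = py.get(b, 0) + 1
    # I(X;Y) = sum_{a,b} p(a,b) log[ p(a,b) / (p(a) p(b)) ]
    return sum((c / n) * math.log(c * n / (px[a] * py[b]))
               for (a, b), c in pxy.items())

def chow_liu_tree(data):
    """Maximum-weight spanning tree under pairwise mutual information.

    data is a list of samples, each a tuple of discrete values; the
    returned list of (i, j) pairs gives the tree's edges.
    """
    d = len(data[0])
    cols = list(zip(*data))
    weights = {(i, j): mutual_information(cols[i], cols[j])
               for i, j in combinations(range(d), 2)}
    parent = list(range(d))
    def find(u):                       # union-find with path halving
        while parent[u] != u:
            parent[u] = parent[parent[u]]
            u = parent[u]
        return u
    edges = []
    for (i, j), _ in sorted(weights.items(), key=lambda kv: -kv[1]):
        ri, rj = find(i), find(j)
        if ri != rj:                   # greedily add edges that join components
            parent[ri] = rj
            edges.append((i, j))
    return edges

# Toy dataset: variable 1 copies variable 0, variable 2 is unrelated,
# so the strongest edge in the learned tree is (0, 1).
data = [(0, 0, 1), (1, 1, 0), (0, 0, 0), (1, 1, 1), (0, 0, 1), (1, 1, 0)]
tree = chow_liu_tree(data)
```

On d variables the tree always has d - 1 edges; the content of the algorithm is which d - 1 edges survive the greedy selection.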

**Course materials**

1. You will need access to MATLAB.

2. Useful supplementary material:

Stuart Russell and Peter Norvig, *Artificial Intelligence: A Modern Approach* (second edition).

Thomas Cover and Joy Thomas, *Elements of Information Theory*.

Judea Pearl, *Probabilistic Reasoning in Intelligent Systems*.

Richard Duda, Peter Hart, and David Stork, *Pattern Classification*.

**Prerequisites**

An undergraduate-level background in basic probability, linear algebra, algorithms, and programming is assumed.

**Workload**

The only requirement will be a weekly homework assignment. Late homework may not be graded.

This class can be taken for 1, 2, or 4 units; however, the requirements and grading will be the same in all cases.