CSE 250A. Principles of Artificial Intelligence: Probabilistic Reasoning and Learning
There will be a quiz in class every Tuesday starting on
October 9, 2012. The last two quizzes will be on
Thursdays, on November 29 and December 6. In total there
will be 9 quizzes. The lowest quiz grade will be
discarded, so one quiz may be missed without penalty.
There will be eight homework assignments. The first six
will be handed out on Thursdays, starting on October 4,
and will be due back at the start of lecture the following
Thursday. Quizzes and assignments will both be done in
pairs; please pay attention to the detailed instructions.
Please ask questions using Piazza.
Methods based on probability theory for reasoning and learning under uncertainty. Topics will include directed and undirected probabilistic graphical models, exact and approximate inference, latent variables, expectation-maximization, hidden Markov models, Markov decision processes, applications to vision, robotics, speech, and/or text.
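For a flavor of the directed graphical models listed above, here is a minimal sketch of exact inference by enumeration in the three-node burglary/earthquake/alarm network (the earthquake scenario appears in the lecture schedule below). All probability values are invented for illustration and are not course material.

```python
# Inference by enumeration in a tiny directed graphical model (Bayesian
# network): the burglary/earthquake/alarm example.  All numbers below
# are toy values invented for illustration.

from itertools import product

# Prior probabilities P(B=1) and P(E=1).
P_B = 0.001
P_E = 0.002

# Conditional probability table P(A=1 | B, E).
P_A = {(1, 1): 0.95, (1, 0): 0.94, (0, 1): 0.29, (0, 0): 0.001}

def joint(b, e, a):
    """P(B=b, E=e, A=a) from the factored form P(B) P(E) P(A|B,E)."""
    pb = P_B if b else 1 - P_B
    pe = P_E if e else 1 - P_E
    pa = P_A[(b, e)] if a else 1 - P_A[(b, e)]
    return pb * pe * pa

def posterior_burglary_given_alarm():
    """P(B=1 | A=1), summing the joint over the hidden variable E."""
    num = sum(joint(1, e, 1) for e in (0, 1))
    den = sum(joint(b, e, 1) for b, e in product((0, 1), repeat=2))
    return num / den

print(posterior_burglary_given_alarm())
```

Enumeration like this is exponential in the number of hidden variables; the polytree and cutset-conditioning algorithms in the schedule below exist precisely to avoid that blow-up.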
The course is aimed primarily at first-year graduate students in mathematics, science, and engineering. Prerequisites are elementary probability, multivariable calculus, linear algebra, and basic programming ability in a high-level language such as C, Java, R, or Matlab. Programming assignments are completed in the language of the student's choice.
CSE 150 covers some of the same material as CSE 250A, but
at a slower pace and a less advanced mathematical level. The
homework assignments in CSE 250A are longer and more challenging.
CSE 250B is at the same level as CSE 250A, but has different
content and style. Students may take either or both of 250A
and 250B, in any order.
Lecture notes will be linked to the table below as PDF
files. The notes for one day may be in the PDF file for an
earlier day.
- Overview of the course, intro to Bayesian networks
- Laws of probability theory, Bayesian network for the earthquake scenario
- First homework assignment
- October 8 section notes
- Simpson's paradox, explaining away, formal definition of a Bayesian network
- Conditional probability tables, logistic regression, d-separation
- Second homework assignment
- October 15 section notes
- Independence as absence of information flow, examples of d-separation
- Algorithm for computing p(X|E) in polytree networks
- October 22 section notes coming soon
- Merging nodes and cutset conditioning for inference in loopy networks
- October 24 section notes on inference via stochastic sampling
- Maximum likelihood (ML); ML learning of parameters for a Bayesian network
- October 26 section notes
- Markov models of sentences, linear regression viewed as a Bayesian network
- EM for Bayesian networks; context-based language model
- 21st century data analysis and the election; EM for context-based language models and for mixture models
- Section notes on EM and mixture models for Friday November 9 and Monday November 12
- Forward algorithm and Viterbi
- Assignment due on November 27
- Section notes on HMM algorithms
- EM training of HMMs; linear dynamical systems
- Thanksgiving: no class
- Last assignment
- Policy iteration, restricted linear
- Section notes on MDPs and RL for Nov.
- Approximate policy evaluation
- Least-squares policy iteration
- Wednesday from 11:30am to 2:30pm: Final exam
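Several of the topics above reduce to compact algorithms. For example, the Viterbi algorithm from the HMM lectures finds the most likely hidden state sequence by dynamic programming. A minimal pure-Python sketch, with a toy two-state model whose parameters are invented for illustration (not course material):

```python
import math

def viterbi(pi, A, B, obs):
    """Most likely hidden state path in an HMM.
    pi: initial state probabilities; A[r][s]: transition r -> s;
    B[s][o]: probability of emitting observation o from state s;
    obs: list of observation indices."""
    S = len(pi)
    # delta[s]: best log-probability of any path ending in state s so far.
    delta = [math.log(pi[s]) + math.log(B[s][obs[0]]) for s in range(S)]
    backs = []  # back-pointers, one list per time step after the first
    for o in obs[1:]:
        back, new = [], []
        for s in range(S):
            r = max(range(S), key=lambda r: delta[r] + math.log(A[r][s]))
            back.append(r)
            new.append(delta[r] + math.log(A[r][s]) + math.log(B[s][o]))
        backs.append(back)
        delta = new
    # Trace the best path backward from the best final state.
    path = [max(range(S), key=lambda s: delta[s])]
    for back in reversed(backs):
        path.append(back[path[-1]])
    return path[::-1]

# Toy weather model: states 0 = rainy, 1 = sunny;
# observations 0 = umbrella, 1 = no umbrella.
pi = [0.5, 0.5]
A = [[0.7, 0.3], [0.3, 0.7]]
B = [[0.9, 0.1], [0.2, 0.8]]
print(viterbi(pi, A, B, [0, 0, 1]))
```

Working in log-probabilities, as above, avoids numerical underflow on long observation sequences; the forward algorithm has the same structure with the max replaced by a sum.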