The specific topics discussed in CSE 250B will include, not necessarily in this order:
January 10: Maximum likelihood (ML), maximum log conditional likelihood
January 12: Logistic regression model, partial derivatives
January 17: Convex minimization, stochastic gradient (SG) in general
January 19: SG algorithm, cross-validation
January 24: Log-linear models, feature functions
January 26: Conditional random fields (CRFs)
January 31: Viterbi algorithm for CRFs
February 2: Gradients for CRFs
February 7: Forward and backward vectors
February 9: Gibbs sampling and contrastive divergence for CRFs
February 14: Text mining, bag-of-words representation, multinomial urn process
February 16: Multinomial and Dirichlet distributions, latent Dirichlet allocation (LDA) generative process
February 21: LDA training via Gibbs sampling
February 23: Probability equation for Gibbs sampling
February 28: Representing meaning, recursive autoencoders (RAEs) for sentences
March 6: RAEs and greedy search for tree structure
March 8:
March 13:
March 15:
March 22: Thursday at 11:30am: Final exam and last project report due.
There is no a priori correspondence between letter grades and numerical scores on the assignments or on the exam. You can evaluate your performance in the class by comparing your scores with the means and standard deviations, which will be announced. However, there is also no fixed correspondence between letter grades and standard deviations above or below the mean. If all students do well in absolute terms, then all students will get a good grade.
You should not drop CSE 250B just because you are unhappy with the score that you receive on a project or quiz. Instead, you should make an appointment with the TA or the instructor to discuss how you can do better on subsequent projects and quizzes.
Most recently updated on March 9, 2012 by Charles Elkan, elkan@cs.ucsd.edu.