10-601 Topic Models
This is a lecture used in the [[Syllabus for Machine Learning 10-601B in Spring 2016]].
Poll: https://piazza.com/class/ij382zqa2572hc
=== Slides ===
=== Readings ===
- Murphy 27.1-27.3
- LDA is not covered in Mitchell. There's a nice overview paper on LDA by David Blei. It is also covered in Murphy ch 27.3 (don't read 27.3.6) and 27.4.
- Here's the code I discussed in class and some sample data.
=== Summary ===
You should know:
- what Gibbs sampling is, and how it can be used for inference in a directed graphical model (see the first sketch after this list).
- which graphical models are associated with supervised naive Bayes, unsupervised naive Bayes, PLSI, and LDA.
- the relationship between PLSI and matrix factorization (see the second sketch below).
- how the posterior distribution in a Bayesian model can be used for dimension reduction.
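For the Gibbs sampling point, here is a minimal sketch of collapsed Gibbs sampling for LDA, written from scratch in numpy. This is an illustration, not the code linked above: the corpus format (each document as a list of integer word ids), the function name gibbs_lda, and the default hyperparameters alpha and beta are all assumptions.

<pre>
import numpy as np

def gibbs_lda(docs, V, K, alpha=0.1, beta=0.01, iters=200, seed=0):
    """Collapsed Gibbs sampling for LDA (illustrative sketch).

    docs: list of documents, each a list of word ids in 0..V-1
    V: vocabulary size; K: number of topics
    """
    rng = np.random.default_rng(seed)
    D = len(docs)
    n_dk = np.zeros((D, K))   # topic counts per document
    n_kw = np.zeros((K, V))   # word counts per topic
    n_k = np.zeros(K)         # total tokens per topic
    z = []                    # current topic assignment of every token
    # random initialization of topic assignments
    for d, doc in enumerate(docs):
        zd = rng.integers(K, size=len(doc))
        z.append(zd)
        for w, k in zip(doc, zd):
            n_dk[d, k] += 1; n_kw[k, w] += 1; n_k[k] += 1
    # repeatedly resample each token's topic from its full conditional:
    # p(z=k | rest) is proportional to (n_dk + alpha) * (n_kw + beta) / (n_k + V*beta)
    for _ in range(iters):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                k = z[d][i]
                n_dk[d, k] -= 1; n_kw[k, w] -= 1; n_k[k] -= 1
                p = (n_dk[d] + alpha) * (n_kw[:, w] + beta) / (n_k + V * beta)
                k = rng.choice(K, p=p / p.sum())
                z[d][i] = k
                n_dk[d, k] += 1; n_kw[k, w] += 1; n_k[k] += 1
    # posterior mean of the document-topic proportions
    theta = (n_dk + alpha) / (n_dk.sum(1, keepdims=True) + K * alpha)
    return theta, z
</pre>

After sampling, each row of theta is a K-dimensional summary of one document; this is the sense in which the posterior of a Bayesian model gives a dimension reduction (the last point above).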
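For the PLSI/matrix-factorization point, the sketch below runs the standard PLSI EM updates, which coincide with the multiplicative updates for NMF under KL divergence: the term-document matrix is factored as p(w|d) ≈ Σ_z p(z|d) p(w|z). The input format (a D x V count matrix N) and the names theta and phi are assumptions made for illustration.

<pre>
import numpy as np

def plsi(N, K, iters=100, seed=0):
    """PLSI via EM (illustrative sketch). N is a D x V count matrix."""
    rng = np.random.default_rng(seed)
    D, V = N.shape
    theta = rng.random((D, K)); theta /= theta.sum(1, keepdims=True)  # p(z|d)
    phi = rng.random((K, V)); phi /= phi.sum(1, keepdims=True)        # p(w|z)
    for _ in range(iters):
        # E and M steps folded together; the responsibilities
        # p(z|d,w) are proportional to theta[d,z] * phi[z,w]
        pred = theta @ phi                       # model's p(w|d), shape D x V
        R = N / np.maximum(pred, 1e-12)          # ratio of data to model
        theta_new = theta * (R @ phi.T)          # expected counts for p(z|d)
        phi_new = phi * (theta.T @ R)            # expected counts for p(w|z)
        theta = theta_new / theta_new.sum(1, keepdims=True)
        phi = phi_new / phi_new.sum(1, keepdims=True)
    return theta, phi
</pre>

Row-normalizing N and comparing it to theta @ phi makes the factorization view concrete: theta is one nonnegative factor and phi the other, both constrained to be row-stochastic.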