10-601 Topic Models

This is a lecture used in the Syllabus for Machine Learning 10-601B in Spring 2016.
 
=== Slides ===
 
* [http://www.cs.cmu.edu/~wcohen/10-601/lda+sbm.ppt Slides in PowerPoint], [http://www.cs.cmu.edu/~wcohen/10-601/lda+sbm.pdf Slides in PDF].
* [http://www.cs.cmu.edu/~wcohen/10-601/lda.pptx Slides in PowerPoint], [http://www.cs.cmu.edu/~wcohen/10-601/lda.pdf Slides in PDF].
  
 
=== Readings ===
 

=== Summary ===

You should know:

* what Gibbs sampling is, and how it can be used for inference in a directed graphical model (see the code sketch after this list).
* which graphical models are associated with supervised naive Bayes, unsupervised naive Bayes, PLSI, and LDA.
* the relationship between PLSI and matrix factorization (see the note after this list).
* how the posterior distribution in a Bayesian model can be used for dimension reduction.
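
The first point above lends itself to a concrete example. Below is a minimal sketch of collapsed Gibbs sampling for LDA in Python/NumPy. It is not the implementation from the lecture; the function name, toy corpus, and hyperparameter values (K, alpha, beta, number of iterations) are illustrative assumptions.

<pre>
import numpy as np

def lda_gibbs(docs, vocab_size, K=2, alpha=0.1, beta=0.01, iters=200, seed=0):
    """docs: list of documents, each a list of word ids in [0, vocab_size)."""
    rng = np.random.default_rng(seed)
    D = len(docs)
    # Count tables: document-topic counts, topic-word counts, topic totals.
    n_dk = np.zeros((D, K))
    n_kw = np.zeros((K, vocab_size))
    n_k = np.zeros(K)
    z = []  # current topic assignment of every token

    # Initialize each token with a random topic and fill in the counts.
    for d, doc in enumerate(docs):
        z_d = rng.integers(K, size=len(doc))
        z.append(z_d)
        for w, k in zip(doc, z_d):
            n_dk[d, k] += 1
            n_kw[k, w] += 1
            n_k[k] += 1

    for _ in range(iters):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                k = z[d][i]
                # Remove this token's current assignment from the counts.
                n_dk[d, k] -= 1; n_kw[k, w] -= 1; n_k[k] -= 1
                # Collapsed conditional: p(z_i = k | z_-i, w) is proportional to
                # (n_dk + alpha) * (n_kw + beta) / (n_k + V * beta).
                p = (n_dk[d] + alpha) * (n_kw[:, w] + beta) / (n_k + vocab_size * beta)
                k = rng.choice(K, p=p / p.sum())
                # Add the new assignment back.
                z[d][i] = k
                n_dk[d, k] += 1; n_kw[k, w] += 1; n_k[k] += 1

    # Posterior mean estimates of topic-word (phi) and document-topic (theta) distributions.
    phi = (n_kw + beta) / (n_kw.sum(axis=1, keepdims=True) + vocab_size * beta)
    theta = (n_dk + alpha) / (n_dk.sum(axis=1, keepdims=True) + K * alpha)
    return phi, theta

# Tiny toy corpus over a 4-word vocabulary: two documents use words {0,1},
# two use words {2,3}, so two clear topics should emerge.
docs = [[0, 0, 1, 1], [0, 1, 1, 0], [2, 3, 3, 2], [3, 2, 2, 3]]
phi, theta = lda_gibbs(docs, vocab_size=4)
print(np.round(theta, 2))
</pre>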
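On the PLSI/matrix factorization point, one standard way to see the relationship is the following. PLSI models the probability of word <math>w</math> in document <math>d</math> as

<math>P(w \mid d) = \sum_{k=1}^{K} P(w \mid z=k)\, P(z=k \mid d),</math>

so the matrix of document-word probabilities is expressed as a product of a nonnegative topics-to-words matrix and a nonnegative documents-to-topics matrix. This is the same form as a rank-<math>K</math> nonnegative factorization of the (normalized) term-document matrix, which is the sense in which PLSI corresponds to matrix factorization.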