10-601 Topic Models

This is a lecture used in the [[Syllabus for Machine Learning 10-601B in Spring 2016]].

Poll: [https://piazza.com/class/ij382zqa2572hc https://piazza.com/class/ij382zqa2572hc]
  
 
=== Slides ===
  
* [http://www.cs.cmu.edu/~wcohen/10-601/lda.ppt Slides in PowerPoint], [http://www.cs.cmu.edu/~wcohen/10-601/lda.pdf Slides in PDF].
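The slides walk through the LDA generative story. As a quick reference point, here is a minimal sketch of that generative process; the corpus sizes and Dirichlet hyperparameters below are made up for illustration, and this is not the demo code linked under Readings.

<pre>
# A sketch of the LDA generative process (illustrative sizes and
# hyperparameters; not the demo code linked under Readings).
import numpy as np

def generate_lda_corpus(n_docs=4, doc_len=15, n_topics=2, n_vocab=6,
                        alpha=0.5, beta=0.1, seed=0):
    rng = np.random.default_rng(seed)
    # Topic-word distributions: phi[k] ~ Dirichlet(beta).
    phi = rng.dirichlet(np.full(n_vocab, beta), size=n_topics)
    docs = []
    for _ in range(n_docs):
        # Per-document topic mixture: theta ~ Dirichlet(alpha).
        theta = rng.dirichlet(np.full(n_topics, alpha))
        doc = []
        for _ in range(doc_len):
            k = rng.choice(n_topics, p=theta)          # draw a topic for this token
            doc.append(rng.choice(n_vocab, p=phi[k]))  # draw a word from that topic
        docs.append(doc)
    return docs, phi

docs, phi = generate_lda_corpus()
print(docs[0])  # one synthetic document, as a list of word ids
</pre>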
  
 
=== Readings ===
  
* Murphy ch 27.3 (don't read 27.3.6) and 27.4.
* LDA is not covered in Mitchell. There's a nice [http://www.cs.princeton.edu/~blei/papers/Blei2012.pdf overview paper on LDA] by David Blei.
* Here's the [http://www.cs.cmu.edu/~wcohen/10-601/lda-demo code I discussed in class] and some sample data.
* [https://en.wikipedia.org/wiki/Dirichlet-multinomial_distribution The Dirichlet-multinomial page on Wikipedia] has a good discussion of collapsed Gibbs sampling; a minimal sampler sketch follows below.
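To make the collapsed-sampler idea concrete, here is a minimal collapsed Gibbs sampler for LDA in the style that page describes: theta and phi are integrated out, and each token's topic is resampled from counts. The toy corpus, hyperparameters, and iteration budget are invented for illustration; this is not the lda-demo code above.

<pre>
# A sketch of collapsed Gibbs sampling for LDA (illustrative toy corpus,
# hyperparameters, and iteration count; not the lda-demo code linked above).
import numpy as np

def lda_collapsed_gibbs(docs, n_topics, n_vocab, alpha=0.1, beta=0.01,
                        n_iters=200, seed=0):
    rng = np.random.default_rng(seed)
    ndk = np.zeros((len(docs), n_topics))  # document-topic counts
    nkw = np.zeros((n_topics, n_vocab))    # topic-word counts
    nk = np.zeros(n_topics)                # topic totals
    # Start from a random topic assignment for every token.
    z = [rng.integers(n_topics, size=len(doc)) for doc in docs]
    for d, doc in enumerate(docs):
        for i, w in enumerate(doc):
            k = z[d][i]
            ndk[d, k] += 1; nkw[k, w] += 1; nk[k] += 1
    for _ in range(n_iters):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                k = z[d][i]
                # Remove this token's assignment from the count tables ...
                ndk[d, k] -= 1; nkw[k, w] -= 1; nk[k] -= 1
                # ... then resample it from the collapsed conditional:
                # p(z=k | rest) is proportional to
                #   (ndk + alpha) * (nkw + beta) / (nk + V*beta)
                p = (ndk[d] + alpha) * (nkw[:, w] + beta) / (nk + n_vocab * beta)
                k = rng.choice(n_topics, p=p / p.sum())
                z[d][i] = k
                ndk[d, k] += 1; nkw[k, w] += 1; nk[k] += 1
    # Posterior-mean estimates of the integrated-out parameters.
    phi = (nkw + beta) / (nk[:, None] + n_vocab * beta)
    theta = (ndk + alpha) / (ndk.sum(axis=1, keepdims=True) + n_topics * alpha)
    return theta, phi

# Toy corpus: word ids 0-2 co-occur, and so do 3-5, so two topics emerge.
docs = [[0, 1, 2, 0, 1], [0, 1, 0, 2], [3, 4, 5, 3], [4, 5, 3, 4, 5]]
theta, phi = lda_collapsed_gibbs(docs, n_topics=2, n_vocab=6)
print(np.round(theta, 2))  # per-document topic mixtures
</pre>

With two topics, the sampler should give the first two documents and the last two documents clearly different topic mixtures.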
  
 
=== Summary ===

You should know:
* what Gibbs sampling is, and how it can be used for inference in a directed graphical model.
* what graphical models are associated with supervised naive Bayes, unsupervised naive Bayes, and LDA.
* the relationships between PLSI and matrix factorization (see the sketch after this list).
* how the posterior distribution in a Bayesian model can be used for dimension reduction.
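On the last two bullets: PLSI models p(w|d) = sum_k p(w|k) p(k|d), which is a rank-K factorization of the document-word matrix, and fitting it under KL divergence matches Lee and Seung's multiplicative NMF updates. The sketch below uses an invented toy count matrix and rank; row-normalizing the learned document factors gives the low-dimensional topic-mixture representation that the final bullet refers to.

<pre>
# PLSI as matrix factorization: a sketch using Lee & Seung's multiplicative
# updates for KL-divergence NMF (the toy counts and rank K are made up).
import numpy as np

def plsi_as_nmf(V, K, n_iters=300, seed=0):
    rng = np.random.default_rng(seed)
    D, n_vocab = V.shape
    W = rng.random((D, K)) + 0.1        # unnormalized p(k|d)
    H = rng.random((K, n_vocab)) + 0.1  # unnormalized p(w|k)
    for _ in range(n_iters):
        R = V / (W @ H + 1e-12)                   # elementwise V / (WH)
        W *= (R @ H.T) / H.sum(axis=1)            # multiplicative update for W
        R = V / (W @ H + 1e-12)
        H *= (W.T @ R) / W.sum(axis=0)[:, None]   # multiplicative update for H
    return W, H

# Toy document-word count matrix: two clear blocks, hence rank K=2.
counts = np.array([[4., 3, 2, 0, 0, 0],
                   [3, 4, 0, 0, 1, 0],
                   [0, 0, 1, 4, 3, 4],
                   [0, 1, 0, 3, 4, 3]])
W, H = plsi_as_nmf(counts, K=2)
# Row-normalizing W gives each document a K-dimensional topic mixture;
# this is the dimension-reduction view in the last bullet.
theta = W / W.sum(axis=1, keepdims=True)
print(np.round(theta, 2))
</pre>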
 
