Class meeting for 10-605 LDA

This is one of the class meetings on the schedule for the course Machine Learning with Large Datasets 10-605 in Fall_2015.

Slides

Quiz: https://qna-app.appspot.com/view.html?aglzfnFuYS1hcHByGQsSDFF1ZXN0aW9uTGlzdBiAgICg2LfLCww

Readings

Things to remember

  • How Gibbs sampling is used to sample from a model (see the sketch after this list).
  • The "generative story" associated with key models like LDA, naive Bayes, and stochastic block models.
  • What a "mixed membership" generative model is.
  • The time complexity and storage requirements of Gibbs sampling for LDA.
  • How LDA learning can be sped up using IPM (iterative parameter mixing) approaches.
  • How sampling can be sped up by ordering topics so that the most probable topics according to Pr(z|...) are tested first.
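
A minimal sketch of the collapsed Gibbs sampler these points refer to, written in Python/NumPy on a toy corpus, is given below. It is illustrative only, not the course implementation: the function and variable names (gibbs_lda, n_dk, n_kw, n_k) and the toy documents are assumptions. The count tables show where the storage cost comes from (O(D*K) document-topic counts plus O(K*V) topic-word counts, plus one topic assignment per token), and the per-token resampling step shows the O(K) work per token per sweep.

  # Minimal collapsed Gibbs sampler for LDA on a toy corpus (illustrative sketch).
  import numpy as np

  def gibbs_lda(docs, V, K, alpha=0.1, beta=0.01, n_iters=200, seed=0):
      rng = np.random.default_rng(seed)
      D = len(docs)
      # Count tables maintained by the collapsed sampler; these dominate storage.
      n_dk = np.zeros((D, K))   # topic counts per document: O(D*K)
      n_kw = np.zeros((K, V))   # word counts per topic:     O(K*V)
      n_k = np.zeros(K)         # total words per topic
      z = []                    # current topic assignment for every token

      # Randomly initialize topic assignments and fill in the counts.
      for d, doc in enumerate(docs):
          zd = rng.integers(K, size=len(doc))
          z.append(zd)
          for w, k in zip(doc, zd):
              n_dk[d, k] += 1
              n_kw[k, w] += 1
              n_k[k] += 1

      for _ in range(n_iters):
          for d, doc in enumerate(docs):
              for i, w in enumerate(doc):
                  k_old = z[d][i]
                  # Remove this token's current assignment from the counts.
                  n_dk[d, k_old] -= 1
                  n_kw[k_old, w] -= 1
                  n_k[k_old] -= 1
                  # Pr(z = k | everything else), up to a constant:
                  # O(K) work per token, so one sweep costs O(K * #tokens).
                  p = (n_dk[d] + alpha) * (n_kw[:, w] + beta) / (n_k + V * beta)
                  k_new = rng.choice(K, p=p / p.sum())
                  # Add the token back under its newly sampled topic.
                  z[d][i] = k_new
                  n_dk[d, k_new] += 1
                  n_kw[k_new, w] += 1
                  n_k[k_new] += 1
      return n_dk, n_kw

  # Toy usage: four tiny documents over a 6-word vocabulary, 2 topics.
  docs = [[0, 1, 2, 0], [1, 0, 1], [3, 4, 5, 4], [5, 3, 4]]
  n_dk, n_kw = gibbs_lda(docs, V=6, K=2)
  print(n_kw)  # topic-word counts; row-normalize to estimate the topics

The speedup in the last point amounts to replacing the draw from p with a cumulative-sum loop that visits topics in decreasing order of probability, so for most tokens only a few of the K topics need to be tested before the sampled threshold is crossed.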