10-601 GM2

From Cohen Courses

Revision as of 23:26, 3 November 2013

Slides

[http://curtis.ml.cmu.edu/w/courses/images/d/d9/Lecture19-GMII.pdf Slides in PDF]

Readings
[http://curtis.ml.cmu.edu/w/courses/images/e/e3/Graph.pdf Sparse Inverse Covariance Estimation with the Graphical Lasso]

Estimating Time-Varying Networks
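The first reading estimates a sparse precision (inverse covariance) matrix, whose zero entries correspond to missing edges in a Gaussian Markov network. A minimal sketch of this idea, assuming scikit-learn is available and using synthetic chain-structured data (the data, the chain structure, and the penalty value `alpha=0.05` are illustrative choices, not from the course):

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.RandomState(0)

# True precision matrix for a chain graph: only adjacent variables interact,
# so all entries off the tridiagonal band are exactly zero.
n_vars = 5
prec = np.eye(n_vars)
for i in range(n_vars - 1):
    prec[i, i + 1] = prec[i + 1, i] = 0.4

# Sample Gaussian data whose covariance is the inverse of that precision.
cov = np.linalg.inv(prec)
X = rng.multivariate_normal(np.zeros(n_vars), cov, size=2000)

# The l1 penalty (alpha) drives small entries of the estimated precision
# matrix toward exactly zero, recovering the sparse graph structure.
model = GraphicalLasso(alpha=0.05).fit(X)
est_prec = model.precision_

# Entries for non-adjacent pairs (e.g. variables 0 and 4) should be near zero,
# while entries for chain neighbors (e.g. 0 and 1) stay clearly nonzero.
print(np.round(est_prec, 2))
```

The key assumption being probed by the take-home questions below is that the data are multivariate Gaussian with a sparse precision matrix; under that model, a zero precision entry means conditional independence given all other variables.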
Take-home messages
- What are inference and learning in a graphical model?
- Inference via belief propagation: when is this method exact, and when is it not?
- What is BP, and what are its running modes (sum-product for marginals, max-product for MAP assignments)?
- Why is MLE on a fully observed graphical model easy?
- What if some variables are latent?
- What is the difference between a Markov network and a correlation network?
- When learning a Markov network using the graphical lasso, which assumptions are we making about the model underlying the data?
- Why is neighborhood selection using the lasso correct?
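On the belief propagation questions above: BP is exact on tree-structured graphs, where messages can be passed once in each direction. A minimal sketch of sum-product BP on a chain of binary variables, checked against brute-force enumeration (the pairwise potentials here are arbitrary illustrative numbers, not from the lecture):

```python
import numpy as np
from itertools import product

# Pairwise potentials: psi[i] couples variable i and i+1 (3 variables, 2 edges).
# The unnormalized joint is the product of these potentials.
psi = [np.array([[1.0, 0.5], [0.5, 2.0]]),
       np.array([[1.5, 1.0], [0.2, 1.0]])]

def bp_marginal(psi, node):
    """Exact marginal of `node` on a chain via sum-product message passing."""
    n = len(psi) + 1
    # Forward messages: m_fwd[i][x_i] sums out all variables left of i.
    m_fwd = [np.ones(2) for _ in range(n)]
    for i in range(1, n):
        m_fwd[i] = m_fwd[i - 1] @ psi[i - 1]
    # Backward messages: m_bwd[i][x_i] sums out all variables right of i.
    m_bwd = [np.ones(2) for _ in range(n)]
    for i in range(n - 2, -1, -1):
        m_bwd[i] = psi[i] @ m_bwd[i + 1]
    belief = m_fwd[node] * m_bwd[node]
    return belief / belief.sum()

def brute_marginal(psi, node):
    """Same marginal by summing the joint over all assignments (exponential)."""
    n = len(psi) + 1
    marg = np.zeros(2)
    for x in product([0, 1], repeat=n):
        p = 1.0
        for i in range(n - 1):
            p *= psi[i][x[i], x[i + 1]]
        marg[x[node]] += p
    return marg / marg.sum()

for node in range(3):
    assert np.allclose(bp_marginal(psi, node), brute_marginal(psi, node))
print("BP marginals match brute-force enumeration")
```

Because the chain is a tree, each message needs to be computed only once, so BP costs linear time here, whereas the brute-force sum is exponential in the number of variables; on graphs with cycles the same updates give loopy BP, which is no longer guaranteed to be exact.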