10-601 GM2

From Cohen Courses
This is a lecture used in the [[Syllabus for Machine Learning 10-601B in Spring 2016]]
  
 
=== Slides ===
 
* [http://curtis.ml.cmu.edu/w/courses/images/d/d9/Lecture19-GMII.pdf Slides in PDF]
* [http://www.cs.cmu.edu/~wcohen/10-601/networks-2.pptx Slides in PPT], [http://www.cs.cmu.edu/~wcohen/10-601/networks-2.pdf Slides in PDF].
  
 
=== Readings ===
 
* [http://curtis.ml.cmu.edu/w/courses/images/e/e3/Graph.pdf Sparse Inverse Covariance Estimation with the Graphical Lasso]
* See [[10-601 GM1|previous lecture]]
* [http://www.cs.cmu.edu/~epxing/papers/2010/kolar_song_xing_aoas10.pdf Estimating Time-Varying Networks]
 
 
 

Latest revision as of 10:52, 31 March 2016


=== To remember ===

* what inference in DGMs is
* the general outline of the BP algorithm for polytrees
* what a polytree is and when BP is exact
** what "message passing" means
* what a Markov blanket is
* what a Markov network (undirected model) is
* how nodes can be merged to create a polytree
* the advantages and disadvantages of BP on polytrees and loopy BP
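To make "message passing" concrete, here is a minimal sketch of sum-product belief propagation on the smallest possible polytree, a chain A–B–C of binary variables. The pairwise potential values are made-up numbers for illustration; the leaves send messages toward B, and B's marginal is the normalized product of the incoming messages. A brute-force enumeration of the joint distribution checks that BP is exact on this tree.

```python
import itertools

# Hypothetical pairwise potentials for a chain A - B - C of binary variables.
# psi[x_sender][x_target] gives the compatibility of neighbouring states.
psi_ab = [[1.0, 0.5],
          [0.5, 2.0]]
psi_bc = [[1.5, 1.0],
          [1.0, 0.5]]

def message(psi, incoming):
    """Sum-product message along an edge: multiply the sender's incoming
    evidence by the edge potential, then sum out the sender's variable."""
    return [sum(psi[xs][xt] * incoming[xs] for xs in range(2))
            for xt in range(2)]

def normalise(v):
    z = sum(v)
    return [x / z for x in v]

# Leaves A and C have no other neighbours, so their incoming product is all-ones.
m_a_to_b = message(psi_ab, [1.0, 1.0])
# For the C -> B direction, index the potential as psi[x_c][x_b] (transpose).
psi_cb = [[psi_bc[xb][xc] for xb in range(2)] for xc in range(2)]
m_c_to_b = message(psi_cb, [1.0, 1.0])

# Marginal of B: product of all incoming messages, normalised.
p_b = normalise([m_a_to_b[xb] * m_c_to_b[xb] for xb in range(2)])

# Brute-force check: enumerate every joint configuration.
joint = {(a, b, c): psi_ab[a][b] * psi_bc[b][c]
         for a, b, c in itertools.product(range(2), repeat=3)}
p_b_brute = normalise([sum(w for (a, b, c), w in joint.items() if b == xb)
                       for xb in range(2)])
```

Because a chain is a polytree (no undirected cycles), each edge carries exactly one message in each direction and the computed marginal agrees with exhaustive enumeration; on a graph with loops the same updates give loopy BP, which is only approximate.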