10-601 HMMs

This is a lecture used in the [[Syllabus for Machine Learning 10-601 in Fall 2014]].

=== Slides ===

[http://curtis.ml.cmu.edu/w/courses/images/9/9e/Lecture18-GM.pdf Slides in PDF]

=== Readings ===

* [http://curtis.ml.cmu.edu/w/courses/images/8/89/GM-jordan.pdf Graphical Models by Michael I. Jordan]

=== Take-home messages ===

* Why Bayesian networks (BNs) are not enough
* The layers of HMMs
* Formal definition of an HMM (see the definition below)
* Inference with observations (see the forward-algorithm sketch below)
* Computing the most likely path (see the Viterbi sketch below)
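
As a companion to the "formal definition" bullet above, here is one standard way to write an HMM down; the notation (hidden states <math>z_t</math>, observations <math>x_t</math>, parameters <math>\pi, A, B</math>) is a common textbook convention, not necessarily the exact one used in the slides.

An HMM with K hidden states is specified by an initial distribution <math>\pi_k = P(z_1 = k)</math>, a transition matrix <math>A_{jk} = P(z_t = k \mid z_{t-1} = j)</math>, and emission probabilities <math>B_k(x) = P(x_t = x \mid z_t = k)</math>. The joint probability of a state path and an observation sequence then factorizes as

<math>P(x_{1:T}, z_{1:T}) = \pi_{z_1} B_{z_1}(x_1) \prod_{t=2}^{T} A_{z_{t-1} z_t} B_{z_t}(x_t).</math>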
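
For the "inference with observations" bullet, a minimal sketch of the forward algorithm for a discrete-observation HMM is given below, using the <math>\pi, A, B</math> notation above. The two-state toy numbers at the bottom are made up purely for illustration.

<pre>
import numpy as np

def forward(pi, A, B, obs):
    """Forward algorithm: likelihood P(x_1, ..., x_T) of an observation sequence.

    pi  : (K,)   initial state distribution, pi[k] = P(z_1 = k)
    A   : (K, K) transition matrix, A[j, k] = P(z_t = k | z_{t-1} = j)
    B   : (K, M) emission matrix,   B[k, x] = P(x_t = x | z_t = k)
    obs : length-T list of observation symbols in {0, ..., M-1}
    """
    alpha = pi * B[:, obs[0]]           # alpha_1(k) = pi_k * B_k(x_1)
    for x in obs[1:]:
        # alpha_t(k) = [ sum_j alpha_{t-1}(j) * A[j, k] ] * B_k(x_t)
        alpha = (alpha @ A) * B[:, x]
    return alpha.sum()                  # P(x_{1:T}) = sum_k alpha_T(k)

# Toy example: 2 hidden states, 2 observation symbols (illustrative numbers).
pi = np.array([0.6, 0.4])
A  = np.array([[0.7, 0.3],
               [0.4, 0.6]])
B  = np.array([[0.5, 0.5],
               [0.1, 0.9]])
print(forward(pi, A, B, [0, 1, 1]))     # likelihood of observing 0, 1, 1
</pre>

On long sequences the raw probabilities underflow; the standard fix is to normalize alpha at each step and accumulate the log normalizers.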
 

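
For "computing the most likely path", here is a minimal Viterbi sketch under the same notation and toy parameters as the forward sketch above; it works in log space to avoid underflow.

<pre>
import numpy as np

def viterbi(pi, A, B, obs):
    """Most likely hidden-state path argmax_z P(z_{1:T}, x_{1:T}) for a discrete HMM."""
    T, K = len(obs), len(pi)
    logA, logB = np.log(A), np.log(B)
    delta = np.log(pi) + logB[:, obs[0]]   # best log-prob of a path ending in each state
    back = np.zeros((T, K), dtype=int)     # back-pointers for recovering the path
    for t in range(1, T):
        scores = delta[:, None] + logA     # scores[j, k]: best path ending at j, extended j -> k
        back[t] = scores.argmax(axis=0)
        delta = scores.max(axis=0) + logB[:, obs[t]]
    # Trace the back-pointers from the best final state.
    path = [int(delta.argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]

# Reusing the toy parameters from the forward sketch:
pi = np.array([0.6, 0.4])
A  = np.array([[0.7, 0.3],
               [0.4, 0.6]])
B  = np.array([[0.5, 0.5],
               [0.1, 0.9]])
print(viterbi(pi, A, B, [0, 1, 1]))     # -> [0, 1, 1]
</pre>

Viterbi is the same dynamic program as the forward algorithm with the sum over previous states replaced by a max (plus back-pointers), so both run in O(T K^2) time.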