10-601 HMMs

From Cohen Courses
This is a lecture used in the [[Syllabus for Machine Learning 10-601 in Fall 2014]]
=== Slides ===

* Ziv's lecture: [http://www.cs.cmu.edu/~zivbj/classF14/HMMLearning.pdf Slides in pdf].
 
=== Readings ===

* [http://curtis.ml.cmu.edu/w/courses/images/8/89/GM-jordan.pdf Graphical Models by Michael I. Jordan]
 
=== Taking home message ===

* Learning with known states: MLE
* Learning with unknown states: EM
* Analyzing protein sequences with HMMs
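The first take-home point above can be made concrete: when the state sequence is observed, the maximum-likelihood estimates of an HMM's transition and emission probabilities reduce to (smoothed) relative-frequency counting. The sketch below is illustrative only, not taken from the lecture; the function name `hmm_mle` and the add-k smoothing choice are assumptions for the example.

```python
from collections import Counter

def hmm_mle(state_seqs, obs_seqs, smoothing=1.0):
    """Supervised HMM training sketch (illustrative, not from the lecture).

    With fully observed state sequences, the MLE of the transition matrix A
    and emission matrix B is just count-and-normalize, here with add-k
    smoothing to avoid zero probabilities for unseen events."""
    trans, emit = Counter(), Counter()
    states, symbols = set(), set()
    for states_i, obs_i in zip(state_seqs, obs_seqs):
        states.update(states_i)
        symbols.update(obs_i)
        # count state-to-state transitions
        for s, s_next in zip(states_i, states_i[1:]):
            trans[(s, s_next)] += 1
        # count state-to-symbol emissions
        for s, o in zip(states_i, obs_i):
            emit[(s, o)] += 1
    # normalize each row, adding `smoothing` pseudo-counts per outcome
    A = {(s, t): (trans[(s, t)] + smoothing) /
                 (sum(trans[(s, u)] for u in states) + smoothing * len(states))
         for s in states for t in states}
    B = {(s, o): (emit[(s, o)] + smoothing) /
                 (sum(emit[(s, v)] for v in symbols) + smoothing * len(symbols))
         for s in states for o in symbols}
    return A, B
```

When the states are hidden (the second bullet), the same counts are replaced by expected counts computed with the forward-backward algorithm, and the count-and-normalize step above becomes the M-step of EM (Baum-Welch).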
 

Latest revision as of 10:48, 6 November 2014