10-601 GM1

This is a lecture used in the Syllabus for Machine Learning 10-601B in Spring 2016.

Slides

Readings

  • Chapter 6.11 (Mitchell)
  • Chapter 10 (Murphy)
  • Or: Chapters 8.1 and 8.2.2 (Bishop)
  • Or: Chapter 15 (Russell and Norvig) - disclaimer, my edition is old!

To remember

  • Conditional independence and dependence
    • Notations for these
  • Semantics of a directed graphical model (aka Bayesian network, belief network)
    • Converting a joint probability distribution + conditional independencies to a network
    • Converting a network to a joint PDF (see the sketch after this list)
  • Determining conditional independencies from the structure of a network
    • Blocking
    • d-separation
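
To make the last two items concrete, here is a minimal Python sketch. It is not taken from the lecture: the sprinkler-style network, its CPT numbers, and the helper names joint and d_separated are illustrative assumptions. It converts a small network to its joint distribution via the factorization P(x1, ..., xn) = prod_i P(xi | parents(xi)), and it tests d-separation with the moralized ancestral graph criterion, which gives the same answers as reasoning directly about blocked paths.

from itertools import product

# Hypothetical sprinkler network (illustrative; not from the lecture slides):
#   Cloudy -> Sprinkler, Cloudy -> Rain, Sprinkler -> WetGrass, Rain -> WetGrass
parents = {
    "Cloudy": [],
    "Sprinkler": ["Cloudy"],
    "Rain": ["Cloudy"],
    "WetGrass": ["Sprinkler", "Rain"],
}

# CPTs: P(node = True | parent values), keyed by the tuple of parent values.
cpt = {
    "Cloudy":    {(): 0.5},
    "Sprinkler": {(True,): 0.1, (False,): 0.5},
    "Rain":      {(True,): 0.8, (False,): 0.2},
    "WetGrass":  {(True, True): 0.99, (True, False): 0.90,
                  (False, True): 0.90, (False, False): 0.00},
}

def joint(assignment):
    """Network -> joint: P(x1, ..., xn) = product_i P(xi | parents(xi))."""
    p = 1.0
    for node, value in assignment.items():
        pa_vals = tuple(assignment[pa] for pa in parents[node])
        p_true = cpt[node][pa_vals]
        p *= p_true if value else 1.0 - p_true
    return p

# The factored terms multiply out to a proper joint distribution (sums to 1).
names = list(parents)
total = sum(joint(dict(zip(names, vals)))
            for vals in product([True, False], repeat=len(names)))
print(round(total, 10))  # 1.0

def d_separated(x, y, given):
    """Is x independent of y given `given`, judging only from the structure?
    Uses the moral graph of the ancestral subgraph of {x, y} union given."""
    # 1. Restrict attention to ancestors of the variables involved.
    relevant, stack = set(), [x, y, *given]
    while stack:
        n = stack.pop()
        if n not in relevant:
            relevant.add(n)
            stack.extend(parents[n])
    # 2. Moralize: marry co-parents, then drop edge directions.
    undirected = {n: set() for n in relevant}
    for n in relevant:
        for pa in parents[n]:
            undirected[n].add(pa)
            undirected[pa].add(n)
        for a in parents[n]:
            for b in parents[n]:
                if a != b:
                    undirected[a].add(b)
    # 3. x and y are d-separated iff `given` separates them in that graph.
    seen, stack = set(given), [x]
    while stack:
        n = stack.pop()
        if n == y:
            return False
        if n not in seen:
            seen.add(n)
            stack.extend(undirected[n] - seen)
    return True

print(d_separated("Sprinkler", "Rain", ["Cloudy"]))    # True: common cause observed
print(d_separated("Sprinkler", "Rain", ["WetGrass"]))  # False: collider observed

In this toy network, observing Cloudy blocks the common-cause path between Sprinkler and Rain, while observing the collider WetGrass opens it; applying the blocking and d-separation rules path by path gives the same two answers.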