10-601 GM1

From Cohen Courses
Revision as of 10:15, 21 March 2016

This is a lecture used in the Syllabus for Machine Learning 10-601 in Fall 2014

Slides

Readings

  • Chapter 6.11 (Mitchell)
  • Chapter 10 (Murphy)
  • Or: Chapters 8.1 and 8.2.2 (Bishop)

To remember

  • Conditional independence and dependence
  • Semantics of a directed graphical model (aka Bayesian network, belief network)
    • Converting a joint probability distribution + conditional independencies to a network
    • Converting a network to a joint PDF
  • Determining conditional independencies from the structure of a network
    • Blocking
    • d-separation
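The list above can be made concrete with a minimal sketch (not part of the lecture; the network and all probabilities are hypothetical). It builds a three-node chain A → B → C, converts the network's CPTs to a joint distribution via the factorization P(a, b, c) = P(a) P(b|a) P(c|b), and checks numerically that B blocks the path between A and C, i.e. C is d-separated from A given B:

```python
import itertools

# Hypothetical CPTs for the chain A -> B -> C.
p_a = {0: 0.6, 1: 0.4}
p_b_given_a = {(0, 0): 0.7, (1, 0): 0.3,   # keyed by (b, a)
               (0, 1): 0.2, (1, 1): 0.8}
p_c_given_b = {(0, 0): 0.9, (1, 0): 0.1,   # keyed by (c, b)
               (0, 1): 0.5, (1, 1): 0.5}

def joint(a, b, c):
    # Network -> joint PDF: product of each node's CPT given its parents.
    return p_a[a] * p_b_given_a[(b, a)] * p_c_given_b[(c, b)]

# Sanity check: a valid joint distribution sums to 1 over all assignments.
total = sum(joint(a, b, c)
            for a, b, c in itertools.product([0, 1], repeat=3))

def p_c_given_ab(c, a, b):
    # Conditioning the joint on (a, b); if C is d-separated from A given B,
    # this value does not depend on a.
    return joint(a, b, c) / sum(joint(a, b, cc) for cc in (0, 1))
```

Because observing B blocks the only path A – B – C, evaluating `p_c_given_ab(1, 0, 1)` and `p_c_given_ab(1, 1, 1)` gives the same number, matching the CPT entry P(C=1 | B=1) directly.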