10-601 GM1
This is a lecture used in the Syllabus for Machine Learning 10-601B in Spring 2016.
Slides
Readings
- Chapter 6.11 (Mitchell)
- Chapter 10 (Murphy)
- Or: Chapters 8.1 and 8.2.2 (Bishop)
- Or: Chapter 15 (Russell and Norvig); disclaimer: my edition is old!
To remember
- Conditional independence and dependence
  - Notations for these (see the notation sketch after this list)
- Semantics of a directed graphical model (aka Bayesian network, belief network)
  - Converting a joint probability distribution + conditional independencies to a network
  - Converting a network to a joint PDF (see the factorization sketch after this list)
  - Determining conditional independencies from the structure of a network (see the d-separation sketch after this list)
    - Blocking
    - d-separation
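
As a quick reference for the notation and the factorization (a sketch using standard conventions, not material taken from the slides): conditional independence is usually written with a "double perpendicular" symbol, and the semantics of a directed graphical model is that the joint distribution factors into one conditional per node, given that node's parents.

```latex
% Notation: "X is conditionally independent of Y given Z"
\[
  X \perp\!\!\!\perp Y \mid Z
  \quad\Longleftrightarrow\quad
  P(X, Y \mid Z) = P(X \mid Z)\, P(Y \mid Z)
\]

% Semantics of a directed graphical model (Bayesian network): the joint
% distribution is the product of one conditional per node, given its parents
\[
  P(X_1, \dots, X_n) = \prod_{i=1}^{n} P\bigl(X_i \mid \mathrm{Pa}(X_i)\bigr)
\]

% Example: the chain A -> B -> C corresponds to the factorization
\[
  P(A, B, C) = P(A)\, P(B \mid A)\, P(C \mid B),
\]
% which implies that A is independent of C given B, but not that A is
% independent of C.
```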
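
Below is a minimal sketch of how d-separation can be checked mechanically. It is not the procedure from the lecture: it uses the standard ancestral-graph-plus-moralization characterization of d-separation rather than enumerating blocked paths, and the rain/sprinkler network, the function names, and the graph encoding are all invented for the illustration.

```python
# A sketch of d-separation checking via the moral-graph criterion:
# X is d-separated from Y given Z  iff  X and Y are disconnected in the
# moralized ancestral graph of (X union Y union Z) after deleting Z.
from collections import deque


def ancestors(dag, nodes):
    """Return the given nodes together with all of their ancestors."""
    parents = {v: set() for v in dag}
    for u in dag:
        for v in dag[u]:
            parents[v].add(u)
    seen, stack = set(nodes), list(nodes)
    while stack:
        v = stack.pop()
        for p in parents[v]:
            if p not in seen:
                seen.add(p)
                stack.append(p)
    return seen


def d_separated(dag, xs, ys, zs):
    """True iff xs is d-separated from ys given zs.

    dag maps each node to the set of its children.
    """
    keep = ancestors(dag, set(xs) | set(ys) | set(zs))

    # Restrict to the ancestral subgraph, drop edge directions, and
    # "marry" (connect) all pairs of parents of each node (moralization).
    und = {v: set() for v in keep}
    parents = {v: set() for v in keep}
    for u in keep:
        for v in dag[u]:
            if v in keep:
                und[u].add(v)
                und[v].add(u)
                parents[v].add(u)
    for v in keep:
        for p in parents[v]:
            for q in parents[v]:
                if p != q:
                    und[p].add(q)
                    und[q].add(p)

    # Delete the conditioning set, then test connectivity with a BFS.
    blocked = set(zs)
    frontier = deque(x for x in xs if x not in blocked)
    reached = set(frontier)
    while frontier:
        v = frontier.popleft()
        for w in und[v]:
            if w not in blocked and w not in reached:
                reached.add(w)
                frontier.append(w)
    return reached.isdisjoint(ys)


if __name__ == "__main__":
    # Toy network: Cloudy -> {Sprinkler, Rain}, {Sprinkler, Rain} -> WetGrass
    dag = {
        "Cloudy": {"Sprinkler", "Rain"},
        "Sprinkler": {"WetGrass"},
        "Rain": {"WetGrass"},
        "WetGrass": set(),
    }
    # Observing the common cause Cloudy blocks the path: True (d-separated)
    print(d_separated(dag, {"Sprinkler"}, {"Rain"}, {"Cloudy"}))
    # Also observing the collider WetGrass opens a path: False
    print(d_separated(dag, {"Sprinkler"}, {"Rain"}, {"Cloudy", "WetGrass"}))
```

The two calls at the end illustrate blocking: conditioning on the common cause Cloudy blocks the path between Sprinkler and Rain, while additionally conditioning on the collider WetGrass opens a path between them, so they are no longer d-separated.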