Belief Propagation
From Cohen Courses
Revision as of 23:28, 26 September 2011
This is a Method proposed by Judea Pearl, 1982: Reverend Bayes on inference engines: A distributed hierarchical approach, AAAI 1982.
Belief Propagation is a message-passing inference method for statistical graphical models (e.g. Bayesian networks and Markov random fields). The basic idea is to compute the marginal distributions of the unobserved nodes, conditioned on the observed nodes. There are two major cases:
- When the graphical model is both a factor graph and a tree (no loops), the exact marginals can be obtained. On chain-structured models this coincides with dynamic programming; the max-product variant corresponds to the Viterbi algorithm.
- Otherwise, loopy Belief Propagation becomes an approximate inference algorithm: messages are passed iteratively around the loops, and neither convergence nor exactness of the resulting marginals is guaranteed.
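The tree case above can be illustrated with a minimal sum-product sketch on a three-node chain (a tree), checked against brute-force marginalization. The pairwise potentials and variable names below are illustrative assumptions, not taken from the article:

```python
import numpy as np

# Chain MRF x1 - x2 - x3 over binary variables, defined by two pairwise
# potentials (illustrative numbers, not from the article).
psi12 = np.array([[1.0, 0.5],
                  [0.5, 2.0]])   # psi12[x1, x2]
psi23 = np.array([[2.0, 1.0],
                  [1.0, 0.5]])   # psi23[x2, x3]

# Sum-product messages toward x2:
#   m_{1->2}(x2) = sum over x1 of psi12(x1, x2)
#   m_{3->2}(x2) = sum over x3 of psi23(x2, x3)
m12 = psi12.sum(axis=0)
m32 = psi23.sum(axis=1)

# Belief at x2 = product of incoming messages, normalized.
b2 = m12 * m32
b2 = b2 / b2.sum()

# On a tree, BP is exact: compare with the brute-force marginal of x2.
joint = psi12[:, :, None] * psi23[None, :, :]   # unnormalized P(x1, x2, x3)
exact = joint.sum(axis=(0, 2))
exact = exact / exact.sum()
print(np.allclose(b2, exact))  # True
```

Because each message sums out one side of the chain exactly, the product of incoming messages at a node recovers that node's true marginal; on graphs with loops this argument breaks down, which is why loopy BP is only approximate.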
== Definition ==
In a generalized Markov random field, the joint distribution factorizes into a product of non-negative potential functions over the cliques of the graph:
: <math>P(x_1,x_2,\dots,x_n) = \frac{1}{Z} \prod_{c \in \mathcal{C}} \psi_c(x_c),</math>
where <math>Z</math> is the partition function that normalizes the distribution.
== Marginals vs. Joint Maximizer ==
The idea of computing marginals can be expressed as:
: <math>P(x_i) = \sum_{x_1,\dots,x_{i-1},x_{i+1},\dots,x_n} P(x_1,x_2,x_3,\dots,x_n),</math>
whereas to compute the joint maximizer (the single most likely joint assignment), we need:
: <math>\underset{x}{\operatorname{argmax}}\ P(x_1,x_2,x_3,\dots,x_n).</math>
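The difference between these two queries can be seen on a small hypothetical joint distribution over two binary variables (the numbers below are illustrative assumptions): the variable-wise argmax of the marginals need not coincide with the joint maximizer.

```python
import numpy as np

# P[x1, x2]: an explicit (illustrative) joint distribution.
P = np.array([[0.25, 0.35],
              [0.40, 0.00]])

# Marginals: sum out the other variable.
p_x1 = P.sum(axis=1)   # [0.60, 0.40] -> argmax is x1 = 0
p_x2 = P.sum(axis=0)   # [0.65, 0.35] -> argmax is x2 = 0

# Joint maximizer: argmax over the full assignment.
joint_map = np.unravel_index(P.argmax(), P.shape)

print(p_x1.argmax(), p_x2.argmax())  # 0 0
print(joint_map)                     # (1, 0)
```

Here the marginal-wise choice is (0, 0), with joint probability 0.25, while the most likely joint assignment is (1, 0), with probability 0.40 — which is why sum-product (marginals) and max-product (joint maximizer) are distinct variants of Belief Propagation.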