Expectation Maximization
Expectation Maximization (EM) is a technique for inferring the parameters of a statistical model with hidden (latent) variables. The underlying intuition is that an optimally parametrized model assigns the highest probability to the training data on which it is fit. The algorithm alternates between two steps: an expectation (E) step, in which the expected log-likelihood is computed under the current parameter estimates, taking the expectation over the latent variables; and a maximization (M) step, in which the parameters are updated to maximize that expected log-likelihood. Iterating these steps converges to a local maximum of the log-likelihood function.
[http://en.wikipedia.org/wiki/Expectation-maximization_algorithm External link]