Forward-Backward

From Cohen Courses
 
== Summary ==

This is a dynamic programming [[Category::method | algorithm]], used in [[AddressesProblem::Hidden Markov Models]] to efficiently compute the posterior marginals over all the hidden state variables.
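The posterior-marginal computation can be sketched as follows. This is a minimal illustration of the forward-backward recursions for a discrete HMM, not code from any particular library; the names <code>pi</code> (initial distribution), <code>A</code> (transition matrix), <code>B</code> (emission matrix), and <code>obs</code> (observation sequence) are all assumptions made for the example.

```python
# Minimal sketch of forward-backward for a discrete HMM.
# pi[i]   : initial probability of state i
# A[i][j] : transition probability from state i to state j
# B[i][o] : probability that state i emits observation o
# All names are illustrative; no particular library is assumed.

def forward_backward(pi, A, B, obs):
    """Return posterior marginals gamma[t][i] = P(state_t = i | obs)."""
    n = len(pi)   # number of hidden states
    T = len(obs)  # length of the observation sequence

    # Forward pass: alpha[t][i] = P(obs[0..t], state_t = i)
    alpha = [[0.0] * n for _ in range(T)]
    for i in range(n):
        alpha[0][i] = pi[i] * B[i][obs[0]]
    for t in range(1, T):
        for j in range(n):
            alpha[t][j] = B[j][obs[t]] * sum(
                alpha[t - 1][i] * A[i][j] for i in range(n))

    # Backward pass: beta[t][i] = P(obs[t+1..T-1] | state_t = i)
    beta = [[1.0] * n for _ in range(T)]
    for t in range(T - 2, -1, -1):
        for i in range(n):
            beta[t][i] = sum(
                A[i][j] * B[j][obs[t + 1]] * beta[t + 1][j] for j in range(n))

    # Combine and normalize: gamma[t][i] is proportional to alpha[t][i] * beta[t][i]
    gamma = []
    for t in range(T):
        row = [alpha[t][i] * beta[t][i] for i in range(n)]
        z = sum(row)
        gamma.append([x / z for x in row])
    return gamma
```

Both passes run in O(T·n²) time, which is what makes the marginal computation tractable compared with summing over all nᵀ state sequences directly. A practical implementation would additionally scale or work in log space to avoid underflow on long sequences.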
 
 
The relative distortion is modeled by applying a first-order [[UsesMethod::Hidden Markov Model]], where each alignment probability depends on the distortion of the previous alignment.
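In such HMM alignment models, the transition probability is typically parameterized by the jump width between consecutive alignment positions rather than by the absolute positions themselves. The sketch below illustrates that parameterization under assumed names: <code>jump_counts</code> is a hypothetical table of (pseudo-)counts per jump width, and the normalization over target positions follows the usual HMM-alignment convention.

```python
# Sketch of a relative-distortion transition probability, assuming the
# transition depends only on the jump width next_pos - prev_pos.
# `jump_counts` is a hypothetical dict mapping jump width -> count;
# all names are illustrative, not from any particular implementation.

def transition_prob(prev_pos, next_pos, num_source_words, jump_counts):
    """P(a_j = next_pos | a_{j-1} = prev_pos), depending only on jump width."""
    # Normalize over all reachable source positions from prev_pos.
    total = sum(jump_counts.get(i - prev_pos, 0.0)
                for i in range(num_source_words))
    if total == 0.0:
        return 1.0 / num_source_words  # back off to uniform if no counts
    return jump_counts.get(next_pos - prev_pos, 0.0) / total
```

Because the probability depends only on the offset, the model shares statistics across all sentence positions, which is what lets it capture the tendency of alignments to move locally (small jumps) rather than arbitrarily.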
 
 
 
Results indicate that modeling the relative distortion can improve the overall quality of the word alignments.
 

Revision as of 14:56, 28 September 2011
