Smith and Eisner 2008: Dependency parsing by belief propagation

Revision as of 13:24, 28 September 2011

Citation

Smith, David A. and Jason Eisner (2008). Dependency parsing by belief propagation. Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 145-156, Honolulu, October.

Online version

Smith and Eisner 2008

Summary

This paper presents a loopy belief propagation (BP) method for Dependency Parsing that can also be applied readily to related problems such as Named Entity Recognition, Word Alignment, Shallow Parsing, and Constituent Parsing. The paper formulates dependency parsing as a learning and decoding problem on a graphical model with global constraints. The authors show that BP needs only O(n³) time to perform approximate inference on this model, even with second-order features and latent variables incorporated.
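The approximate inference the summary refers to is sum-product message passing on a factor graph. The following is a minimal illustrative sketch, not the paper's parser: the two binary "link" variables, their potentials, and the soft at-most-one factor are invented stand-ins for the paper's dependency-link variables and global constraints.

```python
# Toy factor graph: two binary variables for candidate dependency links,
# plus one pairwise factor that softly discourages both links being on
# (a stand-in for the paper's global constraints). All names and numbers
# here are illustrative, not taken from Smith and Eisner 2008.

variables = ["link_a", "link_b"]      # each takes values in {0, 1}
unary = {                             # per-variable potentials
    "link_a": [1.0, 3.0],             # prefers link_a = 1
    "link_b": [1.0, 2.0],
}

def pairwise(xa, xb):
    # Soft "at most one link" factor.
    return 0.5 if (xa == 1 and xb == 1) else 1.0

# Messages between variables and the shared pairwise factor.
msg_va = {v: [1.0, 1.0] for v in variables}   # variable -> factor
msg_fv = {v: [1.0, 1.0] for v in variables}   # factor -> variable

# The iterative schedule below is what "loopy" BP runs on cyclic graphs;
# on this tiny acyclic example a single pass already gives exact marginals.
for _ in range(10):
    # Variable -> factor: product of the unary and all OTHER incoming
    # factor messages; here each variable touches only this one factor,
    # so the message is just the unary potential.
    for v in variables:
        msg_va[v] = list(unary[v])

    # Factor -> variable: sum out the other variable.
    msg_fv = {
        "link_a": [sum(pairwise(xa, xb) * msg_va["link_b"][xb]
                       for xb in (0, 1)) for xa in (0, 1)],
        "link_b": [sum(pairwise(xa, xb) * msg_va["link_a"][xa]
                       for xa in (0, 1)) for xb in (0, 1)],
    }

# Approximate marginals ("beliefs"): unary times incoming message, normalized.
beliefs = {}
for v in variables:
    b = [unary[v][x] * msg_fv[v][x] for x in (0, 1)]
    z = sum(b)
    beliefs[v] = [bi / z for bi in b]

print(beliefs)  # e.g. P(link_a = 1) ≈ 0.667, P(link_b = 1) ≈ 0.556
```

In the paper's setting the same message-passing loop runs over many link variables and higher-order factors, and the beliefs feed training and decoding; this sketch only shows the core update schedule.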

Brief Description of the Method

Experimental Results

Related Papers

* Globerson et al., ICML 2007: Exponentiated Gradient Algorithms for Log-Linear Structured Prediction
* Berg-Kirkpatrick et al., ACL 2010: Painless Unsupervised Learning with Features
* Benajiba and Rosso, LREC 2008
* Meltzer et al., ICCV 2005: Globally Optimal Solutions for Energy Minimization in Stereo Vision Using Reweighted Belief Propagation
* Szeliski et al., IEEE TPAMI 2008: A Comparative Study of Energy Minimization Methods for Markov Random Fields with Smoothness-Based Priors
* Ott and Stoop, NIPS 2006: The Neurodynamics of Belief Propagation on Binary Markov Random Fields