Smith and Eisner 2005: Contrastive Estimation: Training Log-Linear Models on Unlabeled Data
From Cohen Courses
Revision as of 21:48, 29 September 2011
Citation
Smith, Noah A. and Jason Eisner (2005). Contrastive estimation: Training log-linear models on unlabeled data. Proceedings of the 43rd Annual Meeting of the Association for Computational Linguistics (ACL), pages 354-362, Ann Arbor, Michigan, June.
Online version
Summary
This is an interesting paper that presents Contrastive Estimation, an unsupervised training method for Conditional Random Fields and other Log-Linear Models, which can be easily applied to problems like Part of Speech Tagging. When applied to POS tagging, contrastive estimation outperforms EM, and it remains robust even when the tagging dictionary is of poor quality.
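To make the idea concrete, here is a minimal toy sketch of the contrastive estimation objective. The feature names, parameter values, and neighborhood below are illustrative assumptions, not taken from the paper; the sketch only shows the general shape of the objective: maximize the probability of the observed sentence relative to a small "neighborhood" of perturbed negative examples, summing out the hidden labelings.

```python
import math

def score(theta, features):
    """Linear score theta . f(x, y) for one (sentence, labeling) pair."""
    return sum(theta.get(f, 0.0) * v for f, v in features.items())

def ce_log_prob(theta, observed, neighborhood):
    """log p(x | N(x)) = log [ Z(x) / sum over x' in N(x) of Z(x') ],
    where Z(x) = sum over labelings y of exp(theta . f(x, y))."""
    def z(candidate):
        # Sum unnormalized probabilities over all candidate labelings.
        return sum(math.exp(score(theta, feats)) for feats in candidate)
    total = sum(z(c) for c in neighborhood)
    return math.log(z(observed) / total)

# A "sentence" is represented by its list of candidate labelings,
# each reduced to a feature dict (purely illustrative features).
observed = [{"good": 1.0}, {"bad": 1.0}]   # two labelings of x
neighbor = [{"bad": 1.0}]                  # one perturbed sentence x'
theta = {"good": 2.0, "bad": 0.0}

# The neighborhood conventionally includes the observed sentence itself.
print(ce_log_prob(theta, observed, [observed, neighbor]))
```

Raising the weight on features that fire for the observed sentence (but not for its perturbed neighbors) increases this log-probability, which is what distinguishes CE from ordinary maximum likelihood over all possible sentences.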