Lafferty 2001 Conditional Random Fields

Citation

John Lafferty, Andrew McCallum, and Fernando Pereira. 2001. Conditional random fields: Probabilistic models for segmenting and labeling sequence data. In Proceedings of ICML.

Online version

An online version of this paper is available [1].

Summary

This paper introduces Conditional Random Fields (CRFs) as a sequential classification model. The key properties of CRFs are:

  • They allow the addition of arbitrary features of the observations. These features can be defined at any level of the observation sequence and may overlap.
  • They are trained to maximize the conditional likelihood of the state sequence given the observation sequence, rather than the joint likelihood, as HMMs are (see the sketch after this list). Consequently, they are able to achieve greater accuracy on many NLP sequence labeling tasks.
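
To make the training objective concrete, here is a minimal sketch in Python/NumPy (not the paper's implementation) of the quantity a linear-chain CRF maximizes: the conditional log-likelihood log p(y | x), computed as the score of the gold label sequence minus the log partition function from the forward algorithm. The unary and transition score arrays are illustrative assumptions; in the paper, such scores are weighted sums of binary feature functions.

import numpy as np

def crf_log_likelihood(unary, transition, labels):
    """Conditional log-likelihood log p(y | x) of a linear-chain CRF.

    unary:      (T, K) scores of each of K labels at each of T positions
                (assumed precomputed; in the paper, weighted sums of features)
    transition: (K, K) scores for moving from label i to label j
    labels:     length-T list of gold label indices
    """
    T, K = unary.shape

    # Unnormalized score of the gold label path.
    score = unary[0, labels[0]]
    for t in range(1, T):
        score += transition[labels[t - 1], labels[t]] + unary[t, labels[t]]

    # Log partition function log Z(x) via the forward algorithm in log space.
    alpha = unary[0].copy()                                  # (K,)
    for t in range(1, T):
        s = alpha[:, None] + transition + unary[t][None, :]  # (K, K)
        m = s.max(axis=0)
        alpha = m + np.log(np.exp(s - m).sum(axis=0))        # logsumexp over previous label
    m = alpha.max()
    log_Z = m + np.log(np.exp(alpha - m).sum())

    return score - log_Z

Maximizing this quantity summed over the training set is exactly the conditional likelihood training described above; the paper does so with iterative scaling, while most later implementations use gradient methods, since the gradient of log Z yields model feature expectations.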

The key points of Maximum Entropy Markov Models (MEMMs), which the paper reviews and improves upon, are:

  • Instead of observations depending on states, the dependency is reversed: states are conditioned on observations. More precisely, each state transition, rather than each state, is conditioned on the current observation.
  • Each such transition is modeled by a maximum entropy (Maxent) classifier.
  • Inference can be done efficiently using a modified Viterbi algorithm, just as in HMMs (sketched after this list).
  • Training is performed using Generalized Iterative Scaling.
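
The following sketch (an assumed structure, not the paper's code) shows MEMM decoding: a table of locally normalized transition log-probabilities log P(s_t = j | s_{t-1} = i, o_t), one distribution per previous state as a per-transition Maxent classifier would produce, decoded with the standard Viterbi recursion.

import numpy as np

def memm_viterbi(log_probs):
    """Most likely state sequence under an MEMM.

    log_probs: (T, K, K) array with log_probs[t, i, j] = log P(s_t = j | s_{t-1} = i, o_t),
               each row (fixed t, i) produced by a per-transition Maxent classifier.
               At t = 0 only row i = 0 is used, as a dummy start state.
    """
    T, K, _ = log_probs.shape
    delta = log_probs[0, 0].copy()        # best log-prob of reaching each state at t = 0
    back = np.zeros((T, K), dtype=int)    # backpointers

    for t in range(1, T):
        s = delta[:, None] + log_probs[t]  # (K, K): previous state i -> current state j
        back[t] = s.argmax(axis=0)
        delta = s.max(axis=0)

    # Follow backpointers from the best final state.
    path = [int(delta.argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]

Because each classifier is normalized per source state, probability mass is split only among the transitions leaving that state; this local normalization causes the label bias problem that the paper identifies, and that CRFs avoid by normalizing globally over entire label sequences.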

MEMMs were considered state-of-the-art for certain labeling tasks, such as NER, until the introduction of Conditional Random Fields.
