Roth and Yih, ICML 2005. Integer Linear Programming Inference for Conditional Random Fields

Citation

Dan Roth and Wen-tau Yih. 2005. Integer Linear Programming Inference for Conditional Random Fields. In Proceedings of the 22nd International Conference on Machine Learning (ICML '05), New York, NY, USA.


Online Version

Summary

This paper presents an alternative approach to inference in conditional random fields (CRFs) using integer linear programming (ILP). The standard Viterbi algorithm, based on dynamic programming, cannot in general efficiently incorporate non-local and non-sequential constraints over the output sequence. The authors propose an ILP-based inference procedure and extend CRF models to naturally and efficiently support general constraint structures. When only sequential constraints are present, the procedure reduces to solving a simple linear program.

Method

The efficiency of the CRF approach depends heavily on its Markov property: given the observation, the label of a token is assumed to depend only on the labels of its adjacent tokens. Under this assumption it is difficult to explicitly model and exploit more general constraints, such as long-distance dependencies over the output sequence. This is a problem not only when training CRFs but also when additional constraints must be enforced during inference. The Viterbi algorithm can handle some of these constraints, but not more general ones. The proposed ILP formulation can take such constraints into account more efficiently, as sketched below.
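To make this concrete, here is a minimal sketch, not taken from the paper, that casts plain Viterbi decoding over a linear chain as an integer linear program with one binary indicator per lattice edge and flow-conservation constraints; non-sequential constraints would simply be added as extra linear rows over the same variables. The random emission and transition scores and the use of SciPy's milp solver are assumptions made here for illustration.

<pre>
# Sketch: Viterbi decoding for a linear chain posed as an ILP.
# Requires SciPy >= 1.9 for scipy.optimize.milp. Scores are toy values.
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

n, K = 4, 3                                # sequence length, number of labels
rng = np.random.default_rng(0)
emit = rng.normal(size=(n, K))             # score of label y at position i
trans = rng.normal(size=(K, K))            # score of label pair (y, y2)

# One binary variable per lattice edge: z[i, y, y2] = 1 iff position i has
# label y and position i + 1 has label y2. Flattened into a single vector.
def idx(i, y, y2):
    return (i * K + y) * K + y2

nvar = (n - 1) * K * K
c = np.zeros(nvar)
for i in range(n - 1):
    for y in range(K):
        for y2 in range(K):
            # Edge score: transition plus emission of the right endpoint;
            # the emission of position 0 is folded into the first layer.
            c[idx(i, y, y2)] = trans[y, y2] + emit[i + 1, y2]
            if i == 0:
                c[idx(i, y, y2)] += emit[0, y]

A, lb, ub = [], [], []
# Exactly one edge leaves position 0.
row = np.zeros(nvar)
for y in range(K):
    for y2 in range(K):
        row[idx(0, y, y2)] = 1.0
A.append(row); lb.append(1.0); ub.append(1.0)
# Flow conservation: an edge entering node (i, y) must be matched by one
# edge leaving it, for every interior position i.
for i in range(1, n - 1):
    for y in range(K):
        row = np.zeros(nvar)
        for y2 in range(K):
            row[idx(i - 1, y2, y)] += 1.0   # incoming edges
            row[idx(i, y, y2)] -= 1.0       # outgoing edges
        A.append(row); lb.append(0.0); ub.append(0.0)

res = milp(c=-c,                            # milp minimizes, so negate scores
           constraints=LinearConstraint(np.array(A), lb, ub),
           integrality=np.ones(nvar),
           bounds=Bounds(0, 1))

# Read the label sequence off the chosen edges.
z = res.x.round().astype(int)
labels = []
for i in range(n - 1):
    for y in range(K):
        for y2 in range(K):
            if z[idx(i, y, y2)]:
                if i == 0:
                    labels.append(y)
                labels.append(y2)
print("ILP decoding:", labels)
</pre>

With only these sequential (flow) constraints the LP relaxation of this program already has an integral optimum, which is why inference reduces to plain linear programming in that case.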

Conditional Random Field

A standard CRF model is represented by:

<math>
P_{\lambda}(y|x) = \frac{1}{Z_{\lambda}(x)} \exp\left(\lambda \cdot F(y,x)\right)
</math>

where <math>F(y,x)</math> is the global feature vector, <math>\lambda</math> is the weight vector, and <math>Z_{\lambda}(x) = \sum_{y'} \exp\left(\lambda \cdot F(y',x)\right)</math> is a normalization factor.
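As a quick numerical illustration of this distribution, the sketch below computes <math>P_{\lambda}(y|x)</math> by brute force, enumerating every label sequence to form the normalization factor <math>Z_{\lambda}(x)</math>. The toy feature map, the label and observation alphabet sizes, and the random weight vector are assumptions made here for illustration; the paper does not prescribe a particular feature set.

<pre>
# Brute-force evaluation of P_lambda(y|x) for a toy linear-chain CRF.
import itertools
import numpy as np

def F(y, x, K=3, V=2):
    """Toy global feature vector: counts of (observation, label) pairs
    and of adjacent label pairs, concatenated."""
    emit = np.zeros((V, K))
    trans = np.zeros((K, K))
    for t, (xt, yt) in enumerate(zip(x, y)):
        emit[xt, yt] += 1
        if t > 0:
            trans[y[t - 1], yt] += 1
    return np.concatenate([emit.ravel(), trans.ravel()])

def crf_prob(y, x, lam, K=3):
    # P_lambda(y|x) = exp(lam . F(y,x)) / Z_lambda(x), with Z_lambda(x)
    # computed by summing over every possible label sequence y'.
    score = np.exp(lam @ F(y, x))
    Z = sum(np.exp(lam @ F(yp, x))
            for yp in itertools.product(range(K), repeat=len(x)))
    return score / Z

x = [0, 1, 1, 0]                                   # a toy observation sequence
lam = np.random.default_rng(0).normal(size=2 * 3 + 3 * 3)
print(crf_prob((0, 1, 2, 0), x, lam))
</pre>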

Experiments and Results

Related Papers