Roth and Yih, ICML 2005. Integer Linear Programming Inference for Conditional Random Fields


Citation

Dan Roth and Wen-tau Yih. 2005. Integer Linear Programming Inference for Conditional Random Fields. In Proceedings of the 22nd International Conference on Machine Learning (ICML '05), New York, NY, USA.


Online Version

Online version

Summary

This paper presents an alternative approach to inference in conditional random fields using integer linear programming (ILP). The standard Viterbi algorithm, based on dynamic programming, cannot in general efficiently incorporate non-local and non-sequential constraints over the output sequence. The authors propose an ILP-based inference procedure and extend CRF models to naturally and efficiently support general constraint structures. For sequential constraints, the procedure reduces to simple linear programming during inference.

Method

The efficiency of the CRF approach heavily depends on its Markov property: given the observation, the label of a token is assumed to depend only on the labels of its adjacent tokens. Under this Markov assumption it is difficult to explicitly model and exploit more general constraints, such as long-distance dependencies. This is a problem not only for training CRFs but also for incorporating additional constraints during inference. The Viterbi algorithm can handle some of these constraints, but not more general ones; the ILP-based method proposed here can incorporate such constraints efficiently.

Conditional Random Field

A standard CRF model is represented by:

<math> p_{\lambda}(y|x) = \frac{\exp(\lambda \cdot F(y,x))}{Z_{\lambda}(x)} </math>

where <math> F(y,x) </math> is the global feature vector, <math> \lambda </math> is the weight vector, and <math> Z_{\lambda}(x) = \sum_{y'} \exp(\lambda \cdot F(y',x)) </math> is a normalization factor.

A CRF is trained by maximizing the conditional log-likelihood of a given training set <math> \{(x_k, y_k)\} </math>:

<math> L_{\lambda} = \sum_{k} \log p_{\lambda}(y_k | x_k) = \sum_{k} \left[ \lambda \cdot F(y_k, x_k) - \log Z_{\lambda}(x_k) \right] </math>
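As a concrete illustration of these definitions (not taken from the paper), the following Python sketch computes the conditional log-likelihood by brute-force enumeration of the partition function <math> Z_{\lambda}(x) </math>; the emission/transition feature design, dimensions, and function names are assumptions made for the example.

<pre>
import itertools
import math

import numpy as np

def global_features(y, x, n_labels, n_obs):
    """Return a simple global feature vector F(y, x): counts of
    label/observation pairs plus label-to-label transitions."""
    emit = np.zeros((n_labels, n_obs))
    trans = np.zeros((n_labels, n_labels))
    for i, label in enumerate(y):
        emit[label, x[i]] += 1.0
        if i > 0:
            trans[y[i - 1], label] += 1.0
    return np.concatenate([emit.ravel(), trans.ravel()])

def log_likelihood(lmbda, x, y, n_labels, n_obs):
    """log p_lambda(y | x) = lambda . F(y, x) - log Z_lambda(x),
    with Z computed by enumerating all label sequences
    (only feasible for tiny examples)."""
    score = lmbda @ global_features(y, x, n_labels, n_obs)
    log_z = math.log(sum(
        math.exp(lmbda @ global_features(y_prime, x, n_labels, n_obs))
        for y_prime in itertools.product(range(n_labels), repeat=len(x))
    ))
    return score - log_z

# Toy usage: 2 labels, 3 observation symbols, a length-3 sequence.
n_labels, n_obs = 2, 3
rng = np.random.default_rng(0)
lmbda = rng.normal(size=n_labels * n_obs + n_labels * n_labels)
x, y = [0, 2, 1], [0, 1, 1]
print(log_likelihood(lmbda, x, y, n_labels, n_obs))
</pre>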


Inference Using ILP

Inference in CRFs is usually performed by the Viterbi algorithm, which finds the shortest (best-scoring) path through the trellis from the start of the sequence to the end. The weights on the edges along the path determine the global cost of the path; each edge weight, or score, is obtained from a linear combination of the feature functions with the weight vector.
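A minimal sketch of this dynamic-programming view follows; the edge-score interface and the toy score table are assumptions for illustration, not the paper's notation. Each trellis edge carries a score <math> \lambda \cdot f(y_{i-1}, y_i, x, i) </math>, and Viterbi keeps the best-scoring path prefix for each label at each position.

<pre>
import numpy as np

def viterbi(edge_score, n_labels, length):
    """Find the highest-scoring label path through a sequence trellis.

    edge_score(i, prev, cur) -> score of labelling position i with `cur`
    after labelling position i-1 with `prev` (prev is None at position 0).
    """
    best = np.array([edge_score(0, None, c) for c in range(n_labels)])
    back = []
    for i in range(1, length):
        scores = np.array([[best[p] + edge_score(i, p, c) for c in range(n_labels)]
                           for p in range(n_labels)])
        back.append(scores.argmax(axis=0))   # best previous label for each label
        best = scores.max(axis=0)
    # Recover the path by following back-pointers from the best final label.
    path = [int(best.argmax())]
    for pointers in reversed(back):
        path.append(int(pointers[path[-1]]))
    return list(reversed(path)), float(best.max())

# Toy usage with 2 labels, length 4, and random edge scores.
rng = np.random.default_rng(1)
table = rng.normal(size=(4, 3, 2))  # (position, prev-label + 1, current label)
score = lambda i, p, c: table[i, 0 if p is None else p + 1, c]
print(viterbi(score, n_labels=2, length=4))
</pre>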

This approach is formulated as an ILP as follows:

Given a directed graph <math> G = (V,E) </math>, two distinct nodes <math> s, t \in V </math>, and a non-negative cost <math> c_{uv} </math> for each edge <math> (u,v) \in E </math>, a minimum-cost path from <math> s </math> to <math> t </math> is desired.
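A sketch of this formulation using an off-the-shelf LP solver (SciPy's linprog); the graph, costs, and helper name are illustrative assumptions. Each 0/1 edge variable <math> x_{uv} </math> indicates whether edge <math> (u,v) </math> lies on the path, the objective is <math> \sum_{(u,v)} c_{uv} x_{uv} </math>, and flow-conservation equalities push one unit of flow from <math> s </math> to <math> t </math>. Because this constraint matrix is totally unimodular, the LP relaxation already has an integral optimum, which is why the purely sequential case reduces to linear programming.

<pre>
import numpy as np
from scipy.optimize import linprog

def shortest_path_lp(nodes, edges, cost, s, t):
    """Solve the shortest s-t path as a linear program:
        min  sum_{(u,v)} c_uv * x_uv
        s.t. (flow out of n) - (flow into n) = 1 if n == s, -1 if n == t, else 0
             0 <= x_uv <= 1
    """
    idx = {e: i for i, e in enumerate(edges)}
    c = np.array([cost[e] for e in edges])
    a_eq = np.zeros((len(nodes), len(edges)))
    b_eq = np.zeros(len(nodes))
    for (u, v), i in idx.items():
        a_eq[nodes.index(u), i] += 1.0   # edge leaves u
        a_eq[nodes.index(v), i] -= 1.0   # edge enters v
    b_eq[nodes.index(s)], b_eq[nodes.index(t)] = 1.0, -1.0
    res = linprog(c, A_eq=a_eq, b_eq=b_eq, bounds=(0, 1), method="highs")
    return [e for e in edges if res.x[idx[e]] > 0.5], res.fun

# Toy trellis-like graph: s -> a -> t and s -> b -> t.
nodes = ["s", "a", "b", "t"]
edges = [("s", "a"), ("s", "b"), ("a", "t"), ("b", "t")]
cost = {("s", "a"): 1.0, ("s", "b"): 2.0, ("a", "t"): 3.0, ("b", "t"): 1.0}
print(shortest_path_lp(nodes, edges, cost, "s", "t"))
</pre>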

Experiments and Results

Related Papers