Gimpel and Smith, NAACL 2010


Softmax-Margin CRFs: Training Log-Linear Models with Cost Functions

Online: [1]

Citation

Kevin Gimpel and Noah A. Smith. Softmax-margin CRFs: Training log-linear models with cost functions. In Proceedings of the Human Language Technologies Conference of the North American Chapter of the Association for Computational Linguistics, pages 733–736, Los Angeles, California, USA, June 2010.

Summary

The authors want to incorporate a cost function (as used in structured SVMs) into standard conditional log-likelihood training. They introduce the softmax-margin objective function, which achieves the best of both worlds. On an NER task, softmax-margin training performs significantly better than standard conditional log-likelihood, max-margin, and perceptron training, but is statistically indistinguishable from MIRA, risk, and JRB (the Jensen risk bound, defined in the paper).

Brief Description of the Softmax-Margin Objective Function
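
In brief, softmax-margin training takes the conditional log-likelihood objective and adds the cost function inside the log-sum-exp, so that high-cost outputs are penalized more heavily during training. Up to minor notational changes, the objective minimized over training examples (x^(i), y^(i)) is:

    \min_\theta \sum_{i=1}^{n} \Big( -\theta^\top f(x^{(i)}, y^{(i)}) + \log \sum_{y \in \mathcal{Y}(x^{(i)})} \exp\big\{ \theta^\top f(x^{(i)}, y) + \mathrm{cost}(y^{(i)}, y) \big\} \Big)

Setting the cost to zero everywhere recovers conditional log-likelihood exactly, while replacing the log-sum-exp with a max recovers the structured hinge (max-margin) objective. For CRF inference to stay tractable, the cost must decompose over the structure the same way the features do, e.g. a Hamming-style cost over label sequences.

As a minimal runnable sketch of how the cost term enters the loss, here is the unstructured multiclass case in Python; the function name and the 0/1 cost are illustrative choices, not from the paper, whose experiments compute the same quantity inside a CRF's dynamic program:

    import numpy as np
    from scipy.special import logsumexp

    def softmax_margin_loss(scores, gold, cost):
        """Softmax-margin loss for one example.

        scores : length-K array of model scores theta^T f(x, y), one per label y
        gold   : index of the gold label
        cost   : length-K array of cost(gold, y) values (zero at y = gold)
        """
        # Cost-augmented log-partition: log sum_y exp{score(y) + cost(gold, y)}
        log_z = logsumexp(scores + cost)
        # Subtracting the gold score gives the cost-augmented log-loss
        return log_z - scores[gold]

    # Three labels, gold label 0, 0/1 cost on the two wrong labels
    scores = np.array([2.0, 1.0, -0.5])
    cost = np.array([0.0, 1.0, 1.0])
    print(softmax_margin_loss(scores, gold=0, cost=cost))
    # With cost all zero, this value equals the plain conditional log-loss.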

Experimental Results

Related Work