Revision as of 17:40, 30 October 2011
Max-margin parsing, by Taskar, B., Klein, D., Collins, M., Koller, D. and Manning, C. In Proc. EMNLP, 2004.
This paper is available online [1].
Summary
This paper presents a novel approach to parsing that maximizes separating margins using SVMs. The authors show how to reformulate parsing as a discriminative task, which allows an arbitrary number of features to be used. This formulation also lets them incorporate a loss function that directly penalizes incorrect parse trees.
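Concretely, the max-margin idea requires the correct parse to outscore every alternative parse by a margin that grows with the loss of that alternative (up to slack variables for non-separable data):

<math>\langle\mathbf{w}, \Phi(x_i,y_i)\rangle - \langle\mathbf{w}, \Phi(x_i,y)\rangle \ge L(y_i,y) \quad \forall y \in \mathbf{G}(x_i)</math>

where <math>L(y_i,y)</math> is a loss on parse trees, e.g. the number of constituents in <math>y</math> that do not appear in the correct tree <math>y_i</math>, so that worse parses must be separated by a larger margin.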
Brief description of the method
Instead of a probabilistic interpretation for parse trees, we seek to find:

<math>y_i=\arg\max_{y\in\mathbf{G}(x_i)} \langle\mathbf{w}, \Phi(x_i,y)\rangle</math>

for all sentences <math>x_i</math> in the training data, <math>y_i</math> being the correct parse tree and <math>\mathbf{G}(x_i)</math> the set of possible parses for <math>x_i</math>.
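The decoding rule above can be sketched as an argmax of a linear score over candidate parses. This is a toy illustration only (the paper actually decodes with dynamic programming over a grammar, not an explicit candidate list); the feature names and the <code>candidates</code> structure are invented for the example:

```python
def score(w, phi):
    """Inner product <w, Phi(x, y)> between a sparse weight vector
    and a sparse feature vector, both stored as dicts."""
    return sum(w.get(f, 0.0) * v for f, v in phi.items())

def argmax_parse(w, candidates):
    """Return the (tree, features) pair with the highest linear score,
    i.e. argmax_{y in G(x)} <w, Phi(x, y)> over an explicit candidate set."""
    return max(candidates, key=lambda pair: score(w, pair[1]))

# Toy example: two candidate parses, features are rule counts (hypothetical).
w = {"NP->DT NN": 1.0, "VP->V NP": 0.5, "NP->NN NN": -0.2}
candidates = [
    ("tree_A", {"NP->DT NN": 1, "VP->V NP": 1}),   # score 1.5
    ("tree_B", {"NP->NN NN": 1, "VP->V NP": 1}),   # score 0.3
]
best_tree, _ = argmax_parse(w, candidates)         # -> "tree_A"
```

With a learned <math>\mathbf{w}</math>, the same argmax is what the training objective targets: the weights are chosen so that this decoder returns the correct tree for each training sentence.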
Related Papers
Bartlett et al NIPS 2004 used the exponentiated gradient (EG) algorithm for large-margin structured classification.