Revision as of 17:45, 30 October 2011
Max-margin parsing, by Taskar, B., Klein, D., Collins, M., Koller, D. and Manning, C. In Proc. EMNLP, 2004.
This paper is available online [1].
Summary
This paper presents a novel approach to parsing that maximizes separating margins using SVMs. The authors show how the parsing problem can be reformulated as a discriminative task, which allows an arbitrary number of features to be used. Such a formulation also lets them incorporate a loss function that directly penalizes incorrect parse trees in proportion to how wrong they are.
Brief description of the method
Instead of a probabilistic interpretation for parse trees, we seek to find:

<math>y_i = \arg\max_{y\in\mathbf{G}(x_i)} \langle\mathbf{w}, \Phi_{i,y}\rangle</math>

for all sentences <math>x_i</math> in the training data, <math>y_i</math> being the parse tree, <math>\mathbf{G}(x_i)</math> the set of possible parses for <math>x_i</math>.
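The decoding criterion above — score each candidate parse with a linear model and return the arg max — can be sketched as follows. This is a toy illustration, not the paper's chart-based parser: the candidate parses and feature vectors are made-up values standing in for <math>\mathbf{G}(x_i)</math> and <math>\Phi_{i,y}</math>.

```python
# Toy sketch of linear arg-max decoding over a candidate set (hypothetical data).
import numpy as np

w = np.array([1.0, -0.5, 2.0])              # weight vector (assumed learned)
candidate_parses = {                         # hypothetical G(x_i): parse -> Phi(x_i, y)
    "tree_a": np.array([1.0, 0.0, 1.0]),
    "tree_b": np.array([0.0, 2.0, 0.5]),
    "tree_c": np.array([1.0, 1.0, 0.0]),
}

# Score each candidate parse with <w, Phi(x_i, y)> and keep the best one.
scores = {y: float(np.dot(w, phi)) for y, phi in candidate_parses.items()}
best = max(scores, key=scores.get)
print(best, scores[best])  # tree_a 3.0
```

In the paper this arg max is not computed by enumeration — the candidate set is exponentially large, so decoding is done with dynamic programming over the parse chart — but the scoring rule is the same.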
Formulating it as an optimization problem,

<math>\max_\gamma \{\langle\mathbf{w}, \Phi_{i,y_i}-\Phi_{i,y}\rangle \geq \gamma L_{i,y}\}\ \forall y\in\mathbf{G}(x_i),\ ||\mathbf{w}||^2 \leq 1</math>
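For a fixed unit-norm weight vector, the largest feasible margin <math>\gamma</math> is the minimum, over wrong parses, of the score gap divided by the loss <math>L_{i,y}</math>. A minimal numeric sketch, using made-up feature vectors and losses (not the paper's data):

```python
# Toy check of the loss-scaled margin constraints (hypothetical values).
import numpy as np

w_raw = np.array([1.0, -0.5, 2.0])
w = w_raw / np.linalg.norm(w_raw)        # enforce ||w||^2 <= 1

phi_gold = np.array([1.0, 0.0, 1.0])     # Phi_{i, y_i} for the true parse
wrong = {                                 # y -> (Phi_{i,y}, L_{i,y}); made-up values
    "tree_b": (np.array([0.0, 2.0, 0.5]), 2.0),
    "tree_c": (np.array([1.0, 1.0, 0.0]), 1.0),
}

# Largest gamma satisfying every constraint:
#   <w, Phi_gold - Phi_y> >= gamma * L_{i,y}  for all wrong y
gamma = min(float(np.dot(w, phi_gold - phi)) / loss for phi, loss in wrong.values())
# gamma > 0 means w separates the gold parse from every alternative.
```

Scaling the margin by <math>L_{i,y}</math> is what makes badly wrong parses (large loss) require a larger score gap than nearly correct ones.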
Related Papers
Bartlett et al NIPS 2004 used the EG (exponentiated gradient) algorithm for large-margin structured classification.