Liuliu writeup of Bunescu 2005

From Cohen Courses

Latest revision as of 10:42, 3 September 2010

This is a review of Bunescu_2005_a_shortest_path_dependency_kernel_for_relation_extraction by user:Liuliu.

This paper is about extracting top-level relations using a kernel that computes the dot product of feature vectors derived from the shortest dependency path between the two entities.

This method represents each shortest dependency path by a feature vector that encodes rich syntactic and semantic knowledge. Local and non-local dependencies are extracted from CCG and CFG parses. To avoid the high dimensionality of an explicit feature-space representation, it uses a kernel function to compute the dot product between feature vectors directly.
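To make the kernel concrete, here is a minimal sketch of the shortest-path kernel as described in the paper: each position on the path holds a set of features (the word plus generalizations such as its POS tag or entity class, or a dependency-direction arrow), paths of different lengths get kernel value 0, and otherwise the kernel is the product over positions of the number of shared features. The example paths and feature sets below are illustrative assumptions, not taken verbatim from the paper.

```python
from math import prod

def sp_kernel(path_x, path_y):
    """Shortest-path dependency kernel (after Bunescu & Mooney, 2005).

    Each path is a list of feature sets.  Word positions alternate with
    dependency-direction positions ('<-' / '->').  Paths of different
    lengths are treated as maximally dissimilar (kernel = 0); otherwise
    the kernel is the product, over positions, of the number of features
    the two paths share at that position.
    """
    if len(path_x) != len(path_y):
        return 0
    return prod(len(set(a) & set(b)) for a, b in zip(path_x, path_y))

# Hypothetical feature sets for two paths of the same shape:
#   protesters <- seized -> stations   vs.   troops <- raided -> churches
x = [{"protesters", "NNS", "Noun", "PERSON"}, {"<-"},
     {"seized", "VBD", "Verb"}, {"->"},
     {"stations", "NNS", "Noun", "FACILITY"}]
y = [{"troops", "NNS", "Noun", "PERSON"}, {"<-"},
     {"raided", "VBD", "Verb"}, {"->"},
     {"churches", "NNS", "Noun", "FACILITY"}]

print(sp_kernel(x, y))  # 3 * 1 * 2 * 1 * 3 = 18
```

Because the feature count at each position is itself a (trivial) kernel, the product is a valid kernel and can be plugged into an SVM without ever materializing the exponentially large feature vector.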

Other points:

  • They said that features only partially correlated with the labels lead to overfitting. By restricting themselves to the shortest path, they might avoid overfitting, since they only use features of that path. But using an SVM may be another reason overfitting is avoided, as max-margin methods have been shown to resist overfitting.
  • I didn't quite understand what "opaque" means. They said that each path feature corresponds to a dimension in the Hilbert space; I didn't quite understand what this means or what its benefit is.
    • This wasn't too clear - I think they are claiming that these kernels are more understandable than tree kernels to humans, so it's easier to (say) tune the classifier or do additional feature engineering. Wcohen
  • According to the results, the biggest improvement is in recall, but why?