Dietterich 2008 gradient tree boosting for training conditional random fields

From Cohen Courses
== Citation ==
 
{{MyCitejournal | coauthors = G. Hao, A. Ashenfelter| date = 2008| first = T. G| journal = Journal of Machine Learning Research| last = Dietterich| pages = 2113-2139| title = Gradient Tree Boosting for Training Conditional Random Fields| url = http://jmlr.csail.mit.edu/papers/volume9/dietterich08a/dietterich08a.pdf| volume = 9 }}
 
== Online Version ==
  
 
This [[Category::Paper]] is available online [http://jmlr.csail.mit.edu/papers/volume9/dietterich08a/dietterich08a.pdf].
 
  
== Summary ==
The paper addresses the combinatorial explosion in the number of CRF parameters that occurs when new features are introduced. Instead of maintaining one weight per feature, it represents the potential functions as sums of regression trees, trained by functional gradient boosting. The authors argue that each added regression tree corresponds to a large step in feature space, so far fewer training iterations are needed than with standard gradient methods, which yields a significant improvement in training speed.
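The idea described above can be illustrated with a minimal sketch of functional gradient boosting with regression trees. This is a hypothetical simplification, not the paper's algorithm: it uses binary logistic loss on i.i.d. data rather than a full sequence CRF, and the dataset, learning rate, and tree depth are illustrative choices. Each round fits a small regression tree to the functional gradient (the residual <code>y - p</code>) and adds it to the potential function:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeRegressor

# Hypothetical toy data; the paper itself works on structured sequence tasks.
X, y = make_classification(n_samples=400, random_state=0)

F = np.zeros(len(y))   # potential function F(x), built up as a sum of trees
trees, lr = [], 0.5    # illustrative learning rate
for _ in range(20):
    p = 1.0 / (1.0 + np.exp(-F))   # current model estimate of P(y=1 | x)
    residual = y - p               # functional gradient of the log-likelihood
    tree = DecisionTreeRegressor(max_depth=3, random_state=0)
    tree.fit(X, residual)          # one tree approximates the whole gradient
    trees.append(tree)
    F += lr * tree.predict(X)      # one tree = one large step in feature space

train_acc = np.mean((F > 0) == y)
```

Because every round updates the potential along an entire regression tree rather than a single feature weight, the training fit improves quickly over just a few iterations, which is the speedup the authors claim.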
  
 
== Reviews of this paper ==
 
 
{{#ask: [[reviewed paper::dietterich_2008_gradient_tree_boosting_for_training_conditional_random_fields]] | ?reviewer}}
 

Revision as of 19:59, 30 September 2011
