Bbd writeup of Stacked Sequential Learning W Cohen


This is a review of Cohen_2005_stacked_sequential_learning by user:bbd.

This paper proposes stacked sequential learning, a meta-learning method for sequential partitioning problems, analogous to boosting for standard classification. The paper also analyzes how the weights placed on history features can strongly affect the test error rate. In the proposed method, the true labels of surrounding tokens are replaced at training time by inferred labels, so that the training data resemble the test-time scenario; K-fold cross-validation is used to produce these inferred labels without overfitting. The authors show that stacking improves the error rates of both CRF and maximum-entropy (ME) base learners. A minimal sketch of the training and prediction procedure is given below.
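
Below is a minimal sketch of the stacked training procedure as I understand it. It assumes a single binary-labeled token sequence with feature matrix X and 0/1 label vector y, and uses scikit-learn's LogisticRegression as a stand-in base learner; the function names (extend, stacked_fit, stacked_predict) and the window parameter are illustrative, not from the paper.

<pre>
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import KFold

def extend(X, y_hat, window=1):
    """Append the (inferred) 0/1 labels of the `window` previous and next
    tokens to each token's features, padding with 0 at the boundaries."""
    n = len(y_hat)
    cols = []
    for offset in range(-window, window + 1):
        if offset == 0:
            continue
        shifted = np.zeros(n)
        if offset < 0:
            shifted[-offset:] = y_hat[:offset]   # labels of earlier tokens
        else:
            shifted[:-offset] = y_hat[offset:]   # labels of later tokens
        cols.append(shifted)
    return np.hstack([X, np.column_stack(cols)])

def stacked_fit(X, y, k=5, window=1):
    # Cross-validated predictions stand in for the true neighbouring labels,
    # so history features seen in training look like those seen at test time.
    y_hat = np.zeros(len(y))
    for train_idx, heldout_idx in KFold(n_splits=k).split(X):
        base = LogisticRegression(max_iter=1000).fit(X[train_idx], y[train_idx])
        y_hat[heldout_idx] = base.predict(X[heldout_idx])
    # Retrain the base learner on all data; train the stacked learner on
    # features extended with the inferred labels of neighbouring tokens.
    base = LogisticRegression(max_iter=1000).fit(X, y)
    stacked = LogisticRegression(max_iter=1000).fit(extend(X, y_hat, window), y)
    return base, stacked

def stacked_predict(base, stacked, X, window=1):
    y_hat = base.predict(X)
    return stacked.predict(extend(X, y_hat, window))
</pre>

At test time, stacked_predict first runs the base learner to obtain inferred neighbour labels and then applies the stacked learner to the extended features, mirroring how the training-time features were built.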

I have one question about the "Min Error" metric. It is computed by finding the best probability threshold on the test data, so isn't this effectively learning on the test set? My reading of the computation is sketched below.
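
This is only my interpretation of the metric, not code from the paper; the helper name min_error is hypothetical. The point is that the threshold that minimizes the error is chosen using the test labels themselves.

<pre>
import numpy as np

def min_error(scores, y_true):
    """Lowest achievable error rate over all probability thresholds,
    where the threshold is selected using the test labels themselves."""
    thresholds = np.unique(scores)
    errors = [np.mean((scores >= t).astype(int) != y_true) for t in thresholds]
    return min(errors)
</pre>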