10-601 Ensembles 2

=== Slides ===

* [http://www.cs.cmu.edu/~wcohen/10-601/ensembles2.ppt Slides in PowerPoint].
* Margin "movie" I showed in class: [http://www.cs.cmu.edu/~wcohen/10-601/margin-...]
 
 
=== Readings ===

=== Summary ===
  
You should know how to implement these ensemble methods, and what their relative advantages and disadvantages are (a minimal bagging sketch follows the list):
* Bagging
* Boosting
* Stacking
* Multilevel Stacking
* The "bucket of models" classifier
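
To make the first item concrete, here is a minimal bagging sketch in Python. It is illustrative only, not the course's reference implementation; the scikit-learn decision-tree base learner and the helper names (<tt>bagging_fit</tt>, <tt>bagging_predict</tt>) are choices made for this example.

<pre>
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def bagging_fit(X, y, n_estimators=25, seed=0):
    """Fit n_estimators decision trees, each on a bootstrap resample of (X, y)."""
    rng = np.random.default_rng(seed)
    n = len(X)
    models = []
    for _ in range(n_estimators):
        idx = rng.integers(0, n, size=n)              # draw n indices with replacement
        models.append(DecisionTreeClassifier().fit(X[idx], y[idx]))
    return models

def bagging_predict(models, X):
    """Majority vote over ensemble members (assumes non-negative integer labels)."""
    votes = np.stack([m.predict(X) for m in models])  # shape: (n_estimators, n_examples)
    return np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, votes)
</pre>

Each member sees a slightly different resample of the training data, so averaging their votes mainly reduces variance; that is the usual argument for why bagging helps unstable learners such as deep decision trees.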
You should understand the basic intuitions behind the analysis of boosting:
* As reducing an upper bound on error and hence fitting the training data
* As a coordinate descent optimization of the same upper bound
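
A sketch of where these two bullets come from, using standard AdaBoost notation assumed here rather than taken from the slides: write the boosted score as <math>f(x)=\sum_t \alpha_t h_t(x)</math> with final classifier <math>H(x)=\mathrm{sign}(f(x))</math>. Then the training error is bounded by the exponential loss,

<math>
\frac{1}{m}\sum_{i=1}^{m}\mathbf{1}\{H(x_i)\neq y_i\}
\;\le\;
\frac{1}{m}\sum_{i=1}^{m} e^{-y_i f(x_i)}
\;=\;
\prod_{t=1}^{T} Z_t,
</math>

where <math>Z_t</math> is the normalizer of the weight update at round <math>t</math>. Driving this bound down forces the training data to be fit (first bullet), and choosing each <math>\alpha_t</math> to minimize <math>Z_t</math> is greedy coordinate descent on the same exponential-loss bound, one weak-hypothesis weight at a time (second bullet).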
You should also be aware that boosting is related to margin classifiers.
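
One way to make that connection concrete (again a sketch, with the same assumed notation): the normalized margin of the boosted classifier on a training example <math>(x_i, y_i)</math> is

<math>
\operatorname{margin}(x_i, y_i) \;=\; \frac{y_i \sum_t \alpha_t h_t(x_i)}{\sum_t |\alpha_t|} \;\in\; [-1, 1].
</math>

Boosting tends to keep increasing these margins even after the training error reaches zero, which is the sense in which it behaves like a large-margin classifier and one explanation for why its test error can keep improving after the training data are fit.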
 
