10-601 Ensembles

This is a lecture used in the [[Syllabus for Machine Learning 10-601B in Spring 2016]].

=== Slides ===

=== Readings ===

* Murphy 16.4, 16.6

Other optional readings:

* [http://dl.acm.org/citation.cfm?id=743935 Ensemble Methods in Machine Learning], Tom Dietterich
* [http://cseweb.ucsd.edu/~yfreund/papers/IntroToBoosting.pdf A Short Introduction to Boosting], Yoav Freund and Robert Schapire

=== Summary ===

You should understand how these ensemble methods work (a small code sketch follows the list):

* Bagging
* Stacking
* Boosting
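
As a rough illustration, here is a minimal bagging sketch in Python. It is not code from the lecture: the helper names (<code>bagging_fit</code>, <code>bagging_predict</code>), the scikit-learn decision-tree base learner, and the 0/1-label majority vote are all assumptions made for this example.

<pre>
# Minimal bagging sketch (hypothetical example, not from the lecture).
# Assumes binary labels y in {0, 1} and scikit-learn trees as base learners.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def bagging_fit(X, y, n_models=25, seed=0):
    """Train n_models trees, each on a bootstrap resample of (X, y)."""
    rng = np.random.default_rng(seed)
    n = len(X)
    models = []
    for _ in range(n_models):
        idx = rng.integers(0, n, size=n)  # draw n rows with replacement
        models.append(DecisionTreeClassifier().fit(X[idx], y[idx]))
    return models

def bagging_predict(models, X):
    """Majority vote over the individual trees' predictions."""
    votes = np.stack([m.predict(X) for m in models])  # (n_models, n_examples)
    return (votes.mean(axis=0) > 0.5).astype(int)
</pre>

Because each tree sees a different bootstrap sample, the majority vote averages away much of the variance of any single tree; stacking replaces the fixed vote with a second-level learner trained on the base models' outputs, and boosting instead reweights the training examples between rounds.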

You should understand the basic intuitions behind the analysis of boosting (the bound is sketched after this list):

* As reducing an upper bound on error, and hence fitting the training data.
* As a coordinate descent optimization of the same upper bound.
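
To make the bound concrete (following the Freund and Schapire reading above; the notation here is this sketch's assumption): with labels <math>y_i \in \{-1,+1\}</math>, weak hypotheses <math>h_t</math>, and combined classifier <math>H(x) = \mathrm{sign}\big(\sum_{t=1}^{T} \alpha_t h_t(x)\big)</math>, AdaBoost's 0/1 training error is upper-bounded by the exponential loss:

<math>
\frac{1}{n}\sum_{i=1}^{n}\mathbf{1}\!\left[y_i \neq H(x_i)\right]
\;\le\;
\frac{1}{n}\sum_{i=1}^{n}\exp\!\left(-y_i\sum_{t=1}^{T}\alpha_t h_t(x_i)\right)
\;=\;
\prod_{t=1}^{T} Z_t,
</math>

where <math>Z_t</math> is the normalizer of the round-<math>t</math> weight update and <math>\epsilon_t</math> is the weighted error of <math>h_t</math>. AdaBoost's choice <math>\alpha_t = \tfrac{1}{2}\ln\frac{1-\epsilon_t}{\epsilon_t}</math> minimizes each <math>Z_t</math> in turn with the earlier coefficients held fixed (the coordinate descent view), giving <math>Z_t = 2\sqrt{\epsilon_t(1-\epsilon_t)} \le 1</math>; if every <math>\epsilon_t \le \tfrac{1}{2} - \gamma</math>, the bound decays exponentially in <math>T</math>.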

You should also be aware that boosting is related to margin classifiers.