10-601 Ensembles

This is a lecture used in the Syllabus for Machine Learning 10-601 in Fall 2014.

=== Slides ===

* Ziv's lecture: [http://www.cs.cmu.edu/~zivbj/classF14/boosting.pdf Slides in pdf].
* William's [http://www.cs.cmu.edu/~wcohen/10-601/ensembles.ppt Slides in PowerPoint], [http://www.cs.cmu.edu/~wcohen/10-601/ensembles.pdf Slides in PDF].
* Margin "movie" I showed in class: [http://www.cs.cmu.edu/~wcohen/10-601/margin-movie.pdf Margin movie].

=== Readings ===

=== Summary ===

You should understand how these ensemble methods work (a minimal bagging sketch follows the list):

* Bagging
* Stacking
* Boosting
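
Since the summary only names the methods, a minimal sketch of bagging may help (my illustration, not code from the lecture; decision stumps stand in for an arbitrary base learner): train each base learner on a bootstrap resample of the data, then combine their predictions by majority vote.

<pre>
import numpy as np

def fit_stump(X, y):
    # Exhaustive search for the decision stump (feature, threshold, sign)
    # with the lowest 0/1 training error; labels y are in {-1, +1}.
    best, best_err = None, np.inf
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for sign in (1, -1):
                pred = sign * np.where(X[:, j] <= thr, 1, -1)
                err = np.mean(pred != y)
                if err < best_err:
                    best_err, best = err, (j, thr, sign)
    return best

def stump_predict(stump, X):
    j, thr, sign = stump
    return sign * np.where(X[:, j] <= thr, 1, -1)

def bagging_fit(X, y, n_estimators=25, seed=0):
    # Bagging: train each base learner on a bootstrap resample,
    # i.e. n rows drawn with replacement from the n training rows.
    rng = np.random.default_rng(seed)
    n = len(y)
    stumps = []
    for _ in range(n_estimators):
        idx = rng.integers(0, n, size=n)
        stumps.append(fit_stump(X[idx], y[idx]))
    return stumps

def bagging_predict(stumps, X):
    # Combine the base learners by unweighted majority vote.
    votes = np.sum([stump_predict(s, X) for s in stumps], axis=0)
    return np.where(votes >= 0, 1, -1)

# Usage on synthetic data:
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)
print("train accuracy:", np.mean(bagging_predict(bagging_fit(X, y), X) == y))
</pre>

Stacking differs only in the combination step: instead of a fixed vote, a second-level learner is trained on the base learners' (held-out) predictions.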

You should understand the basic intuitions behind the analysis of boosting (made concrete in the math block after the list):

* As reducing an upper bound on the training error, and hence fitting the training data.
* As a coordinate-descent optimization of the same upper bound.
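
For concreteness, here is the standard form of that bound (the usual AdaBoost analysis; the notation is mine and may differ from the slides'). With voted classifier <math>f(x) = \sum_t \alpha_t h_t(x)</math> and labels <math>y_i \in \{-1, +1\}</math>, the 0/1 training error satisfies

<math>\frac{1}{m}\sum_{i=1}^{m}\mathbf{1}\big[y_i \neq \operatorname{sign} f(x_i)\big] \;\le\; \frac{1}{m}\sum_{i=1}^{m} e^{-y_i f(x_i)} \;=\; \prod_t Z_t,</math>

since <math>\mathbf{1}[z \le 0] \le e^{-z}</math>. Each round of AdaBoost picks one weak learner <math>h_t</math> (a coordinate) and a weight <math>\alpha_t</math> (a step size) that minimize the round's normalizer <math>Z_t</math>, so the algorithm can be read as greedy coordinate descent on this exponential-loss bound; driving the bound toward zero drives the training error down with it.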

You should also be aware that boosting is related to margin classifiers.
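
One standard way to make that connection concrete (from the boosting-the-margin literature, not necessarily the lecture's notation): the normalized margin of the voted classifier on example <math>(x_i, y_i)</math> is

<math>\operatorname{margin}(x_i, y_i) \;=\; \frac{y_i \sum_t \alpha_t h_t(x_i)}{\sum_t \alpha_t},</math>

which lies in <math>[-1, 1]</math> and is positive exactly when the example is classified correctly. Schapire et al. showed that AdaBoost tends to keep increasing these training margins even after the training error reaches zero, which links it to large-margin methods such as SVMs.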