10-601 Ensembles


This is a lecture used in the Syllabus for Machine Learning 10-601 in Fall 2014.

Slides

  • Slides in PowerPoint: http://www.cs.cmu.edu/~wcohen/10-601/ensembles.ppt

Readings

Summary

You should understand how these ensemble methods work (a minimal bagging sketch follows the list below):

  • Bagging
  • Stacking
  • Boosting
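
To make the first item concrete, here is a minimal bagging sketch (not from the lecture slides); it assumes scikit-learn and NumPy are available, and the synthetic dataset, number of rounds, and tree depth are illustrative choices only.

  # Bagging: fit each base learner on a bootstrap resample of the training
  # set, then combine the learners by majority vote.
  import numpy as np
  from sklearn.datasets import make_classification
  from sklearn.tree import DecisionTreeClassifier

  rng = np.random.default_rng(0)
  X, y = make_classification(n_samples=500, random_state=0)  # toy data, labels in {0, 1}

  n_rounds = 25
  models = []
  for _ in range(n_rounds):
      idx = rng.integers(0, len(X), size=len(X))        # sample with replacement
      models.append(DecisionTreeClassifier(max_depth=3).fit(X[idx], y[idx]))

  # Majority vote: average the 0/1 predictions and threshold at 1/2.
  votes = np.stack([m.predict(X) for m in models])      # shape (n_rounds, n_samples)
  y_hat = (votes.mean(axis=0) >= 0.5).astype(int)
  print("bagged training accuracy:", (y_hat == y).mean())

Because each tree is fit on a different resample, the individual trees disagree, and averaging their votes reduces variance relative to a single tree.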

You should understand the basic intuitions behind the analysis of boosting (a sketch of the relevant bound follows the list):

  • As reducing an upper bound on the training error, and hence fitting the training data.
  • As a coordinate descent optimization of the same upper bound.
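
A sketch of the standard bound behind both items (AdaBoost-style analysis; the notation is ours, not the slides'): write the combined classifier as a weighted vote f(x) = \sum_t \alpha_t h_t(x) over weak hypotheses h_t, with labels y_i in {-1, +1}, and let D_t be the distribution over training examples at round t (with D_1(i) = 1/m). Then the training error of sign(f) is bounded by the exponential loss:

  \frac{1}{m}\sum_{i=1}^{m} \mathbf{1}\!\left[ y_i \neq \operatorname{sign}(f(x_i)) \right]
  \;\le\; \frac{1}{m}\sum_{i=1}^{m} e^{-y_i f(x_i)}
  \;=\; \prod_{t} Z_t,
  \qquad
  Z_t = \sum_{i} D_t(i)\, e^{-\alpha_t y_i h_t(x_i)} .

Each boosting round fixes the weights chosen so far and picks the new \alpha_t to minimize Z_t, i.e., it performs coordinate descent on this upper bound; since the bound dominates the training error, driving it toward zero drives the training error toward zero as well.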

You should also be aware that boosting is related to margin classifiers.