10-601B Boosting and Other Ensembles

This is a lecture used in the Syllabus for Machine Learning 10-601B in Spring 2016.

Slides

  • ...

Readings

  • Murphy 16.4, 16.6

Other optional readings:

Summary

You should understand how these ensemble methods work (a small bagging sketch is given after the list):

  • Bagging
  • Stacking
  • Boosting
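
As a concrete anchor for the first of these, here is a minimal bagging sketch: train base learners on bootstrap resamples of the data, then predict by majority vote. It assumes scikit-learn decision trees as the base learner; the helper names (`bagging_fit`, `bagging_predict`) are illustrative, not from the course materials.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def bagging_fit(X, y, n_estimators=25, seed=0):
    # Train each base learner on a bootstrap sample (drawn with replacement).
    rng = np.random.default_rng(seed)
    n = len(X)
    models = []
    for _ in range(n_estimators):
        idx = rng.integers(0, n, size=n)          # bootstrap indices
        models.append(DecisionTreeClassifier().fit(X[idx], y[idx]))
    return models

def bagging_predict(models, X):
    # Majority vote over the ensemble; assumes labels in {-1, +1}
    # (a tie yields 0, which a caller can break arbitrarily).
    votes = np.sum([m.predict(X) for m in models], axis=0)
    return np.sign(votes)
```

Stacking and boosting differ in how the ensemble is built: stacking trains a second-level model on the base learners' outputs, while boosting fits learners sequentially on reweighted data.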

You should understand the basic intuitions behind the analysis of boosting (the bound is written out after the list):

  • As reducing an upper bound on error and hence fitting the training data.
  • As a coordinate descent optimization of the same upper bound.
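
Both views rest on the same inequality from the standard AdaBoost analysis (not specific to the slides): with combined hypothesis F(x) = Σ_t α_t h_t(x) and final prediction sign(F(x)), the 0/1 training error is bounded by the exponential loss, since 1[y ≠ sign(F(x))] ≤ exp(−y F(x)):

```latex
\[
  \frac{1}{n}\sum_{i=1}^{n}
    \mathbf{1}\!\left[\,y_i \ne \operatorname{sign}(F(x_i))\,\right]
  \;\le\;
  \frac{1}{n}\sum_{i=1}^{n} \exp\!\left(-\,y_i F(x_i)\right),
  \qquad
  F(x) = \sum_{t=1}^{T} \alpha_t h_t(x).
\]
```

Each boosting round chooses one coordinate of F, a weak hypothesis h_t and its weight α_t, to greedily decrease this bound; driving the bound down drives the training error down, which gives both intuitions above at once.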

You should also be aware that boosting is related to margin classifiers.
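
The connection runs through the normalized margin of the combined hypothesis on an example (x, y) (the standard definition from the boosting literature, e.g. Schapire et al., 1998):

```latex
\[
  \operatorname{margin}(x, y)
  \;=\;
  \frac{y \sum_{t} \alpha_t h_t(x)}{\sum_{t} |\alpha_t|}
  \;\in\; [-1, 1].
\]
```

Boosting tends to keep increasing these margins even after the training error reaches zero, which is one explanation for its resistance to overfitting.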