10-601B Boosting and Other Ensembles


This is a lecture used in the Syllabus for Machine Learning 10-601B in Spring 2016.

Slides

  • ...

Readings

  • Murphy 16.4, 16.6

Other optional readings:

Summary

You should understand how these ensemble methods work (a code sketch follows the list):

  • Bagging
  • Stacking
  • Boosting
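
To make the three methods concrete, here is a minimal sketch using scikit-learn; the library choice, the toy dataset, and all hyperparameters are illustrative assumptions, not part of the course materials. Bagging averages models fit to bootstrap resamples, stacking trains a meta-learner on the base models' predictions, and boosting fits weak learners sequentially on reweighted data.

    # A sketch of bagging, stacking, and boosting with scikit-learn.
    # Dataset and hyperparameters are arbitrary, for illustration only.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import (AdaBoostClassifier, BaggingClassifier,
                                  StackingClassifier)
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.tree import DecisionTreeClassifier

    X, y = make_classification(n_samples=500, random_state=0)

    # Bagging: average many trees, each fit to a bootstrap resample.
    bagging = BaggingClassifier(DecisionTreeClassifier(), n_estimators=50)

    # Stacking: a meta-learner combines the base models' predictions.
    stacking = StackingClassifier(
        estimators=[('tree', DecisionTreeClassifier()),
                    ('lr', LogisticRegression(max_iter=1000))],
        final_estimator=LogisticRegression())

    # Boosting: fit weak learners (here, decision stumps) sequentially,
    # reweighting the examples the current ensemble gets wrong.
    boosting = AdaBoostClassifier(n_estimators=50)

    for name, model in [('bagging', bagging), ('stacking', stacking),
                        ('boosting', boosting)]:
        print(name, cross_val_score(model, X, y, cv=5).mean())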

You should understand the basic intuitions behind the analysis of boosting (the bound is worked out after the list):

  • As reducing an upper bound on the training error, and hence fitting the training data.
  • As a coordinate descent optimization of the same upper bound.
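
Concretely, for AdaBoost (following the standard analysis, see e.g. Murphy 16.4): write the combined classifier as f(x) = \sum_t \alpha_t h_t(x) with final prediction H(x) = sign(f(x)). The training error over m examples is then bounded by the exponential loss:

    \frac{1}{m} \sum_{i=1}^{m} \mathbf{1}[H(x_i) \neq y_i]
      \le \frac{1}{m} \sum_{i=1}^{m} e^{-y_i f(x_i)}
      = \prod_{t=1}^{T} Z_t,

where Z_t = \sum_i w_i^{(t)} e^{-\alpha_t y_i h_t(x_i)} is the normalizer of the round-t weight update. Each round picks h_t and \alpha_t to minimize Z_t, which is exactly a coordinate descent step on this upper bound: if every weak learner has weighted error at most 1/2 - \gamma, then each Z_t \le \sqrt{1 - 4\gamma^2} < 1, so the bound, and hence the training error, shrinks exponentially in T.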

You should also be aware that boosting is related to margin classifiers.
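
One standard way to make this connection precise (the margin analysis of Schapire et al., 1998): assuming \alpha_t \ge 0, define the normalized margin of a training example as

    \mathrm{margin}(x_i, y_i) = \frac{y_i \sum_t \alpha_t h_t(x_i)}{\sum_t \alpha_t},

which lies in [-1, 1] and is positive exactly when the example is classified correctly. Boosting tends to keep increasing these margins even after the training error reaches zero, paralleling the large-margin behavior of classifiers such as support vector machines.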