10-601 Ensembles 2
This is a lecture used in the Syllabus for Machine Learning 10-601.
Slides
- Slides in PowerPoint.
- Margin "movie" I showed in class: Margin movie.
- I also did a demo of Weka. There's a presentation on the Weka GUIs which covers some of the same material.
Readings
- Ensemble Methods in Machine Learning, Tom Dietterich
- A Short Introduction to Boosting, Yoav Freund and Robert Schapire.
- Optional: Improved boosting algorithms using confidence-rated predictions, Robert Schapire and Yoram Singer. (This paper has the analysis that I presented in class.)
Summary
You should understand the basic intuitions behind the analysis of boosting:
- As reducing an upper bound on the training error, and hence fitting the training data (the bound is written out after this list).
- As a coordinate descent optimization of the same upper bound.
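A minimal statement of that bound, in the standard AdaBoost notation of the Freund and Schapire tutorial (weak hypotheses h_t, weights alpha_t, round-t normalizers Z_t); the notation here is assumed, not copied from the slides:

\[
\frac{1}{m}\sum_{i=1}^{m} \mathbf{1}\!\left[H(x_i) \neq y_i\right]
\;\le\; \frac{1}{m}\sum_{i=1}^{m} e^{-y_i f(x_i)}
\;=\; \prod_{t=1}^{T} Z_t,
\qquad f(x) = \sum_{t=1}^{T} \alpha_t h_t(x), \quad H(x) = \mathrm{sign}\bigl(f(x)\bigr).
\]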
You should also be aware that boosting is related to margin classifiers.
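The following is a minimal sketch of that coordinate-descent view, assuming the standard discrete AdaBoost setup with labels in {-1, +1} and decision stumps as weak learners; the toy data, function names, and the stump learner are illustrative choices, not material from the lecture. Each round picks the alpha_t that minimizes the normalizer Z_t, which is the coordinate-descent step on the exponential-loss bound above.

import numpy as np


def stump_fit(X, y, w):
    """Pick the (feature, threshold, sign) stump with the lowest weighted error."""
    best = None
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for sign in (1, -1):
                pred = np.where(X[:, j] >= thr, sign, -sign)
                err = np.sum(w * (pred != y))
                if best is None or err < best[0]:
                    best = (err, j, thr, sign)
    return best[1:]


def stump_predict(X, stump):
    j, thr, sign = stump
    return np.where(X[:, j] >= thr, sign, -sign)


def adaboost(X, y, rounds=10):
    """y must be in {-1, +1}; each round greedily minimizes Z_t = sum_i w_i exp(-alpha y_i h(x_i))."""
    m = len(y)
    w = np.full(m, 1.0 / m)                      # distribution D_t over training examples
    ensemble = []
    for _ in range(rounds):
        stump = stump_fit(X, y, w)
        pred = stump_predict(X, stump)
        eps = np.sum(w * (pred != y))            # weighted error of the weak hypothesis
        eps = np.clip(eps, 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - eps) / eps)    # the alpha that minimizes Z_t
        w = w * np.exp(-alpha * y * pred)        # exponential reweighting of examples
        w /= w.sum()                             # the normalizer here is Z_t
        ensemble.append((alpha, stump))
    return ensemble


def predict(X, ensemble):
    # The margin y * f(x) of this weighted vote is what the bound controls.
    f = sum(a * stump_predict(X, s) for a, s in ensemble)
    return np.sign(f)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))                # toy, linearly separable data (an assumption)
    y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)
    model = adaboost(X, y, rounds=20)
    print("training error:", np.mean(predict(X, model) != y))

Running more rounds keeps shrinking the product of the Z_t, which is why the training error continues to drop; the same quantity also controls the margins of the final vote.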