10-601 Ensembles 1
This is a lecture used in the Syllabus for Machine Learning 10-601 in Fall 2014.
Slides
- Slides in pdf: http://www.cs.cmu.edu/~zivbj/classF14/boosting.pdf
- Slides in PowerPoint: http://www.cs.cmu.edu/~wcohen/10-601/ensembles1.ppt
Readings
- Ensemble Methods in Machine Learning, Tom Dietterich.
- A Short Introduction to Boosting, Yoav Freund and Robert Schapire.
Summary
You should know how to implement these ensemble methods and understand their relative advantages and disadvantages (illustrative sketches follow the list):
- Bagging (instructor's note: may not fit in one lecture if boosting is also covered)
- Boosting
- Stacking
- Multilevel Stacking
- The "bucket of models" classifier (instructor's note: may not be covered)
- Random forest
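
The sketches below are editorial additions, not course materials. First, a minimal bagging and random-forest example; it assumes scikit-learn is available, and the synthetic dataset and estimator counts are arbitrary illustrative choices.

# Bagging / random forest sketch (assumes scikit-learn; data and settings are illustrative).
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier, RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Bagging: each tree is trained on a bootstrap resample; predictions are averaged.
bag = BaggingClassifier(DecisionTreeClassifier(), n_estimators=50, random_state=0)
bag.fit(X_tr, y_tr)

# Random forest: bagging plus a random feature subset considered at each split,
# which further decorrelates the trees.
rf = RandomForestClassifier(n_estimators=50, random_state=0)
rf.fit(X_tr, y_tr)

print("bagging:", bag.score(X_te, y_te), "random forest:", rf.score(X_te, y_te))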
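
For boosting, here is a from-scratch AdaBoost sketch with decision stumps. The weight update and vote weight follow the standard algorithm described in the Freund and Schapire reading; the function names and the use of scikit-learn stumps are my own assumptions.

# From-scratch AdaBoost sketch (labels must be in {-1, +1}; names are illustrative).
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost(X, y, rounds=20):
    n = len(y)
    w = np.full(n, 1.0 / n)               # start with uniform example weights
    stumps, alphas = [], []
    for _ in range(rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)  # fit a weak learner on the current weighting
        pred = stump.predict(X)
        err = np.clip(np.sum(w * (pred != y)), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)  # vote weight: small error -> big vote
        w *= np.exp(-alpha * y * pred)         # up-weight the misclassified examples
        w /= w.sum()
        stumps.append(stump)
        alphas.append(alpha)
    return stumps, alphas

def adaboost_predict(stumps, alphas, X):
    # Final prediction is the sign of the weighted vote over all weak learners.
    return np.sign(sum(a * s.predict(X) for s, a in zip(stumps, alphas)))

Each round up-weights the examples the current stump got wrong, so later stumps concentrate on the hard cases.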
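
Stacking, multilevel stacking, and the "bucket of models" selection rule can be sketched together. This assumes a recent scikit-learn (StackingClassifier was added in version 0.22); the particular base learners and the cross-validated selection rule are illustrative choices, not the lecture's.

# Stacking / multilevel stacking / bucket-of-models sketch (assumes scikit-learn >= 0.22).
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, random_state=0)

# Stacking: a meta-learner is trained on the base models' held-out predictions.
level1 = StackingClassifier(
    estimators=[("rf", RandomForestClassifier(random_state=0)), ("svm", SVC())],
    final_estimator=LogisticRegression(),
)

# Multilevel stacking: a stacked model itself serves as a base learner in a second stack.
level2 = StackingClassifier(
    estimators=[("stack1", level1), ("rf2", RandomForestClassifier(random_state=1))],
    final_estimator=LogisticRegression(),
)

# "Bucket of models": cross-validate each candidate and keep only the single best one.
candidates = {"level1": level1, "level2": level2}
best = max(candidates, key=lambda name: cross_val_score(candidates[name], X, y, cv=3).mean())
print("selected:", best)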