This is a lecture used in the [[Syllabus for Machine Learning 10-601 in Fall 2014]].

=== Slides ===

* [http://www.cs.cmu.edu/~wcohen/10-601/ensembles2.ppt Slides in PowerPoint].
* Margin "movie" I showed in class: [http://www.cs.cmu.edu/~wcohen/10-601/margin-movie.pdf Margin movie].
 
* Margin "movie" I showed in class: [http://www.cs.cmu.edu/~wcohen/10-601/margin-movie.pdf Margin movie].
 +
 +
* I also did a demo of [http://www.cs.waikato.ac.nz/ml/weka/ Weka]. There's a [http://prdownloads.sourceforge.net/weka/weka.ppt presentation on the Weka GUIs] which covers some of the same material.
  
 
=== Readings ===

* [http://dl.acm.org/citation.cfm?id=743935 Ensemble Methods in Machine Learning], Tom Dietterich.
* [http://cseweb.ucsd.edu/~yfreund/papers/IntroToBoosting.pdf A Short Introduction to Boosting], Yoav Freund and Robert Schapire.
* Optional: [http://dl.acm.org/citation.cfm?id=279960 Improved boosting algorithms using confidence-rated predictions], Robert Schapire and Yoram Singer. (This paper has the analysis that I presented in class.)
  
 
=== Summary ===

You should understand the basic intuitions behind the analysis of boosting:
* As reducing an upper bound on the training error, and hence fitting the training data (see the sketch after this list).
* As a coordinate descent optimization of the same upper bound.
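
To make those two points concrete, here is a sketch of the standard argument (it follows the Freund and Schapire reading above; the notation is mine, not the lecture's). Write the combined classifier as <math>H(x) = \mathrm{sign}(f(x))</math> with <math>f(x) = \sum_{t=1}^T \alpha_t h_t(x)</math>, and let <math>D_t</math> be the distribution over the <math>m</math> training examples at round <math>t</math>. Since <math>e^{-z} \ge 1</math> whenever <math>z \le 0</math>, the training error is bounded by the exponential loss, and unraveling the weight-update recurrence turns that loss into a product of the per-round normalizers:

<math>\frac{1}{m}\sum_{i=1}^m \mathbf{1}\{H(x_i) \neq y_i\} \;\le\; \frac{1}{m}\sum_{i=1}^m e^{-y_i f(x_i)} \;=\; \prod_{t=1}^T Z_t, \qquad Z_t = \sum_{i=1}^m D_t(i)\, e^{-\alpha_t y_i h_t(x_i)}.</math>

Each round is then one coordinate descent step on this bound: the weak hypothesis <math>h_t</math> picks the coordinate, and <math>\alpha_t</math> is the step size chosen to minimize <math>Z_t</math>. For a binary <math>h_t</math> with weighted error <math>\epsilon_t < 1/2</math>, the minimizer is <math>\alpha_t = \tfrac{1}{2}\ln\tfrac{1-\epsilon_t}{\epsilon_t}</math>, giving <math>Z_t = 2\sqrt{\epsilon_t(1-\epsilon_t)} < 1</math>, so the bound, and with it the training error, shrinks geometrically in the number of rounds.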
  
You should also be aware that boosting is related to margin classifiers.
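
A short note on that connection (this is the margin analysis of Schapire, Freund, Bartlett and Lee, which the Freund and Schapire reading above also summarizes, rather than material specific to these slides): assuming the weights <math>\alpha_t</math> are nonnegative, define the normalized margin of a training example as

<math>\mathrm{margin}(x_i, y_i) \;=\; \frac{y_i \sum_t \alpha_t h_t(x_i)}{\sum_t \alpha_t}.</math>

The margin is positive exactly when the weighted vote classifies the example correctly, and its magnitude measures the confidence of the vote. Boosting tends to keep pushing these margins up even after the training error reaches zero, which is one explanation for why it often resists overfitting; this is presumably the quantity being animated in the margin "movie" linked above.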
