10-601B AdaBoost
This is a lecture used in the Syllabus for Machine Learning 10-601B in Spring 2016.
Slides

- Slides: http://curtis.ml.cmu.edu/w/courses/images/f/f2/Boosting-2016.pptx
What you should know
- The difference between weak and strong learners.
- The AdaBoost algorithm and intuition for the distribution update step (see the sketch after this list).
- The bound on training error after T rounds of running AdaBoost (stated below).
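For reference, here is a minimal sketch of AdaBoost with decision stumps as the weak learners, written from the standard description of the algorithm rather than from the lecture slides; the function and variable names (stump_predict, fit_stump, adaboost) are illustrative, not part of the course materials.

```python
# Minimal AdaBoost sketch (illustrative, not the course's reference code).
# Weak learners are decision stumps; labels y must be in {+1, -1}.
import numpy as np

def stump_predict(X, feature, threshold, polarity):
    """Predict +1/-1 by thresholding a single feature."""
    return np.where(polarity * X[:, feature] <= polarity * threshold, 1.0, -1.0)

def fit_stump(X, y, w):
    """Pick the stump with the smallest weighted error under distribution w."""
    best, best_err = None, np.inf
    for feature in range(X.shape[1]):
        for threshold in np.unique(X[:, feature]):
            for polarity in (1.0, -1.0):
                pred = stump_predict(X, feature, threshold, polarity)
                err = np.sum(w[pred != y])
                if err < best_err:
                    best_err, best = err, (feature, threshold, polarity)
    return best, best_err

def adaboost(X, y, T=50):
    """Run T rounds of AdaBoost; returns the stumps and their vote weights."""
    n = len(y)
    w = np.full(n, 1.0 / n)                    # D_1: uniform over examples
    stumps, alphas = [], []
    for _ in range(T):
        stump, eps = fit_stump(X, y, w)        # weighted error of this round
        eps = max(eps, 1e-12)                  # guard against a perfect stump
        alpha = 0.5 * np.log((1.0 - eps) / eps)  # weak learner's vote weight
        pred = stump_predict(X, *stump)
        # Distribution update: shrink weights of correctly classified examples,
        # grow weights of mistakes, then renormalize so w sums to one.
        w = w * np.exp(-alpha * y * pred)
        w = w / w.sum()
        stumps.append(stump)
        alphas.append(alpha)
    return stumps, alphas

def predict(X, stumps, alphas):
    """Final hypothesis: sign of the weighted vote of all weak learners."""
    votes = sum(a * stump_predict(X, *s) for s, a in zip(stumps, alphas))
    return np.sign(votes)
```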
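The training-error bound referred to above is the standard one. Writing epsilon_t for the weighted error of the weak learner chosen in round t and gamma_t = 1/2 - epsilon_t for its edge over random guessing, the final hypothesis H on the m training examples satisfies:

```latex
% Standard AdaBoost training-error bound after T rounds.
\[
\frac{1}{m}\sum_{i=1}^{m} \mathbf{1}\{H(x_i) \neq y_i\}
\;\le\; \prod_{t=1}^{T} 2\sqrt{\epsilon_t(1-\epsilon_t)}
\;=\; \prod_{t=1}^{T} \sqrt{1 - 4\gamma_t^2}
\;\le\; \exp\!\Big(-2\sum_{t=1}^{T}\gamma_t^2\Big)
\]
```

So as long as every weak learner has edge gamma_t at least gamma > 0, the training error drops exponentially in T.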