10-601B AdaBoost

From Cohen Courses
Revision as of 11:51, 16 February 2016 by Tdick (talk | contribs)

This is a lecture in the Syllabus for Machine Learning 10-601B in Spring 2016

Slides


What you should know

  • The difference between weak and strong learners.
  • The AdaBoost algorithm and intuition for the distribution update step.
  • The bound on training error after T rounds of running AdaBoost.
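The points above can be illustrated with a minimal sketch of AdaBoost using decision stumps as the weak learners. This is not the code from the lecture slides; the helper names and the stump representation are assumptions made for illustration. The distribution update step, which up-weights misclassified examples and down-weights correct ones, appears in the `adaboost` loop.

```python
import numpy as np

def stump_predict(X, feature, threshold, polarity):
    # A decision stump: a weak learner that thresholds one feature.
    return np.where(polarity * X[:, feature] < polarity * threshold, 1.0, -1.0)

def fit_stump(X, y, w):
    # Exhaustive search for the stump minimizing weighted training error.
    n, d = X.shape
    best, best_err = None, np.inf
    for f in range(d):
        for thr in np.unique(X[:, f]):
            for pol in (1.0, -1.0):
                pred = stump_predict(X, f, thr, pol)
                err = np.sum(w[pred != y])
                if err < best_err:
                    best_err, best = err, (f, thr, pol)
    return best, best_err

def adaboost(X, y, T):
    n = X.shape[0]
    w = np.full(n, 1.0 / n)          # D_1: start with the uniform distribution
    ensemble = []
    for t in range(T):
        (f, thr, pol), err = fit_stump(X, y, w)
        err = max(err, 1e-12)        # guard against division by zero
        alpha = 0.5 * np.log((1 - err) / err)
        pred = stump_predict(X, f, thr, pol)
        # Distribution update: mistakes get exp(+alpha), correct points exp(-alpha),
        # then renormalize so D_{t+1} is again a probability distribution.
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
        ensemble.append((alpha, f, thr, pol))
    return ensemble

def predict(ensemble, X):
    # Final strong classifier: sign of the alpha-weighted vote of weak learners.
    score = sum(a * stump_predict(X, f, thr, pol) for a, f, thr, pol in ensemble)
    return np.sign(score)
```

For the third bullet, the standard bound (a known fact, not specific to these slides) is that after T rounds the training error of the combined classifier is at most the product over t of 2 sqrt(eps_t (1 - eps_t)), which is at most exp(-2 sum_t gamma_t^2) where gamma_t = 1/2 - eps_t is the edge of the t-th weak learner over random guessing.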