10-601B Generalization and Overfitting: Sample Complexity Results for Supervised Classification

This is a lecture used in the Syllabus for Machine Learning 10-601B in Spring 2016.

Slides

  • http://curtis.ml.cmu.edu/w/courses/images/c/c0/Sample-complexity1.pdf

Readings

  • Mitchell Chapter 7

What you should remember

  • The distributional learning formulation.
  • The definition of sample complexity vs. time complexity.
  • How sample complexity grows with 1/epsilon, 1/delta, and |H| (the two bounds are sketched after this list)...
    • in the noise-free case, where the learner outputs a hypothesis consistent with the training data.
    • in the "agnostic" setting, where noise is present and the learner outputs the hypothesis with the smallest training error rate.
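As a concrete illustration of the last bullet, the two bounds from Mitchell Chapter 7 can be evaluated directly. Below is a minimal Python sketch of those bounds; the function names and example numbers are illustrative and not part of the original lecture page:

  import math

  def m_noise_free(epsilon, delta, h_size):
      # Consistent-learner bound: m >= (1/epsilon) * (ln|H| + ln(1/delta))
      return math.ceil((math.log(h_size) + math.log(1.0 / delta)) / epsilon)

  def m_agnostic(epsilon, delta, h_size):
      # Agnostic (Hoeffding-based) bound: m >= (1/(2*epsilon^2)) * (ln|H| + ln(1/delta))
      return math.ceil((math.log(h_size) + math.log(1.0 / delta)) / (2.0 * epsilon ** 2))

  # Example: |H| = 2**20 hypotheses, epsilon = 0.1, delta = 0.05
  print(m_noise_free(0.1, 0.05, 2 ** 20))  # 169
  print(m_agnostic(0.1, 0.05, 2 ** 20))    # 843

Note how the agnostic bound grows with 1/epsilon^2 rather than 1/epsilon: tolerating noise costs roughly a factor of 1/(2*epsilon) more examples for the same (epsilon, delta) guarantee.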