10-601B Generalization and Overfitting: Sample Complexity Results for Supervised Classification
This is a lecture used in the Syllabus for Machine Learning 10-601B in Spring 2016.
Slides
Readings
- Mitchell Chapter 7
What you should remember
- Distributional Learning Formulation.
- Definition of sample complexity vs time complexity.
- How sample complexity grows with 1/epsilon, 1/delta, and |H| (see the bounds sketched after this list)...
- in the noise-free case.
- in the "agnostic" setting, where noise is present and the learner outputs the hypothesis with the lowest training error.
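For reference, a sketch of the standard finite-|H| bounds in the form given in Mitchell Chapter 7, where m is the number of training examples, epsilon the error tolerance, and delta the allowed failure probability (both cases assume a finite hypothesis space H):

Noise-free case (some hypothesis in H is consistent with the training data):
  m \ge \frac{1}{\epsilon}\left(\ln|H| + \ln\frac{1}{\delta}\right)

Agnostic case (the learner returns the hypothesis with lowest training error; the bound guarantees its true error is within epsilon of the best hypothesis in H):
  m \ge \frac{1}{2\epsilon^{2}}\left(\ln|H| + \ln\frac{1}{\delta}\right)

Note the shift from 1/epsilon to 1/epsilon^2 once noise is allowed, and that both bounds grow only logarithmically in |H| and 1/delta.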