10-601B Theory 1
This is a lecture used in the Syllabus for Machine Learning 10-601B in Spring 2016.
Slides
- ...
Readings
- Mitchell Chapter 7
What you should remember
- Definition of PAC-learnability (a standard form of the definition is sketched after this list).
- Definition of sample complexity vs. time complexity.
- How sample complexity grows with 1/epsilon, 1/delta, and |H| (the two bounds are sketched after this list):
  - in the noise-free case;
  - in the "agnostic" setting, where noise is present and the learner outputs the hypothesis with the smallest training-error rate.
- The definitions of VC dimension and shattering (sketched below).
- How VC dimension relates to sample complexity (see the VC-based bound below).
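For reference, here is a sketch of the PAC-learnability definition in the form Mitchell gives it in Chapter 7; notation on the slides may differ:

```latex
% Sketch (Mitchell, Ch. 7): a concept class C is PAC-learnable by
% learner L using hypothesis space H if, for every target concept
% c in C, every distribution D over the instance space X, and every
% 0 < epsilon < 1/2 and 0 < delta < 1/2, L outputs a hypothesis
% h in H satisfying the guarantee below, in time polynomial in
% 1/epsilon, 1/delta, n, and size(c).
\Pr\left[\operatorname{error}_{\mathcal{D}}(h) \le \epsilon\right] \;\ge\; 1 - \delta
```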
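The growth rates in the list above come from two finite-|H| bounds. In the form Mitchell gives them (a sketch, using his constants), a sufficient number of training examples m is:

```latex
% Noise-free case: any learner that outputs a hypothesis
% consistent with the training data, for finite H.
m \;\ge\; \frac{1}{\epsilon}\left(\ln|H| + \ln\frac{1}{\delta}\right)

% Agnostic case: the learner outputs the hypothesis with the
% smallest training error; the Hoeffding-based bound pays a
% 1/epsilon^2 factor instead of 1/epsilon.
m \;\ge\; \frac{1}{2\epsilon^{2}}\left(\ln|H| + \ln\frac{1}{\delta}\right)
```

A minimal Python sketch evaluating the two bounds (the function names are illustrative, not from the course):

```python
import math

def m_noise_free(h_size, epsilon, delta):
    # m >= (1/epsilon) * (ln|H| + ln(1/delta))
    return math.ceil((math.log(h_size) + math.log(1 / delta)) / epsilon)

def m_agnostic(h_size, epsilon, delta):
    # m >= (1/(2*epsilon^2)) * (ln|H| + ln(1/delta))
    return math.ceil((math.log(h_size) + math.log(1 / delta)) / (2 * epsilon ** 2))

# |H| = 2^10, epsilon = 0.1, delta = 0.05:
print(m_noise_free(2 ** 10, 0.1, 0.05))  # 100
print(m_agnostic(2 ** 10, 0.1, 0.05))    # 497
```

Halving epsilon roughly doubles the noise-free bound but quadruples the agnostic one; both grow only logarithmically in |H| and 1/delta.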
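Finally, shattering, VC dimension, and the VC-based sample-complexity bound, again as sketched from Mitchell Chapter 7 (the constants are his):

```latex
% A set S of instances is shattered by H if every dichotomy
% (every possible labeling) of S is realized by some h in H.
% VC(H) is the size of the largest set of instances shattered by H.
VC(H) \;=\; \max\bigl\{\, |S| \;:\; S \subseteq X \text{ is shattered by } H \,\bigr\}

% Sufficient training-set size in terms of VC(H), replacing the
% ln|H| term in the finite-H bound above:
m \;\ge\; \frac{1}{\epsilon}\left(4\log_2\frac{2}{\delta} \;+\; 8\,VC(H)\,\log_2\frac{13}{\epsilon}\right)
```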