10-601 PAC
This is a lecture used in the Syllabus for Machine Learning 10-601B in Spring 2016.
Slides
- William's lecture: Slides in pdf (http://www.cs.cmu.edu/~wcohen/10-601/pac-learning.pdf), Slides in Powerpoint (http://www.cs.cmu.edu/~wcohen/10-601/pac-learning.pptx)
Readings
- Mitchell Chapter 7
What you should remember
- Definition of PAC-learnability.
- Definition of sample complexity vs. time complexity.
- How sample complexity grows with 1/epsilon, 1/delta, and |H| (the bounds are written out below this list):
  - in the noise-free case;
  - in the "agnostic" setting, where noise is present and the learner outputs the hypothesis in H with the smallest training error rate.
- The definitions of VC-dimension and shattering (a small illustrative sketch follows below).
- How VC dimension relates to sample complexity (see the VC-based bound below).
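
For reference, the bounds mentioned above are the standard ones from Mitchell, Chapter 7. In the noise-free case, any learner that outputs a hypothesis consistent with the training data has true error at most epsilon, with probability at least 1 - delta, once the number of examples m satisfies

  m >= (1/epsilon) * (ln |H| + ln(1/delta))

In the agnostic setting the dependence on epsilon worsens from 1/epsilon to 1/epsilon^2; the analogous guarantee (now relative to the best hypothesis in H) needs

  m >= (1/(2 * epsilon^2)) * (ln |H| + ln(1/delta))

Both bounds grow only logarithmically in |H| and 1/delta, which is why even very large (but finite) hypothesis spaces can be PAC-learnable from a modest number of examples.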
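
To make shattering concrete, here is a minimal Python sketch (not from the lecture; the choice of 1-D threshold hypotheses is just an illustration). A set of points is shattered by a hypothesis class if every possible 0/1 labeling of the points is produced by some hypothesis in the class; thresholds on the real line can produce both labelings of any single point but cannot produce all four labelings of two points, so their VC dimension is 1.

  def threshold_labels(points, t):
      # A 1-D threshold hypothesis: h_t(x) = 1 if x >= t, else 0.
      return tuple(1 if x >= t else 0 for x in points)

  def is_shattered(points):
      # The points are shattered if thresholds realize all 2^n labelings.
      xs = sorted(points)
      # One candidate threshold below all points, one between each adjacent
      # pair, and one above all points covers every distinct labeling that
      # a threshold can produce on these points.
      candidates = ([xs[0] - 1.0]
                    + [(a + b) / 2.0 for a, b in zip(xs, xs[1:])]
                    + [xs[-1] + 1.0])
      achievable = {threshold_labels(xs, t) for t in candidates}
      return len(achievable) == 2 ** len(xs)

  print(is_shattered([3.0]))       # True: both labelings of one point
  print(is_shattered([1.0, 2.0]))  # False: no threshold labels (1, 0)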
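
Finally, the VC-based sample complexity bound given in Mitchell, Chapter 7, replaces ln |H| (which is useless for infinite hypothesis spaces) with the VC dimension:

  m >= (1/epsilon) * (4 * log2(2/delta) + 8 * VC(H) * log2(13/epsilon))

so sample complexity grows linearly in VC(H). This is why VC dimension, rather than |H|, is the right measure of capacity for infinite hypothesis classes.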